Apr 24 16:39:23.945039 ip-10-0-129-227 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 16:39:24.391968 ip-10-0-129-227 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:24.391968 ip-10-0-129-227 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 16:39:24.391968 ip-10-0-129-227 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:24.391968 ip-10-0-129-227 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 16:39:24.391968 ip-10-0-129-227 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:24.392821 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.392040 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 16:39:24.397187 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397162 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:39:24.397187 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397183 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:39:24.397187 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397189 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:39:24.397187 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397193 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397197 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397202 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397205 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397209 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397212 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397216 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397219 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397227 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397231 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397235 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397238 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397242 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397245 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397249 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397252 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397258 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397261 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397266 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397270 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:39:24.397436 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397273 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397277 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397280 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397285 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397289 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397293 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397296 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397300 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397305 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397309 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397313 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397318 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397324 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397328 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397333 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397338 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397345 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397349 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397353 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397358 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:39:24.398207 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397362 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397367 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397372 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397376 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397380 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397384 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397388 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397392 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397395 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397399 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397403 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397407 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397411 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397415 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397429 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397435 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397441 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397445 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397451 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:39:24.398724 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397455 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397460 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397465 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397475 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397480 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397484 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397488 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397493 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397497 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397503 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397508 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397513 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397518 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397522 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397526 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397531 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397536 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397540 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397544 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397548 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:39:24.399214 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397552 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397556 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397560 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.397564 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398185 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398197 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398201 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398204 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398207 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398211 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398214 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398217 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398220 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398222 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398225 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398228 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398231 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398234 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398237 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398240 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:39:24.399698 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398243 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398245 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398248 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398253 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398257 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398260 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398263 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398266 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398269 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398271 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398274 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398276 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398279 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398281 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398284 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398287 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398289 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398293 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398296 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398298 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:39:24.400244 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398301 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398304 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398308 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398311 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398314 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398316 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398318 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398321 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398323 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398326 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398330 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398332 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398335 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398337 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398340 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398342 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398345 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398347 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398350 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398353 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:39:24.400791 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398355 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398358 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398360 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398363 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398366 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398368 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398370 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398373 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398375 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398379 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398381 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398384 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398387 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398389 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398391 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398394 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398398 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398400 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398403 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398405 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:39:24.401281 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398407 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398410 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398412 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398415 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398417 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398420 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398422 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398424 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398427 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.398429 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399083 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399093 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399101 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399106 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399110 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399113 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399118 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399123 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399127 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399130 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399134 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 16:39:24.401801 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399137 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399141 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399144 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399147 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399150 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399153 2579 flags.go:64] FLAG: --cloud-config=""
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399156 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399160 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399164 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399167 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399170 2579 flags.go:64] FLAG: --config-dir=""
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399173 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399176 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399181 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399184 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399187 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399190 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399194 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399197 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399200 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399203 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399206 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399210 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399213 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399217 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 16:39:24.402303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399220 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399224 2579 flags.go:64] FLAG: --enable-server="true"
Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399227 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399231 2579 flags.go:64] FLAG: --event-burst="100"
Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399235 2579 flags.go:64] FLAG: --event-qps="50"
Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399237 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399241 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399244 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399249 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399252 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399255 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399258 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399261 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399264 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399267 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399271 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399274 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399276 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 16:39:24.402944 ip-10-0-129-227
kubenswrapper[2579]: I0424 16:39:24.399279 2579 flags.go:64] FLAG: --feature-gates="" Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399283 2579 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399286 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399290 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399293 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399297 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399311 2579 flags.go:64] FLAG: --help="false" Apr 24 16:39:24.402944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399315 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-129-227.ec2.internal" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399318 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399321 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399325 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399328 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399332 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399335 2579 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399339 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399342 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399344 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399347 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399350 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399353 2579 flags.go:64] FLAG: --kube-reserved="" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399358 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399361 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399364 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399368 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399371 2579 flags.go:64] FLAG: --lock-file="" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399373 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399376 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399379 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: 
I0424 16:39:24.399385 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399388 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399391 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 16:39:24.403555 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399394 2579 flags.go:64] FLAG: --logging-format="text" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399396 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399400 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399403 2579 flags.go:64] FLAG: --manifest-url="" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399406 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399410 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399413 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399419 2579 flags.go:64] FLAG: --max-pods="110" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399422 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399425 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399428 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399431 2579 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399434 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399437 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399440 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399448 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399452 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399455 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399458 2579 flags.go:64] FLAG: --pod-cidr="" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399461 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399466 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399469 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399473 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399476 2579 flags.go:64] FLAG: --port="10250" Apr 24 16:39:24.404156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399479 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399482 
2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-048627d80ccb58cbf" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399485 2579 flags.go:64] FLAG: --qos-reserved="" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399488 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399492 2579 flags.go:64] FLAG: --register-node="true" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399495 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399498 2579 flags.go:64] FLAG: --register-with-taints="" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399502 2579 flags.go:64] FLAG: --registry-burst="10" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399505 2579 flags.go:64] FLAG: --registry-qps="5" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399508 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399511 2579 flags.go:64] FLAG: --reserved-memory="" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399514 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399517 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399520 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399523 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399527 2579 flags.go:64] FLAG: --runonce="false" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399530 2579 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399533 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399536 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399539 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399542 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399545 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399548 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399551 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399554 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399557 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 16:39:24.404717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399560 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399563 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399566 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399569 2579 flags.go:64] FLAG: --system-cgroups="" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399573 2579 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399579 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399582 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399585 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399590 2579 flags.go:64] FLAG: --tls-min-version="" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399594 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399597 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399600 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399603 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399606 2579 flags.go:64] FLAG: --v="2" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399611 2579 flags.go:64] FLAG: --version="false" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399615 2579 flags.go:64] FLAG: --vmodule="" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399619 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.399623 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399740 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:39:24.405375 
ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399744 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399748 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399752 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399755 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:39:24.405375 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399758 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399760 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399763 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399765 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399768 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399771 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399773 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399776 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 
16:39:24.399779 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399782 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399785 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399787 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399790 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399793 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399796 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399798 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399801 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399803 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399806 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399808 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:39:24.405971 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399811 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:39:24.406463 ip-10-0-129-227 
kubenswrapper[2579]: W0424 16:39:24.399813 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399815 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399818 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399820 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399823 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399825 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399827 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399830 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399832 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399836 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399838 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399840 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399843 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 
24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399845 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399848 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399850 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399852 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399855 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399857 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:39:24.406463 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399859 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399862 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399865 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399867 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399869 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399873 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399877 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399880 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399883 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399885 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399888 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399890 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399893 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399895 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399897 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399900 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399902 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399904 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399907 2579 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399910 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:39:24.407012 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399912 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399915 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399919 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399921 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399923 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399926 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399928 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399931 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399933 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399935 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399937 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:39:24.407641 ip-10-0-129-227 
kubenswrapper[2579]: W0424 16:39:24.399942 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399945 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399947 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399951 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399957 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399960 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399963 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399966 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:39:24.407641 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399968 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:39:24.408221 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.399971 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:39:24.408221 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.400565 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:39:24.409477 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.409454 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 16:39:24.409534 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.409480 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 16:39:24.409534 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409530 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409536 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409540 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409544 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409547 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409551 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409554 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409557 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409559 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409562 2579 feature_gate.go:328] 
unrecognized feature gate: ImageModeStatusReporting Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409565 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409568 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409570 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409573 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409576 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409579 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409582 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409585 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409587 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409589 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:39:24.409590 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409592 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409595 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:39:24.410099 
ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409599 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409604 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409607 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409610 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409612 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409616 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409619 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409621 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409624 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409627 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409629 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409631 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409634 2579 
feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409636 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409638 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409641 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409643 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409646 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:39:24.410099 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409648 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409651 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409653 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409656 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409659 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409661 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409664 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:39:24.410583 ip-10-0-129-227 
kubenswrapper[2579]: W0424 16:39:24.409668 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409672 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409675 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409677 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409680 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409683 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409685 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409688 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409690 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409693 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409695 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409698 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:39:24.410583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409700 
2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409703 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409705 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409708 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409710 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409713 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409715 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409718 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409720 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409723 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409725 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409741 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409743 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:39:24.411173 
ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409746 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409750 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409753 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409755 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409759 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409761 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409764 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:39:24.411173 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409766 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:39:24.411707 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409769 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:39:24.411707 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409772 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:39:24.411707 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409774 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:39:24.411707 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409777 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:39:24.411707 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409779 2579 
feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:39:24.411707 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409782 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:39:24.411707 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.409787 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:39:24.411707 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409884 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:39:24.411707 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409888 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:39:24.411707 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409891 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:39:24.411707 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409894 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:39:24.411707 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409897 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:39:24.411707 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409900 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:39:24.411707 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409903 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:39:24.411707 ip-10-0-129-227 kubenswrapper[2579]: W0424 
16:39:24.409906 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409909 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409911 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409914 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409916 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409918 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409921 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409924 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409927 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409929 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409932 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409935 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409937 2579 feature_gate.go:328] 
unrecognized feature gate: MultiArchInstallAzure Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409940 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409942 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409945 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409948 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409950 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409953 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409956 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:39:24.412178 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409958 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409961 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409964 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409967 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409970 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409974 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409977 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409979 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409982 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409984 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409986 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409989 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409991 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409994 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409996 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.409998 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410001 2579 feature_gate.go:328] 
unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410003 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410006 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410008 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:39:24.412657 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410011 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410013 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410016 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410018 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410020 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410023 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410025 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410028 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410030 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 
16:39:24.410033 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410037 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410041 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410044 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410046 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410049 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410051 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410054 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410056 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410059 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410061 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:39:24.413213 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410064 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410066 2579 feature_gate.go:328] unrecognized feature 
gate: NetworkDiagnosticsConfig Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410069 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410072 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410074 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410076 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410079 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410081 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410084 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410086 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410089 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410091 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410093 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410096 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: 
W0424 16:39:24.410098 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410101 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410104 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410107 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:39:24.413703 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:24.410109 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:39:24.414211 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.410114 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:39:24.414211 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.410240 2579 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 16:39:24.414211 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.412854 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 16:39:24.414211 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.413941 2579 server.go:1019] "Starting client certificate rotation" Apr 24 16:39:24.414211 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.414036 2579 certificate_manager.go:422] "Certificate rotation is enabled" 
logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 16:39:24.414211 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.414082 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 16:39:24.440706 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.440680 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 16:39:24.443118 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.443083 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 16:39:24.459161 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.459142 2579 log.go:25] "Validated CRI v1 runtime API" Apr 24 16:39:24.468568 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.468548 2579 log.go:25] "Validated CRI v1 image API" Apr 24 16:39:24.469880 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.469863 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 16:39:24.470748 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.470715 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 16:39:24.474278 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.474254 2579 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 c884dbcd-465e-4883-ade1-993c17a4b43d:/dev/nvme0n1p3 fe4c99f8-9393-45a9-a374-579e5bf6c5c2:/dev/nvme0n1p4] Apr 24 16:39:24.474351 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.474276 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run 
major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 16:39:24.480263 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.480155 2579 manager.go:217] Machine: {Timestamp:2026-04-24 16:39:24.47809052 +0000 UTC m=+0.411159710 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100817 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2632a2de877341f4c4067e8bb332e7 SystemUUID:ec2632a2-de87-7341-f4c4-067e8bb332e7 BootID:f2e7cceb-2241-4e4c-a0de-e6ed97bf04ee Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:49:de:1b:f4:6b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:49:de:1b:f4:6b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1a:50:da:ac:ea:d7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 16:39:24.480263 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.480259 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 16:39:24.480369 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.480347 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 16:39:24.481365 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.481342 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 16:39:24.481514 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.481368 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-227.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 16:39:24.481559 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.481523 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 16:39:24.481559 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.481532 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 16:39:24.481559 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.481545 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 16:39:24.481559 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.481555 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 16:39:24.482962 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.482950 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 16:39:24.483072 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.483064 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 16:39:24.485582 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.485572 2579 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 16:39:24.485630 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.485585 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 16:39:24.485630 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.485599 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 16:39:24.485630 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.485608 2579 kubelet.go:397] "Adding apiserver pod source"
Apr 24 16:39:24.485630 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.485616 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 16:39:24.486577 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.486562 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 16:39:24.486663 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.486581 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 16:39:24.489628 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.489611 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 16:39:24.490565 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.490547 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wknxv"
Apr 24 16:39:24.491065 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.491052 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 16:39:24.493002 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.492982 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 16:39:24.493063 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.493027 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 16:39:24.493063 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.493040 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 16:39:24.493063 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.493053 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 16:39:24.493153 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.493064 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 16:39:24.493153 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.493077 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 16:39:24.493153 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.493089 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 16:39:24.493153 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.493101 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 24 16:39:24.493153 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.493116 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 24 16:39:24.493153 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.493130 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 24 16:39:24.493303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.493168 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 24 16:39:24.493303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.493185 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 24 16:39:24.494146 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.494135 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 24 16:39:24.494185 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.494148 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 24 16:39:24.496989 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.496945 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-227.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 16:39:24.497507 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:24.497481 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-227.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 16:39:24.497602 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:24.497527 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 16:39:24.497838 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.497824 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 24 16:39:24.497895 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.497861 2579 server.go:1295] "Started kubelet"
Apr 24 16:39:24.497962 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.497932 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 16:39:24.498014 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.497939 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 16:39:24.498063 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.498016 2579 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 16:39:24.498712 ip-10-0-129-227 systemd[1]: Started Kubernetes Kubelet.
Apr 24 16:39:24.499072 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.499060 2579 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 16:39:24.499905 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.499890 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 16:39:24.500040 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.500024 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wknxv"
Apr 24 16:39:24.505918 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.505896 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 16:39:24.505918 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.505903 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 16:39:24.506521 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.506502 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 16:39:24.506644 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.506507 2579 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 16:39:24.506644 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.506637 2579 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 16:39:24.506767 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.506717 2579 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 16:39:24.506767 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.506743 2579 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 16:39:24.506847 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:24.506824 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-227.ec2.internal\" not found"
Apr 24 16:39:24.507063 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.507051 2579 factory.go:55] Registering systemd factory
Apr 24 16:39:24.507124 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.507101 2579 factory.go:223] Registration of the systemd container factory successfully
Apr 24 16:39:24.507349 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.507333 2579 factory.go:153] Registering CRI-O factory
Apr 24 16:39:24.507414 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.507351 2579 factory.go:223] Registration of the crio container factory successfully
Apr 24 16:39:24.507414 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.507407 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 16:39:24.507511 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.507435 2579 factory.go:103] Registering Raw factory
Apr 24 16:39:24.507511 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.507450 2579 manager.go:1196] Started watching for new ooms in manager
Apr 24 16:39:24.507921 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.507909 2579 manager.go:319] Starting recovery of all containers
Apr 24 16:39:24.510587 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:24.510564 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 16:39:24.514823 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.514800 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:24.516807 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.516785 2579 manager.go:324] Recovery completed
Apr 24 16:39:24.519908 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:24.519883 2579 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-227.ec2.internal\" not found" node="ip-10-0-129-227.ec2.internal"
Apr 24 16:39:24.522352 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.522338 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:24.525299 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.525283 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:24.525377 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.525318 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:24.525438 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.525385 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:24.525991 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.525977 2579 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 16:39:24.525991 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.525989 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 16:39:24.526081 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.526005 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 16:39:24.528853 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.528840 2579 policy_none.go:49] "None policy: Start"
Apr 24 16:39:24.528890 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.528857 2579 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 16:39:24.528890 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.528867 2579 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 16:39:24.571382 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.571348 2579 manager.go:341] "Starting Device Plugin manager"
Apr 24 16:39:24.590544 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:24.571392 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 16:39:24.590544 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.571405 2579 server.go:85] "Starting device plugin registration server"
Apr 24 16:39:24.590544 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.571792 2579 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 16:39:24.590544 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.571804 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 16:39:24.590544 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.571901 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 16:39:24.590544 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.571981 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 16:39:24.590544 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.571990 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 16:39:24.590544 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:24.572614 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 16:39:24.590544 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:24.572654 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-227.ec2.internal\" not found"
Apr 24 16:39:24.613346 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.613315 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 16:39:24.614694 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.614669 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 16:39:24.614768 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.614718 2579 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 16:39:24.614768 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.614760 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 16:39:24.614864 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.614770 2579 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 16:39:24.614864 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:24.614811 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 16:39:24.618482 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.618464 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:24.672326 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.672216 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:24.674431 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.674403 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:24.674528 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.674440 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:24.674528 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.674456 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:24.674528 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.674488 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-227.ec2.internal"
Apr 24 16:39:24.684213 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.684192 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-227.ec2.internal"
Apr 24 16:39:24.684320 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:24.684219 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-227.ec2.internal\": node \"ip-10-0-129-227.ec2.internal\" not found"
Apr 24 16:39:24.701490 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:24.701472 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-227.ec2.internal\" not found"
Apr 24 16:39:24.715862 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.715831 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-227.ec2.internal"]
Apr 24 16:39:24.715966 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.715908 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:24.717564 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.717548 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:24.717659 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.717583 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:24.717659 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.717597 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:24.719948 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.719932 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:24.720094 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.720080 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal"
Apr 24 16:39:24.720142 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.720107 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:24.721183 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.721163 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:24.721183 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.721172 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:24.721356 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.721197 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:24.721356 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.721216 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:24.721356 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.721269 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:24.721356 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.721288 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:24.723620 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.723605 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-227.ec2.internal"
Apr 24 16:39:24.723690 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.723640 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:24.724403 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.724387 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:24.724474 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.724418 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:24.724474 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.724433 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:24.748322 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:24.748304 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-227.ec2.internal\" not found" node="ip-10-0-129-227.ec2.internal"
Apr 24 16:39:24.752233 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:24.752217 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-227.ec2.internal\" not found" node="ip-10-0-129-227.ec2.internal"
Apr 24 16:39:24.801854 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:24.801827 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-227.ec2.internal\" not found"
Apr 24 16:39:24.808155 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.808131 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b4ee6da65101faf961ac9e356d355076-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal\" (UID: \"b4ee6da65101faf961ac9e356d355076\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal"
Apr 24 16:39:24.808248 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.808164 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4ee6da65101faf961ac9e356d355076-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal\" (UID: \"b4ee6da65101faf961ac9e356d355076\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal"
Apr 24 16:39:24.808248 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.808192 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/abe1697a38d64560f353e38f63759680-config\") pod \"kube-apiserver-proxy-ip-10-0-129-227.ec2.internal\" (UID: \"abe1697a38d64560f353e38f63759680\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-227.ec2.internal"
Apr 24 16:39:24.902199 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:24.902168 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-227.ec2.internal\" not found"
Apr 24 16:39:24.908567 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.908544 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b4ee6da65101faf961ac9e356d355076-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal\" (UID: \"b4ee6da65101faf961ac9e356d355076\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal"
Apr 24 16:39:24.908617 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.908563 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b4ee6da65101faf961ac9e356d355076-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal\" (UID: \"b4ee6da65101faf961ac9e356d355076\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal"
Apr 24 16:39:24.908655 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.908615 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4ee6da65101faf961ac9e356d355076-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal\" (UID: \"b4ee6da65101faf961ac9e356d355076\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal"
Apr 24 16:39:24.908655 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.908635 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/abe1697a38d64560f353e38f63759680-config\") pod \"kube-apiserver-proxy-ip-10-0-129-227.ec2.internal\" (UID: \"abe1697a38d64560f353e38f63759680\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-227.ec2.internal"
Apr 24 16:39:24.908714 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.908665 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/abe1697a38d64560f353e38f63759680-config\") pod \"kube-apiserver-proxy-ip-10-0-129-227.ec2.internal\" (UID: \"abe1697a38d64560f353e38f63759680\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-227.ec2.internal"
Apr 24 16:39:24.908714 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:24.908700 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4ee6da65101faf961ac9e356d355076-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal\" (UID: \"b4ee6da65101faf961ac9e356d355076\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal"
Apr 24 16:39:25.003258 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:25.003226 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-227.ec2.internal\" not found"
Apr 24 16:39:25.050817 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:25.050779 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal"
Apr 24 16:39:25.054465 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:25.054445 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-227.ec2.internal"
Apr 24 16:39:25.103504 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:25.103475 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-227.ec2.internal\" not found"
Apr 24 16:39:25.204053 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:25.204027 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-227.ec2.internal\" not found"
Apr 24 16:39:25.304703 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:25.304616 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-227.ec2.internal\" not found"
Apr 24 16:39:25.405244 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:25.405204 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-227.ec2.internal\" not found"
Apr 24 16:39:25.414646 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:25.414628 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 16:39:25.414810 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:25.414794 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 16:39:25.414855 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:25.414809 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 16:39:25.501704 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:25.501643 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 16:34:24 +0000 UTC" deadline="2028-01-14 15:49:56.117674554 +0000 UTC"
Apr 24 16:39:25.501704 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:25.501690 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15119h10m30.615987456s"
Apr 24 16:39:25.505327 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:25.505297 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-227.ec2.internal\" not found"
Apr 24 16:39:25.506351 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:25.506333 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 16:39:25.516247 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:25.516221 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 16:39:25.547678 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:25.547648 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-ljpk9"
Apr 24 16:39:25.557480 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:25.557422 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-ljpk9"
Apr 24 16:39:25.603759 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:25.603706 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabe1697a38d64560f353e38f63759680.slice/crio-7737fe662ac5b8cb5ee6446e0684eb7f57c71ba3d2e5a4f0591ecb5e7f2bf98a WatchSource:0}: Error finding container 7737fe662ac5b8cb5ee6446e0684eb7f57c71ba3d2e5a4f0591ecb5e7f2bf98a: Status 404 returned error can't find the container with id 7737fe662ac5b8cb5ee6446e0684eb7f57c71ba3d2e5a4f0591ecb5e7f2bf98a
Apr 24 16:39:25.604293 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:25.604268 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4ee6da65101faf961ac9e356d355076.slice/crio-0e249865691ce921356a8a8b87f867f56fbaf60ad98bbdcc30491a0f5ae7e131 WatchSource:0}: Error finding container 0e249865691ce921356a8a8b87f867f56fbaf60ad98bbdcc30491a0f5ae7e131: Status 404 returned error can't find the container with id 0e249865691ce921356a8a8b87f867f56fbaf60ad98bbdcc30491a0f5ae7e131
Apr 24 16:39:25.606425 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:25.606406 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-227.ec2.internal\" not found"
Apr 24 16:39:25.609277 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:25.609253 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 16:39:25.617845 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:25.617808 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal" event={"ID":"b4ee6da65101faf961ac9e356d355076","Type":"ContainerStarted","Data":"0e249865691ce921356a8a8b87f867f56fbaf60ad98bbdcc30491a0f5ae7e131"}
Apr 24 16:39:25.621006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:25.620977 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-227.ec2.internal" event={"ID":"abe1697a38d64560f353e38f63759680","Type":"ContainerStarted","Data":"7737fe662ac5b8cb5ee6446e0684eb7f57c71ba3d2e5a4f0591ecb5e7f2bf98a"}
Apr 24 16:39:25.707069 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:25.707032 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-227.ec2.internal\" not found"
Apr 24 16:39:25.807558 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:25.807480 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-227.ec2.internal\" not found"
Apr 24 16:39:25.830489 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:25.830459 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:25.908489 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:25.908459 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-227.ec2.internal\" not found"
Apr 24 16:39:26.000219 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.000187 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:26.006712 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.006676 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal"
Apr 24 16:39:26.018715 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.018686 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 16:39:26.019788 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.019764 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-227.ec2.internal"
Apr 24 16:39:26.028154 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.028130 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 16:39:26.486915 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.486878 2579 apiserver.go:52] "Watching apiserver"
Apr 24 16:39:26.495649 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.495615 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 16:39:26.499442 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.499412 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7fpht","kube-system/kube-apiserver-proxy-ip-10-0-129-227.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal","openshift-multus/multus-additional-cni-plugins-92bp8","openshift-network-diagnostics/network-check-target-hknkz","openshift-network-operator/iptables-alerter-zdpfk","kube-system/konnectivity-agent-b565t","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k","openshift-cluster-node-tuning-operator/tuned-tc7zr","openshift-dns/node-resolver-7grsg","openshift-image-registry/node-ca-kdlh8","openshift-multus/multus-ng4dd","openshift-multus/network-metrics-daemon-kmh29"]
Apr 24 16:39:26.502423 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.502401 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-b565t" Apr 24 16:39:26.504680 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.504654 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-5ml2p\"" Apr 24 16:39:26.504680 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.504663 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 16:39:26.504897 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.504657 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 16:39:26.507703 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.507684 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-92bp8" Apr 24 16:39:26.507923 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.507891 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:26.508027 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:26.507987 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hknkz" podUID="28a21434-57fa-420d-a7e4-3b011eb1dc0e" Apr 24 16:39:26.510052 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.509673 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 16:39:26.510052 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.509868 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 16:39:26.510227 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.510206 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 16:39:26.510294 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.510276 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 16:39:26.510518 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.510498 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rg4d4\"" Apr 24 16:39:26.510696 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.510678 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 16:39:26.512375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.512352 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zdpfk" Apr 24 16:39:26.512783 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.512523 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.514601 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.514165 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 16:39:26.514601 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.514187 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:39:26.514601 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.514249 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 16:39:26.514601 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.514257 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 16:39:26.514601 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.514575 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 16:39:26.514948 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.514628 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k" Apr 24 16:39:26.514948 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.514639 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-flcs7\"" Apr 24 16:39:26.514948 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.514576 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 16:39:26.514948 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.514916 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-9mjfw\"" Apr 24 16:39:26.515134 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.515003 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 16:39:26.515134 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.515071 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 16:39:26.515235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.515153 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 16:39:26.516691 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.516674 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 16:39:26.516805 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.516706 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-42nv5\"" Apr 24 16:39:26.516994 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.516975 2579 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 16:39:26.517070 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.517049 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" Apr 24 16:39:26.517324 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.517291 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 16:39:26.517584 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.517566 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-cnibin\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8" Apr 24 16:39:26.517680 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.517592 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-os-release\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8" Apr 24 16:39:26.517680 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.517619 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-cni-binary-copy\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8" Apr 24 16:39:26.517680 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.517642 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-tuning-conf-dir\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8" Apr 24 16:39:26.517680 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.517659 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8" Apr 24 16:39:26.517680 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.517676 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8" Apr 24 16:39:26.517936 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.517700 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q65sl\" (UniqueName: \"kubernetes.io/projected/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-kube-api-access-q65sl\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8" Apr 24 16:39:26.517936 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.517716 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tfc7\" (UniqueName: 
\"kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7\") pod \"network-check-target-hknkz\" (UID: \"28a21434-57fa-420d-a7e4-3b011eb1dc0e\") " pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:26.517936 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.517806 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d7c17281-bc8e-4ae5-bf1c-eaf465abf88b-agent-certs\") pod \"konnectivity-agent-b565t\" (UID: \"d7c17281-bc8e-4ae5-bf1c-eaf465abf88b\") " pod="kube-system/konnectivity-agent-b565t" Apr 24 16:39:26.517936 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.517829 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d7c17281-bc8e-4ae5-bf1c-eaf465abf88b-konnectivity-ca\") pod \"konnectivity-agent-b565t\" (UID: \"d7c17281-bc8e-4ae5-bf1c-eaf465abf88b\") " pod="kube-system/konnectivity-agent-b565t" Apr 24 16:39:26.517936 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.517851 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-system-cni-dir\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8" Apr 24 16:39:26.519767 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.519372 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-7grsg" Apr 24 16:39:26.520266 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.520063 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:39:26.520266 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.520129 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jbtpc\"" Apr 24 16:39:26.520520 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.520504 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 16:39:26.521294 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.521275 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qvpdb\"" Apr 24 16:39:26.521386 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.521302 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 16:39:26.521386 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.521338 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 16:39:26.521630 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.521611 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kdlh8" Apr 24 16:39:26.523624 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.523604 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 16:39:26.523718 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.523634 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 16:39:26.523718 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.523706 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-phdpd\"" Apr 24 16:39:26.523854 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.523611 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 16:39:26.524217 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.524168 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.525920 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.525904 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-j8bfh\"" Apr 24 16:39:26.525987 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.525923 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 16:39:26.527337 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.527321 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:26.527440 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:26.527382 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3" Apr 24 16:39:26.558173 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.558101 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:34:25 +0000 UTC" deadline="2027-10-26 23:25:57.786995159 +0000 UTC" Apr 24 16:39:26.558173 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.558125 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13206h46m31.228872312s" Apr 24 16:39:26.608012 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.607982 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 16:39:26.618037 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618009 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-multus-conf-dir\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.618190 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618047 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-run-multus-certs\") pod \"multus-ng4dd\" (UID: 
\"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.618190 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618068 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/53e5c63d-2459-4e1a-be74-d81070bbbeff-tmp-dir\") pod \"node-resolver-7grsg\" (UID: \"53e5c63d-2459-4e1a-be74-d81070bbbeff\") " pod="openshift-dns/node-resolver-7grsg" Apr 24 16:39:26.618190 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618099 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdnkg\" (UniqueName: \"kubernetes.io/projected/53e5c63d-2459-4e1a-be74-d81070bbbeff-kube-api-access-zdnkg\") pod \"node-resolver-7grsg\" (UID: \"53e5c63d-2459-4e1a-be74-d81070bbbeff\") " pod="openshift-dns/node-resolver-7grsg" Apr 24 16:39:26.618190 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618125 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-system-cni-dir\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8" Apr 24 16:39:26.618190 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618152 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8" Apr 24 16:39:26.618190 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618169 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-run-ovn\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.618190 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618185 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-sys-fs\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k" Apr 24 16:39:26.618515 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618202 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/95e0f061-d036-4e7f-9fd6-e08dcb11ea48-host-slash\") pod \"iptables-alerter-zdpfk\" (UID: \"95e0f061-d036-4e7f-9fd6-e08dcb11ea48\") " pod="openshift-network-operator/iptables-alerter-zdpfk" Apr 24 16:39:26.618515 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618225 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-system-cni-dir\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8" Apr 24 16:39:26.618515 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618227 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-multus-daemon-config\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.618515 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618277 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-modprobe-d\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" Apr 24 16:39:26.618515 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618314 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-kubelet\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.618515 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618346 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-log-socket\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.618515 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618370 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-device-dir\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k" Apr 24 16:39:26.618515 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618398 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-node-log\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.618515 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618425 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99w8w\" (UniqueName: \"kubernetes.io/projected/78ae7e56-20b9-4ea1-a705-310d2202a743-kube-api-access-99w8w\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k" Apr 24 16:39:26.618515 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618452 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-multus-cni-dir\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.619006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618521 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-multus-socket-dir-parent\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.619006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618548 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-var-lib-kubelet\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.619006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618573 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/53e5c63d-2459-4e1a-be74-d81070bbbeff-hosts-file\") pod \"node-resolver-7grsg\" (UID: \"53e5c63d-2459-4e1a-be74-d81070bbbeff\") " pod="openshift-dns/node-resolver-7grsg" Apr 24 16:39:26.619006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618600 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvdgh\" (UniqueName: \"kubernetes.io/projected/801dc599-67b9-400f-a14c-835900dba24e-kube-api-access-tvdgh\") pod \"node-ca-kdlh8\" (UID: \"801dc599-67b9-400f-a14c-835900dba24e\") " pod="openshift-image-registry/node-ca-kdlh8" Apr 24 16:39:26.619006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618639 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-cni-binary-copy\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8" Apr 24 16:39:26.619006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618665 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e8f7734-976a-4f89-b273-9e519d952582-ovnkube-script-lib\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.619006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618688 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-cni-binary-copy\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.619006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618704 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-sysctl-d\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.619006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618721 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/801dc599-67b9-400f-a14c-835900dba24e-serviceca\") pod \"node-ca-kdlh8\" (UID: \"801dc599-67b9-400f-a14c-835900dba24e\") " pod="openshift-image-registry/node-ca-kdlh8"
Apr 24 16:39:26.619006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618759 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-socket-dir\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k"
Apr 24 16:39:26.619006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618791 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-tuning-conf-dir\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8"
Apr 24 16:39:26.619006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618826 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-run-systemd\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.619006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618842 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e8f7734-976a-4f89-b273-9e519d952582-ovnkube-config\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.619006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618857 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-sys\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.619006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618875 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d7c17281-bc8e-4ae5-bf1c-eaf465abf88b-agent-certs\") pod \"konnectivity-agent-b565t\" (UID: \"d7c17281-bc8e-4ae5-bf1c-eaf465abf88b\") " pod="kube-system/konnectivity-agent-b565t"
Apr 24 16:39:26.619006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618890 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-cnibin\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8"
Apr 24 16:39:26.619006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618918 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-os-release\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8"
Apr 24 16:39:26.619803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618939 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q65sl\" (UniqueName: \"kubernetes.io/projected/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-kube-api-access-q65sl\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8"
Apr 24 16:39:26.619803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618990 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-var-lib-openvswitch\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.619803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619010 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e8f7734-976a-4f89-b273-9e519d952582-ovn-node-metrics-cert\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.619803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619030 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-system-cni-dir\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.619803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619056 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tfc7\" (UniqueName: \"kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7\") pod \"network-check-target-hknkz\" (UID: \"28a21434-57fa-420d-a7e4-3b011eb1dc0e\") " pod="openshift-network-diagnostics/network-check-target-hknkz"
Apr 24 16:39:26.619803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619078 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-cni-bin\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.619803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619113 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-etc-selinux\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k"
Apr 24 16:39:26.619803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619140 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-var-lib-cni-multus\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.619803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619165 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm9fj\" (UniqueName: \"kubernetes.io/projected/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-kube-api-access-lm9fj\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.619803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619189 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-systemd\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.619803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619201 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-cni-binary-copy\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8"
Apr 24 16:39:26.619803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619206 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/801dc599-67b9-400f-a14c-835900dba24e-host\") pod \"node-ca-kdlh8\" (UID: \"801dc599-67b9-400f-a14c-835900dba24e\") " pod="openshift-image-registry/node-ca-kdlh8"
Apr 24 16:39:26.619803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619253 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs\") pod \"network-metrics-daemon-kmh29\" (UID: \"4af5b09f-ceea-413a-bec5-40a2b59c7ea3\") " pod="openshift-multus/network-metrics-daemon-kmh29"
Apr 24 16:39:26.619803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619295 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-registration-dir\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k"
Apr 24 16:39:26.619803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.618785 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8"
Apr 24 16:39:26.619803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619335 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-cnibin\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.620498 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619390 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-tuning-conf-dir\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8"
Apr 24 16:39:26.620498 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619704 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-cnibin\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8"
Apr 24 16:39:26.620498 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619759 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 16:39:26.620498 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619845 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-os-release\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8"
Apr 24 16:39:26.620498 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.619769 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-kubernetes\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.620498 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620045 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-sysctl-conf\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.620498 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620070 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7b10f610-a264-47f5-82b9-6ff395db8762-etc-tuned\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.620498 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620094 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-cni-netd\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.620498 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620113 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-run\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.620498 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620132 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8"
Apr 24 16:39:26.620498 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620151 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-hostroot\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.620498 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620171 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7b10f610-a264-47f5-82b9-6ff395db8762-tmp\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.620498 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620201 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-systemd-units\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.620498 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620224 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-run-ovn-kubernetes\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.620498 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620249 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k"
Apr 24 16:39:26.620498 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620271 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-os-release\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.620498 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620316 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-var-lib-kubelet\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.621220 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620335 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-host\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.621220 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620354 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khcth\" (UniqueName: \"kubernetes.io/projected/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-kube-api-access-khcth\") pod \"network-metrics-daemon-kmh29\" (UID: \"4af5b09f-ceea-413a-bec5-40a2b59c7ea3\") " pod="openshift-multus/network-metrics-daemon-kmh29"
Apr 24 16:39:26.621220 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620396 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d7c17281-bc8e-4ae5-bf1c-eaf465abf88b-konnectivity-ca\") pod \"konnectivity-agent-b565t\" (UID: \"d7c17281-bc8e-4ae5-bf1c-eaf465abf88b\") " pod="kube-system/konnectivity-agent-b565t"
Apr 24 16:39:26.621220 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620421 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/95e0f061-d036-4e7f-9fd6-e08dcb11ea48-iptables-alerter-script\") pod \"iptables-alerter-zdpfk\" (UID: \"95e0f061-d036-4e7f-9fd6-e08dcb11ea48\") " pod="openshift-network-operator/iptables-alerter-zdpfk"
Apr 24 16:39:26.621220 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620438 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xmzv\" (UniqueName: \"kubernetes.io/projected/95e0f061-d036-4e7f-9fd6-e08dcb11ea48-kube-api-access-9xmzv\") pod \"iptables-alerter-zdpfk\" (UID: \"95e0f061-d036-4e7f-9fd6-e08dcb11ea48\") " pod="openshift-network-operator/iptables-alerter-zdpfk"
Apr 24 16:39:26.621220 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620453 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-lib-modules\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.621220 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620470 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-slash\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.621220 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620497 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-etc-openvswitch\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.621220 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620522 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-run-openvswitch\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.621220 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620548 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.621220 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620563 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e8f7734-976a-4f89-b273-9e519d952582-env-overrides\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.621220 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620577 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-etc-kubernetes\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.621220 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620606 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-run-netns\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.621220 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620623 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkpz6\" (UniqueName: \"kubernetes.io/projected/1e8f7734-976a-4f89-b273-9e519d952582-kube-api-access-tkpz6\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.621220 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620672 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8"
Apr 24 16:39:26.621220 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620689 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-run-k8s-cni-cncf-io\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.621867 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620793 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-sysconfig\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.621867 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620829 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntgf7\" (UniqueName: \"kubernetes.io/projected/7b10f610-a264-47f5-82b9-6ff395db8762-kube-api-access-ntgf7\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.621867 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620856 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-run-netns\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.621867 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.620884 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-var-lib-cni-bin\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.621867 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.621194 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d7c17281-bc8e-4ae5-bf1c-eaf465abf88b-konnectivity-ca\") pod \"konnectivity-agent-b565t\" (UID: \"d7c17281-bc8e-4ae5-bf1c-eaf465abf88b\") " pod="kube-system/konnectivity-agent-b565t"
Apr 24 16:39:26.624376 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.624327 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d7c17281-bc8e-4ae5-bf1c-eaf465abf88b-agent-certs\") pod \"konnectivity-agent-b565t\" (UID: \"d7c17281-bc8e-4ae5-bf1c-eaf465abf88b\") " pod="kube-system/konnectivity-agent-b565t"
Apr 24 16:39:26.624499 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:26.624452 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:26.624499 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:26.624467 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:26.624499 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:26.624479 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7tfc7 for pod openshift-network-diagnostics/network-check-target-hknkz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:26.624605 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:26.624542 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7 podName:28a21434-57fa-420d-a7e4-3b011eb1dc0e nodeName:}" failed. No retries permitted until 2026-04-24 16:39:27.124525542 +0000 UTC m=+3.057594735 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7tfc7" (UniqueName: "kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7") pod "network-check-target-hknkz" (UID: "28a21434-57fa-420d-a7e4-3b011eb1dc0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:26.626721 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.626697 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q65sl\" (UniqueName: \"kubernetes.io/projected/54f6ae1b-e6ac-48cd-92cc-1c9ae843b609-kube-api-access-q65sl\") pod \"multus-additional-cni-plugins-92bp8\" (UID: \"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609\") " pod="openshift-multus/multus-additional-cni-plugins-92bp8"
Apr 24 16:39:26.721428 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.721397 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e8f7734-976a-4f89-b273-9e519d952582-ovnkube-script-lib\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.721428 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.721430 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-cni-binary-copy\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.721667 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.721456 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-sysctl-d\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.721667 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.721482 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/801dc599-67b9-400f-a14c-835900dba24e-serviceca\") pod \"node-ca-kdlh8\" (UID: \"801dc599-67b9-400f-a14c-835900dba24e\") " pod="openshift-image-registry/node-ca-kdlh8"
Apr 24 16:39:26.722005 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.721979 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/801dc599-67b9-400f-a14c-835900dba24e-serviceca\") pod \"node-ca-kdlh8\" (UID: \"801dc599-67b9-400f-a14c-835900dba24e\") " pod="openshift-image-registry/node-ca-kdlh8"
Apr 24 16:39:26.722086 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722042 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-cni-binary-copy\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.722136 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722082 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e8f7734-976a-4f89-b273-9e519d952582-ovnkube-script-lib\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.722136 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722098 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-socket-dir\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k"
Apr 24 16:39:26.722136 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722130 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-run-systemd\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.722284 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722152 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-sysctl-d\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.722284 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722156 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e8f7734-976a-4f89-b273-9e519d952582-ovnkube-config\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.722284 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722184 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-socket-dir\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k"
Apr 24 16:39:26.722284 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722195 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-run-systemd\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.722284 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722188 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-sys\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.722284 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722235 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-sys\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.722284 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722234 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-var-lib-openvswitch\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.722284 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722257 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-var-lib-openvswitch\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.722284 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722274 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e8f7734-976a-4f89-b273-9e519d952582-ovn-node-metrics-cert\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722299 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-system-cni-dir\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722340 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-cni-bin\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722384 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-etc-selinux\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k"
Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722424 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-var-lib-cni-multus\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722427 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-system-cni-dir\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722452 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lm9fj\" (UniqueName: \"kubernetes.io/projected/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-kube-api-access-lm9fj\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722477 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-systemd\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722483 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-etc-selinux\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k"
Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424
16:39:26.722509 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/801dc599-67b9-400f-a14c-835900dba24e-host\") pod \"node-ca-kdlh8\" (UID: \"801dc599-67b9-400f-a14c-835900dba24e\") " pod="openshift-image-registry/node-ca-kdlh8" Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722531 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs\") pod \"network-metrics-daemon-kmh29\" (UID: \"4af5b09f-ceea-413a-bec5-40a2b59c7ea3\") " pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722549 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-registration-dir\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k" Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722551 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-systemd\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722568 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-cnibin\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722593 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-kubernetes\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722598 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-var-lib-cni-multus\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722621 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-sysctl-conf\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722638 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7b10f610-a264-47f5-82b9-6ff395db8762-etc-tuned\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" Apr 24 16:39:26.722681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722663 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-cni-netd\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.723389 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722676 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e8f7734-976a-4f89-b273-9e519d952582-ovnkube-config\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.723389 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722749 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-run\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" Apr 24 16:39:26.723389 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:26.722760 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:26.723389 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722780 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-cni-bin\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.723389 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:26.722807 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs podName:4af5b09f-ceea-413a-bec5-40a2b59c7ea3 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:27.222794562 +0000 UTC m=+3.155863742 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs") pod "network-metrics-daemon-kmh29" (UID: "4af5b09f-ceea-413a-bec5-40a2b59c7ea3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:26.723389 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722808 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/801dc599-67b9-400f-a14c-835900dba24e-host\") pod \"node-ca-kdlh8\" (UID: \"801dc599-67b9-400f-a14c-835900dba24e\") " pod="openshift-image-registry/node-ca-kdlh8" Apr 24 16:39:26.723389 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722846 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-cnibin\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.723389 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722851 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-registration-dir\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k" Apr 24 16:39:26.723389 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722682 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-run\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" Apr 24 16:39:26.723389 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722883 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"hostroot\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-hostroot\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.723389 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722887 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-kubernetes\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" Apr 24 16:39:26.723389 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722906 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7b10f610-a264-47f5-82b9-6ff395db8762-tmp\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" Apr 24 16:39:26.723389 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722923 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-cni-netd\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.723389 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.723322 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-sysctl-conf\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" Apr 24 16:39:26.731038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.722948 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-hostroot\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.731038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.725174 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-systemd-units\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.731038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.725454 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-systemd-units\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.731038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.725497 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-run-ovn-kubernetes\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.731038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.725530 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k" Apr 24 16:39:26.731038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.725560 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-os-release\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.731038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.725589 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-var-lib-kubelet\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" Apr 24 16:39:26.731038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726090 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-host\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" Apr 24 16:39:26.731038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726129 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khcth\" (UniqueName: \"kubernetes.io/projected/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-kube-api-access-khcth\") pod \"network-metrics-daemon-kmh29\" (UID: \"4af5b09f-ceea-413a-bec5-40a2b59c7ea3\") " pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:26.731038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726166 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/95e0f061-d036-4e7f-9fd6-e08dcb11ea48-iptables-alerter-script\") pod \"iptables-alerter-zdpfk\" (UID: \"95e0f061-d036-4e7f-9fd6-e08dcb11ea48\") " pod="openshift-network-operator/iptables-alerter-zdpfk" Apr 24 16:39:26.731038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726197 2579 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-9xmzv\" (UniqueName: \"kubernetes.io/projected/95e0f061-d036-4e7f-9fd6-e08dcb11ea48-kube-api-access-9xmzv\") pod \"iptables-alerter-zdpfk\" (UID: \"95e0f061-d036-4e7f-9fd6-e08dcb11ea48\") " pod="openshift-network-operator/iptables-alerter-zdpfk" Apr 24 16:39:26.731038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726226 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-lib-modules\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" Apr 24 16:39:26.731038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726249 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-slash\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.731038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726280 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-etc-openvswitch\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.731038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726315 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-run-openvswitch\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.731038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726349 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.731038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726380 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e8f7734-976a-4f89-b273-9e519d952582-env-overrides\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.731891 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726405 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-etc-kubernetes\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.731891 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726435 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-run-netns\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.731891 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726465 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkpz6\" (UniqueName: \"kubernetes.io/projected/1e8f7734-976a-4f89-b273-9e519d952582-kube-api-access-tkpz6\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.731891 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:39:26.726493 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-run-k8s-cni-cncf-io\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.731891 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726520 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-sysconfig\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" Apr 24 16:39:26.731891 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726545 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntgf7\" (UniqueName: \"kubernetes.io/projected/7b10f610-a264-47f5-82b9-6ff395db8762-kube-api-access-ntgf7\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" Apr 24 16:39:26.731891 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726575 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-run-netns\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.731891 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726605 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-var-lib-cni-bin\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.731891 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:39:26.726636 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-multus-conf-dir\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.731891 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726671 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-run-multus-certs\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.731891 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726700 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/53e5c63d-2459-4e1a-be74-d81070bbbeff-tmp-dir\") pod \"node-resolver-7grsg\" (UID: \"53e5c63d-2459-4e1a-be74-d81070bbbeff\") " pod="openshift-dns/node-resolver-7grsg" Apr 24 16:39:26.731891 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdnkg\" (UniqueName: \"kubernetes.io/projected/53e5c63d-2459-4e1a-be74-d81070bbbeff-kube-api-access-zdnkg\") pod \"node-resolver-7grsg\" (UID: \"53e5c63d-2459-4e1a-be74-d81070bbbeff\") " pod="openshift-dns/node-resolver-7grsg" Apr 24 16:39:26.731891 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726778 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-run-ovn\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.731891 ip-10-0-129-227 kubenswrapper[2579]: I0424 
16:39:26.726808 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-sys-fs\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k" Apr 24 16:39:26.731891 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726843 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/95e0f061-d036-4e7f-9fd6-e08dcb11ea48-host-slash\") pod \"iptables-alerter-zdpfk\" (UID: \"95e0f061-d036-4e7f-9fd6-e08dcb11ea48\") " pod="openshift-network-operator/iptables-alerter-zdpfk" Apr 24 16:39:26.731891 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726872 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-multus-daemon-config\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 24 16:39:26.731891 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726901 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-modprobe-d\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" Apr 24 16:39:26.731891 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726933 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-kubelet\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.732683 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:39:26.726959 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-log-socket\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.732683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.726989 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-device-dir\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k" Apr 24 16:39:26.732683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727023 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-node-log\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:26.732683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727052 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99w8w\" (UniqueName: \"kubernetes.io/projected/78ae7e56-20b9-4ea1-a705-310d2202a743-kube-api-access-99w8w\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k" Apr 24 16:39:26.732683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727084 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-multus-cni-dir\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd" Apr 
24 16:39:26.732683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727110 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-multus-socket-dir-parent\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.732683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727141 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-var-lib-kubelet\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.732683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727170 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/53e5c63d-2459-4e1a-be74-d81070bbbeff-hosts-file\") pod \"node-resolver-7grsg\" (UID: \"53e5c63d-2459-4e1a-be74-d81070bbbeff\") " pod="openshift-dns/node-resolver-7grsg"
Apr 24 16:39:26.732683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727199 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvdgh\" (UniqueName: \"kubernetes.io/projected/801dc599-67b9-400f-a14c-835900dba24e-kube-api-access-tvdgh\") pod \"node-ca-kdlh8\" (UID: \"801dc599-67b9-400f-a14c-835900dba24e\") " pod="openshift-image-registry/node-ca-kdlh8"
Apr 24 16:39:26.732683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727467 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7b10f610-a264-47f5-82b9-6ff395db8762-tmp\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.732683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727535 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-run-netns\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.732683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727586 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-slash\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.732683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727648 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-run-multus-certs\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.732683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727702 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-run-ovn-kubernetes\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.732683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727783 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k"
Apr 24 16:39:26.732683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727805 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-log-socket\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.732683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727823 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-etc-openvswitch\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.732683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727853 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-os-release\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727883 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/53e5c63d-2459-4e1a-be74-d81070bbbeff-tmp-dir\") pod \"node-resolver-7grsg\" (UID: \"53e5c63d-2459-4e1a-be74-d81070bbbeff\") " pod="openshift-dns/node-resolver-7grsg"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727885 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-run-ovn\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727918 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-kubelet\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727948 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-node-log\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727951 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727977 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-var-lib-kubelet\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727994 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-run-openvswitch\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.728044 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-run-k8s-cni-cncf-io\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.728045 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-host\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.728110 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-etc-kubernetes\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.728128 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-multus-cni-dir\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.728156 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e8f7734-976a-4f89-b273-9e519d952582-host-run-netns\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.728190 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-multus-socket-dir-parent\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.728248 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-var-lib-kubelet\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.728300 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/53e5c63d-2459-4e1a-be74-d81070bbbeff-hosts-file\") pod \"node-resolver-7grsg\" (UID: \"53e5c63d-2459-4e1a-be74-d81070bbbeff\") " pod="openshift-dns/node-resolver-7grsg"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.728357 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-sysconfig\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.728408 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-lib-modules\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.733235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.727589 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-host-var-lib-cni-bin\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.733866 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.728492 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-multus-conf-dir\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.733866 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.728537 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/95e0f061-d036-4e7f-9fd6-e08dcb11ea48-host-slash\") pod \"iptables-alerter-zdpfk\" (UID: \"95e0f061-d036-4e7f-9fd6-e08dcb11ea48\") " pod="openshift-network-operator/iptables-alerter-zdpfk"
Apr 24 16:39:26.733866 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.728599 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7b10f610-a264-47f5-82b9-6ff395db8762-etc-modprobe-d\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.733866 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.728631 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-sys-fs\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k"
Apr 24 16:39:26.733866 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.728658 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/78ae7e56-20b9-4ea1-a705-310d2202a743-device-dir\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k"
Apr 24 16:39:26.733866 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.729007 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7b10f610-a264-47f5-82b9-6ff395db8762-etc-tuned\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.733866 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.729072 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e8f7734-976a-4f89-b273-9e519d952582-ovn-node-metrics-cert\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.733866 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.729113 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/95e0f061-d036-4e7f-9fd6-e08dcb11ea48-iptables-alerter-script\") pod \"iptables-alerter-zdpfk\" (UID: \"95e0f061-d036-4e7f-9fd6-e08dcb11ea48\") " pod="openshift-network-operator/iptables-alerter-zdpfk"
Apr 24 16:39:26.733866 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.730278 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e8f7734-976a-4f89-b273-9e519d952582-env-overrides\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.733866 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.730377 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-multus-daemon-config\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.734880 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.734857 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm9fj\" (UniqueName: \"kubernetes.io/projected/9afa95f3-5eb1-47b0-b00d-a487cb566ffc-kube-api-access-lm9fj\") pod \"multus-ng4dd\" (UID: \"9afa95f3-5eb1-47b0-b00d-a487cb566ffc\") " pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.736023 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.736005 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvdgh\" (UniqueName: \"kubernetes.io/projected/801dc599-67b9-400f-a14c-835900dba24e-kube-api-access-tvdgh\") pod \"node-ca-kdlh8\" (UID: \"801dc599-67b9-400f-a14c-835900dba24e\") " pod="openshift-image-registry/node-ca-kdlh8"
Apr 24 16:39:26.738909 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.738837 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khcth\" (UniqueName: \"kubernetes.io/projected/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-kube-api-access-khcth\") pod \"network-metrics-daemon-kmh29\" (UID: \"4af5b09f-ceea-413a-bec5-40a2b59c7ea3\") " pod="openshift-multus/network-metrics-daemon-kmh29"
Apr 24 16:39:26.739190 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.739168 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntgf7\" (UniqueName: \"kubernetes.io/projected/7b10f610-a264-47f5-82b9-6ff395db8762-kube-api-access-ntgf7\") pod \"tuned-tc7zr\" (UID: \"7b10f610-a264-47f5-82b9-6ff395db8762\") " pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.739375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.739358 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkpz6\" (UniqueName: \"kubernetes.io/projected/1e8f7734-976a-4f89-b273-9e519d952582-kube-api-access-tkpz6\") pod \"ovnkube-node-7fpht\" (UID: \"1e8f7734-976a-4f89-b273-9e519d952582\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.739700 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.739681 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xmzv\" (UniqueName: \"kubernetes.io/projected/95e0f061-d036-4e7f-9fd6-e08dcb11ea48-kube-api-access-9xmzv\") pod \"iptables-alerter-zdpfk\" (UID: \"95e0f061-d036-4e7f-9fd6-e08dcb11ea48\") " pod="openshift-network-operator/iptables-alerter-zdpfk"
Apr 24 16:39:26.739700 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.739692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdnkg\" (UniqueName: \"kubernetes.io/projected/53e5c63d-2459-4e1a-be74-d81070bbbeff-kube-api-access-zdnkg\") pod \"node-resolver-7grsg\" (UID: \"53e5c63d-2459-4e1a-be74-d81070bbbeff\") " pod="openshift-dns/node-resolver-7grsg"
Apr 24 16:39:26.739909 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.739888 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99w8w\" (UniqueName: \"kubernetes.io/projected/78ae7e56-20b9-4ea1-a705-310d2202a743-kube-api-access-99w8w\") pod \"aws-ebs-csi-driver-node-4vz8k\" (UID: \"78ae7e56-20b9-4ea1-a705-310d2202a743\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k"
Apr 24 16:39:26.771923 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.771900 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:26.814026 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.813986 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-b565t"
Apr 24 16:39:26.823746 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.823711 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-92bp8"
Apr 24 16:39:26.831365 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.831344 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zdpfk"
Apr 24 16:39:26.838881 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.838859 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kdlh8"
Apr 24 16:39:26.844497 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.844471 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:39:26.852109 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.852088 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k"
Apr 24 16:39:26.857645 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.857630 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tc7zr"
Apr 24 16:39:26.864188 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.864170 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7grsg"
Apr 24 16:39:26.867683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.867668 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ng4dd"
Apr 24 16:39:26.920556 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:26.920527 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:27.130889 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:27.130834 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tfc7\" (UniqueName: \"kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7\") pod \"network-check-target-hknkz\" (UID: \"28a21434-57fa-420d-a7e4-3b011eb1dc0e\") " pod="openshift-network-diagnostics/network-check-target-hknkz"
Apr 24 16:39:27.131078 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:27.131013 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:27.131078 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:27.131044 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:27.131078 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:27.131058 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7tfc7 for pod openshift-network-diagnostics/network-check-target-hknkz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:27.131234 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:27.131130 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7 podName:28a21434-57fa-420d-a7e4-3b011eb1dc0e nodeName:}" failed. No retries permitted until 2026-04-24 16:39:28.131109175 +0000 UTC m=+4.064178352 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7tfc7" (UniqueName: "kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7") pod "network-check-target-hknkz" (UID: "28a21434-57fa-420d-a7e4-3b011eb1dc0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:27.231690 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:27.231647 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs\") pod \"network-metrics-daemon-kmh29\" (UID: \"4af5b09f-ceea-413a-bec5-40a2b59c7ea3\") " pod="openshift-multus/network-metrics-daemon-kmh29"
Apr 24 16:39:27.231888 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:27.231805 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:27.231888 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:27.231879 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs podName:4af5b09f-ceea-413a-bec5-40a2b59c7ea3 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:28.231860346 +0000 UTC m=+4.164929544 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs") pod "network-metrics-daemon-kmh29" (UID: "4af5b09f-ceea-413a-bec5-40a2b59c7ea3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:27.253403 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:27.253367 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78ae7e56_20b9_4ea1_a705_310d2202a743.slice/crio-ad0a9461114e8dd88e622fb0e64cfa80fb0bc699c1f4e27d7ff541b693aef72d WatchSource:0}: Error finding container ad0a9461114e8dd88e622fb0e64cfa80fb0bc699c1f4e27d7ff541b693aef72d: Status 404 returned error can't find the container with id ad0a9461114e8dd88e622fb0e64cfa80fb0bc699c1f4e27d7ff541b693aef72d
Apr 24 16:39:27.254456 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:27.254436 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54f6ae1b_e6ac_48cd_92cc_1c9ae843b609.slice/crio-7045105c7c92edfee93fa4c4257afa9047cbb730bb217bd5175940a5081306a8 WatchSource:0}: Error finding container 7045105c7c92edfee93fa4c4257afa9047cbb730bb217bd5175940a5081306a8: Status 404 returned error can't find the container with id 7045105c7c92edfee93fa4c4257afa9047cbb730bb217bd5175940a5081306a8
Apr 24 16:39:27.256576 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:27.256552 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e8f7734_976a_4f89_b273_9e519d952582.slice/crio-933df4bdd6fb4fd6b7ea04f45d5761282154073bd8bdf28bcb34b7e449b59169 WatchSource:0}: Error finding container 933df4bdd6fb4fd6b7ea04f45d5761282154073bd8bdf28bcb34b7e449b59169: Status 404 returned error can't find the container with id 933df4bdd6fb4fd6b7ea04f45d5761282154073bd8bdf28bcb34b7e449b59169
Apr 24 16:39:27.259673 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:27.259651 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b10f610_a264_47f5_82b9_6ff395db8762.slice/crio-093e10cb9d4a1938417244b4804c7558506051dc6c82774d58ac1510d657d394 WatchSource:0}: Error finding container 093e10cb9d4a1938417244b4804c7558506051dc6c82774d58ac1510d657d394: Status 404 returned error can't find the container with id 093e10cb9d4a1938417244b4804c7558506051dc6c82774d58ac1510d657d394
Apr 24 16:39:27.260714 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:27.260692 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95e0f061_d036_4e7f_9fd6_e08dcb11ea48.slice/crio-4f35201633a010590ab7e5e7becac0dcfa67640b07445f70b764f0ea55775318 WatchSource:0}: Error finding container 4f35201633a010590ab7e5e7becac0dcfa67640b07445f70b764f0ea55775318: Status 404 returned error can't find the container with id 4f35201633a010590ab7e5e7becac0dcfa67640b07445f70b764f0ea55775318
Apr 24 16:39:27.261667 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:27.261532 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9afa95f3_5eb1_47b0_b00d_a487cb566ffc.slice/crio-506e5120bb5af34e50e4c2236461fc7f743dd9f010b69cfa60f766909e5be048 WatchSource:0}: Error finding container 506e5120bb5af34e50e4c2236461fc7f743dd9f010b69cfa60f766909e5be048: Status 404 returned error can't find the container with id 506e5120bb5af34e50e4c2236461fc7f743dd9f010b69cfa60f766909e5be048
Apr 24 16:39:27.262709 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:27.262684 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7c17281_bc8e_4ae5_bf1c_eaf465abf88b.slice/crio-c13ce6b310288885ab69c4e34025279c794342b5950cb0c9b5fc6f7def224934 WatchSource:0}: Error finding container c13ce6b310288885ab69c4e34025279c794342b5950cb0c9b5fc6f7def224934: Status 404 returned error can't find the container with id c13ce6b310288885ab69c4e34025279c794342b5950cb0c9b5fc6f7def224934
Apr 24 16:39:27.266045 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:39:27.266018 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod801dc599_67b9_400f_a14c_835900dba24e.slice/crio-cbeb917885a65335579dfde2995c6efc56342f20d7c2dd70e9b22c7821f8c8d7 WatchSource:0}: Error finding container cbeb917885a65335579dfde2995c6efc56342f20d7c2dd70e9b22c7821f8c8d7: Status 404 returned error can't find the container with id cbeb917885a65335579dfde2995c6efc56342f20d7c2dd70e9b22c7821f8c8d7
Apr 24 16:39:27.558880 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:27.558660 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:34:25 +0000 UTC" deadline="2028-01-17 02:16:30.122624282 +0000 UTC"
Apr 24 16:39:27.558880 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:27.558876 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15177h37m2.563752761s"
Apr 24 16:39:27.615315 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:27.615268 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29"
Apr 24 16:39:27.615486 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:27.615422 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3"
Apr 24 16:39:27.627171 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:27.627132 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-227.ec2.internal" event={"ID":"abe1697a38d64560f353e38f63759680","Type":"ContainerStarted","Data":"169794bf4481ca74f5a8e7e1d5a90c59f6877db81487bd58ae0d74d7cd9fba85"}
Apr 24 16:39:27.631638 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:27.630957 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zdpfk" event={"ID":"95e0f061-d036-4e7f-9fd6-e08dcb11ea48","Type":"ContainerStarted","Data":"4f35201633a010590ab7e5e7becac0dcfa67640b07445f70b764f0ea55775318"}
Apr 24 16:39:27.635854 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:27.635818 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" event={"ID":"7b10f610-a264-47f5-82b9-6ff395db8762","Type":"ContainerStarted","Data":"093e10cb9d4a1938417244b4804c7558506051dc6c82774d58ac1510d657d394"}
Apr 24 16:39:27.638193 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:27.638140 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" event={"ID":"1e8f7734-976a-4f89-b273-9e519d952582","Type":"ContainerStarted","Data":"933df4bdd6fb4fd6b7ea04f45d5761282154073bd8bdf28bcb34b7e449b59169"}
Apr 24 16:39:27.639666 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:27.639624 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kdlh8" event={"ID":"801dc599-67b9-400f-a14c-835900dba24e","Type":"ContainerStarted","Data":"cbeb917885a65335579dfde2995c6efc56342f20d7c2dd70e9b22c7821f8c8d7"}
Apr 24 16:39:27.642596 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:27.642077 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-b565t" event={"ID":"d7c17281-bc8e-4ae5-bf1c-eaf465abf88b","Type":"ContainerStarted","Data":"c13ce6b310288885ab69c4e34025279c794342b5950cb0c9b5fc6f7def224934"}
Apr 24 16:39:27.642596 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:27.642123 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-227.ec2.internal" podStartSLOduration=1.642107979 podStartE2EDuration="1.642107979s" podCreationTimestamp="2026-04-24 16:39:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:27.641683841 +0000 UTC m=+3.574753039" watchObservedRunningTime="2026-04-24 16:39:27.642107979 +0000 UTC m=+3.575177179"
Apr 24 16:39:27.643293 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:27.643247 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7grsg" event={"ID":"53e5c63d-2459-4e1a-be74-d81070bbbeff","Type":"ContainerStarted","Data":"08fc667eea09793d86df3f21942405b332798eaf0412e112c5b3991feb5e287b"}
Apr 24 16:39:27.644949 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:27.644926 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ng4dd" event={"ID":"9afa95f3-5eb1-47b0-b00d-a487cb566ffc","Type":"ContainerStarted","Data":"506e5120bb5af34e50e4c2236461fc7f743dd9f010b69cfa60f766909e5be048"}
Apr 24 16:39:27.647284 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:27.647263 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92bp8" event={"ID":"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609","Type":"ContainerStarted","Data":"7045105c7c92edfee93fa4c4257afa9047cbb730bb217bd5175940a5081306a8"}
Apr 24 16:39:27.648573 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:27.648556 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k" event={"ID":"78ae7e56-20b9-4ea1-a705-310d2202a743","Type":"ContainerStarted","Data":"ad0a9461114e8dd88e622fb0e64cfa80fb0bc699c1f4e27d7ff541b693aef72d"}
Apr 24 16:39:28.139774 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:28.139515 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tfc7\" (UniqueName: \"kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7\") pod \"network-check-target-hknkz\" (UID: \"28a21434-57fa-420d-a7e4-3b011eb1dc0e\") " pod="openshift-network-diagnostics/network-check-target-hknkz"
Apr 24 16:39:28.140530 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:28.140009 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:28.140530 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:28.140050 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:28.140530 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:28.140075 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7tfc7 for pod openshift-network-diagnostics/network-check-target-hknkz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:28.140530 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:28.140150 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7 podName:28a21434-57fa-420d-a7e4-3b011eb1dc0e nodeName:}" failed. No retries permitted until 2026-04-24 16:39:30.140128541 +0000 UTC m=+6.073197723 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7tfc7" (UniqueName: "kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7") pod "network-check-target-hknkz" (UID: "28a21434-57fa-420d-a7e4-3b011eb1dc0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:28.240867 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:28.240274 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs\") pod \"network-metrics-daemon-kmh29\" (UID: \"4af5b09f-ceea-413a-bec5-40a2b59c7ea3\") " pod="openshift-multus/network-metrics-daemon-kmh29"
Apr 24 16:39:28.240867 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:28.240444 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:28.240867 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:28.240505 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs podName:4af5b09f-ceea-413a-bec5-40a2b59c7ea3 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:30.240486419 +0000 UTC m=+6.173555604 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs") pod "network-metrics-daemon-kmh29" (UID: "4af5b09f-ceea-413a-bec5-40a2b59c7ea3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:28.615842 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:28.615787 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz"
Apr 24 16:39:28.616459 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:28.615961 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hknkz" podUID="28a21434-57fa-420d-a7e4-3b011eb1dc0e"
Apr 24 16:39:28.667850 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:28.666717 2579 generic.go:358] "Generic (PLEG): container finished" podID="b4ee6da65101faf961ac9e356d355076" containerID="147bacf6b531f0326833bb69214adf042ac4268046ae02d085ee389d05a4d4a6" exitCode=0
Apr 24 16:39:28.667850 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:28.667781 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal" event={"ID":"b4ee6da65101faf961ac9e356d355076","Type":"ContainerDied","Data":"147bacf6b531f0326833bb69214adf042ac4268046ae02d085ee389d05a4d4a6"}
Apr 24 16:39:29.615959 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:29.615344 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29"
Apr 24 16:39:29.615959 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:29.615498 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3" Apr 24 16:39:29.690426 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:29.690379 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal" event={"ID":"b4ee6da65101faf961ac9e356d355076","Type":"ContainerStarted","Data":"6174f298289861ff38811f72e7609d5a20de597e028b19d3962d1908c08ff764"} Apr 24 16:39:30.163484 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:30.163444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tfc7\" (UniqueName: \"kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7\") pod \"network-check-target-hknkz\" (UID: \"28a21434-57fa-420d-a7e4-3b011eb1dc0e\") " pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:30.163685 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:30.163633 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:30.163685 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:30.163651 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:30.163685 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:30.163663 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7tfc7 for pod openshift-network-diagnostics/network-check-target-hknkz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:30.163901 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:30.163720 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7 podName:28a21434-57fa-420d-a7e4-3b011eb1dc0e nodeName:}" failed. No retries permitted until 2026-04-24 16:39:34.163702426 +0000 UTC m=+10.096771618 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7tfc7" (UniqueName: "kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7") pod "network-check-target-hknkz" (UID: "28a21434-57fa-420d-a7e4-3b011eb1dc0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:30.264022 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:30.263984 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs\") pod \"network-metrics-daemon-kmh29\" (UID: \"4af5b09f-ceea-413a-bec5-40a2b59c7ea3\") " pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:30.264215 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:30.264139 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:30.264215 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:30.264202 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs podName:4af5b09f-ceea-413a-bec5-40a2b59c7ea3 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:34.264185883 +0000 UTC m=+10.197255074 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs") pod "network-metrics-daemon-kmh29" (UID: "4af5b09f-ceea-413a-bec5-40a2b59c7ea3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:30.617869 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:30.617834 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:30.618434 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:30.617957 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hknkz" podUID="28a21434-57fa-420d-a7e4-3b011eb1dc0e" Apr 24 16:39:31.615541 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:31.615506 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:31.615720 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:31.615655 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3" Apr 24 16:39:32.615918 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:32.615884 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:32.616317 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:32.615993 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hknkz" podUID="28a21434-57fa-420d-a7e4-3b011eb1dc0e" Apr 24 16:39:33.615032 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:33.614994 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:33.615336 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:33.615128 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3" Apr 24 16:39:34.201801 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:34.201725 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tfc7\" (UniqueName: \"kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7\") pod \"network-check-target-hknkz\" (UID: \"28a21434-57fa-420d-a7e4-3b011eb1dc0e\") " pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:34.202290 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:34.201903 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:34.202290 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:34.201934 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:34.202290 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:34.201948 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7tfc7 for pod openshift-network-diagnostics/network-check-target-hknkz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:34.202290 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:34.202024 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7 podName:28a21434-57fa-420d-a7e4-3b011eb1dc0e nodeName:}" failed. No retries permitted until 2026-04-24 16:39:42.20200398 +0000 UTC m=+18.135073183 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7tfc7" (UniqueName: "kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7") pod "network-check-target-hknkz" (UID: "28a21434-57fa-420d-a7e4-3b011eb1dc0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:34.303109 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:34.303065 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs\") pod \"network-metrics-daemon-kmh29\" (UID: \"4af5b09f-ceea-413a-bec5-40a2b59c7ea3\") " pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:34.303291 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:34.303229 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:34.303366 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:34.303296 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs podName:4af5b09f-ceea-413a-bec5-40a2b59c7ea3 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:42.303276927 +0000 UTC m=+18.236346116 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs") pod "network-metrics-daemon-kmh29" (UID: "4af5b09f-ceea-413a-bec5-40a2b59c7ea3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:34.616353 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:34.616315 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:34.616519 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:34.616432 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hknkz" podUID="28a21434-57fa-420d-a7e4-3b011eb1dc0e" Apr 24 16:39:35.615647 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:35.615598 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:35.616137 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:35.615757 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3" Apr 24 16:39:36.615682 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:36.615646 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:36.616165 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:36.615804 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hknkz" podUID="28a21434-57fa-420d-a7e4-3b011eb1dc0e" Apr 24 16:39:37.615217 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:37.615178 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:37.615392 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:37.615309 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3" Apr 24 16:39:38.615622 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:38.615588 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:38.616136 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:38.615705 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hknkz" podUID="28a21434-57fa-420d-a7e4-3b011eb1dc0e" Apr 24 16:39:39.616412 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:39.615858 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:39.616412 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:39.616027 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3" Apr 24 16:39:40.615511 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:40.615470 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:40.615696 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:40.615580 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hknkz" podUID="28a21434-57fa-420d-a7e4-3b011eb1dc0e" Apr 24 16:39:41.615609 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:41.615574 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:41.616116 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:41.615717 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3" Apr 24 16:39:42.258202 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:42.258164 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tfc7\" (UniqueName: \"kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7\") pod \"network-check-target-hknkz\" (UID: \"28a21434-57fa-420d-a7e4-3b011eb1dc0e\") " pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:42.258397 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:42.258361 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:42.258397 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:42.258389 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:42.258510 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:42.258403 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7tfc7 for pod openshift-network-diagnostics/network-check-target-hknkz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:42.258510 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:42.258474 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7 podName:28a21434-57fa-420d-a7e4-3b011eb1dc0e nodeName:}" failed. No retries permitted until 2026-04-24 16:39:58.258454707 +0000 UTC m=+34.191523902 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7tfc7" (UniqueName: "kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7") pod "network-check-target-hknkz" (UID: "28a21434-57fa-420d-a7e4-3b011eb1dc0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:42.358959 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:42.358909 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs\") pod \"network-metrics-daemon-kmh29\" (UID: \"4af5b09f-ceea-413a-bec5-40a2b59c7ea3\") " pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:42.359126 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:42.359032 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:42.359126 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:42.359098 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs podName:4af5b09f-ceea-413a-bec5-40a2b59c7ea3 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:58.359076225 +0000 UTC m=+34.292145404 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs") pod "network-metrics-daemon-kmh29" (UID: "4af5b09f-ceea-413a-bec5-40a2b59c7ea3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:42.615638 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:42.615548 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:42.616071 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:42.615688 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hknkz" podUID="28a21434-57fa-420d-a7e4-3b011eb1dc0e" Apr 24 16:39:43.615837 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:43.615807 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:43.616269 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:43.615930 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3" Apr 24 16:39:44.618799 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:44.617958 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:44.618799 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:44.618291 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hknkz" podUID="28a21434-57fa-420d-a7e4-3b011eb1dc0e" Apr 24 16:39:45.615657 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.615472 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:45.615815 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:45.615785 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3" Apr 24 16:39:45.713681 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.713658 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 16:39:45.748572 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.748542 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7grsg" event={"ID":"53e5c63d-2459-4e1a-be74-d81070bbbeff","Type":"ContainerStarted","Data":"250dc613a14391db644097a98860700240251ef5fc338fb0415e46ee53ddc2d2"} Apr 24 16:39:45.749943 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.749911 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ng4dd" event={"ID":"9afa95f3-5eb1-47b0-b00d-a487cb566ffc","Type":"ContainerStarted","Data":"4bc42f0e1315d243fb2dc6af91061e9d9d193d25208fa6087834188c2f432a9c"} Apr 24 16:39:45.751266 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.751234 2579 generic.go:358] "Generic (PLEG): container finished" podID="54f6ae1b-e6ac-48cd-92cc-1c9ae843b609" containerID="03dcc8a6e3cff1c2dc71fce5e53741e1f3b3eebb15d95e8c0d33c53755a92995" exitCode=0 Apr 24 16:39:45.751351 
ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.751309 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92bp8" event={"ID":"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609","Type":"ContainerDied","Data":"03dcc8a6e3cff1c2dc71fce5e53741e1f3b3eebb15d95e8c0d33c53755a92995"} Apr 24 16:39:45.753002 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.752982 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k" event={"ID":"78ae7e56-20b9-4ea1-a705-310d2202a743","Type":"ContainerStarted","Data":"f45e96426c96ba3972035cd0fb70b448d85379bb574aff7defbca392c629fdc5"} Apr 24 16:39:45.753100 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.753009 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k" event={"ID":"78ae7e56-20b9-4ea1-a705-310d2202a743","Type":"ContainerStarted","Data":"5bfce0aec7228751467ecb77cb3303558e948547d9814ba891ba8688d5fb5841"} Apr 24 16:39:45.754268 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.754246 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" event={"ID":"7b10f610-a264-47f5-82b9-6ff395db8762","Type":"ContainerStarted","Data":"792dbdf59642168079ade0a84d0887b0a11e44294fbdba9afde2ced309f5583b"} Apr 24 16:39:45.756494 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.756472 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" event={"ID":"1e8f7734-976a-4f89-b273-9e519d952582","Type":"ContainerStarted","Data":"f3e5731b4ee5d2943a8157e473cbebd8b26a44ea4e17894f2641f9aeeb0eeaba"} Apr 24 16:39:45.756585 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.756502 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" 
event={"ID":"1e8f7734-976a-4f89-b273-9e519d952582","Type":"ContainerStarted","Data":"a9b83caba812647e4382f66bcfae49864c5e193623eb39d27445491c39f71bb7"} Apr 24 16:39:45.756585 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.756516 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" event={"ID":"1e8f7734-976a-4f89-b273-9e519d952582","Type":"ContainerStarted","Data":"7a1e355d50b7b1d1fc2ce32893ade715d27cd307f14f4190e3eafab65d6d83ea"} Apr 24 16:39:45.756585 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.756531 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" event={"ID":"1e8f7734-976a-4f89-b273-9e519d952582","Type":"ContainerStarted","Data":"27d5fc830cbf0421d6e84235d3de3433ad953eda63b7b03c1e7cd8f398d1ec0c"} Apr 24 16:39:45.756585 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.756542 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" event={"ID":"1e8f7734-976a-4f89-b273-9e519d952582","Type":"ContainerStarted","Data":"210044f302351abf82ce0a468589be57cad7fae09852645efa8b9449b0a08b71"} Apr 24 16:39:45.756585 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.756555 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" event={"ID":"1e8f7734-976a-4f89-b273-9e519d952582","Type":"ContainerStarted","Data":"8e1aef706b74bcd41bb68af01aa6cd91c8a13034c3899de583f07985da0304a4"} Apr 24 16:39:45.757709 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.757691 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kdlh8" event={"ID":"801dc599-67b9-400f-a14c-835900dba24e","Type":"ContainerStarted","Data":"f938d6a12fe2f4fa033140020f0c1f7bc834485f40dd9c4ad7da810990af0cd6"} Apr 24 16:39:45.758985 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.758965 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/konnectivity-agent-b565t" event={"ID":"d7c17281-bc8e-4ae5-bf1c-eaf465abf88b","Type":"ContainerStarted","Data":"9cc28bb8fc15674434fe6c7cdcc57cab702f1cd2db870c625ba86cdd8b288733"} Apr 24 16:39:45.774634 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.774599 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7grsg" podStartSLOduration=4.506955457 podStartE2EDuration="21.774588315s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="2026-04-24 16:39:27.265765488 +0000 UTC m=+3.198834666" lastFinishedPulling="2026-04-24 16:39:44.533398348 +0000 UTC m=+20.466467524" observedRunningTime="2026-04-24 16:39:45.774492331 +0000 UTC m=+21.707561532" watchObservedRunningTime="2026-04-24 16:39:45.774588315 +0000 UTC m=+21.707657513" Apr 24 16:39:45.774930 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.774908 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-227.ec2.internal" podStartSLOduration=19.774903327 podStartE2EDuration="19.774903327s" podCreationTimestamp="2026-04-24 16:39:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:29.708523088 +0000 UTC m=+5.641592287" watchObservedRunningTime="2026-04-24 16:39:45.774903327 +0000 UTC m=+21.707972526" Apr 24 16:39:45.854622 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.854534 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kdlh8" podStartSLOduration=9.008633644 podStartE2EDuration="21.854519949s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="2026-04-24 16:39:27.268013896 +0000 UTC m=+3.201083092" lastFinishedPulling="2026-04-24 16:39:40.113900204 +0000 UTC m=+16.046969397" observedRunningTime="2026-04-24 16:39:45.854247495 +0000 UTC 
m=+21.787316710" watchObservedRunningTime="2026-04-24 16:39:45.854519949 +0000 UTC m=+21.787589147" Apr 24 16:39:45.908376 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.908319 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-b565t" podStartSLOduration=4.639830495 podStartE2EDuration="21.908302006s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="2026-04-24 16:39:27.264952728 +0000 UTC m=+3.198021905" lastFinishedPulling="2026-04-24 16:39:44.533424224 +0000 UTC m=+20.466493416" observedRunningTime="2026-04-24 16:39:45.908198777 +0000 UTC m=+21.841267987" watchObservedRunningTime="2026-04-24 16:39:45.908302006 +0000 UTC m=+21.841371203" Apr 24 16:39:45.908582 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.908561 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ng4dd" podStartSLOduration=4.59614701 podStartE2EDuration="21.908556993s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="2026-04-24 16:39:27.264569934 +0000 UTC m=+3.197639116" lastFinishedPulling="2026-04-24 16:39:44.576979911 +0000 UTC m=+20.510049099" observedRunningTime="2026-04-24 16:39:45.889302514 +0000 UTC m=+21.822371718" watchObservedRunningTime="2026-04-24 16:39:45.908556993 +0000 UTC m=+21.841626191" Apr 24 16:39:45.926076 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:45.926031 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tc7zr" podStartSLOduration=4.654316835 podStartE2EDuration="21.926022303s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="2026-04-24 16:39:27.261948458 +0000 UTC m=+3.195017638" lastFinishedPulling="2026-04-24 16:39:44.533653928 +0000 UTC m=+20.466723106" observedRunningTime="2026-04-24 16:39:45.925874032 +0000 UTC m=+21.858943230" watchObservedRunningTime="2026-04-24 16:39:45.926022303 +0000 UTC 
m=+21.859091502" Apr 24 16:39:46.597902 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:46.596866 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T16:39:45.713678225Z","UUID":"5258b7a9-2ece-4f96-8e74-d66c673b54bc","Handler":null,"Name":"","Endpoint":""} Apr 24 16:39:46.598775 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:46.598713 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 16:39:46.598905 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:46.598785 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 16:39:46.615627 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:46.615607 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:46.615782 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:46.615756 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hknkz" podUID="28a21434-57fa-420d-a7e4-3b011eb1dc0e" Apr 24 16:39:46.763761 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:46.763075 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k" event={"ID":"78ae7e56-20b9-4ea1-a705-310d2202a743","Type":"ContainerStarted","Data":"2c858591e4cdd0eaaf64f049b316bab20ab28bc9369e8167b23984076e4b6931"} Apr 24 16:39:46.764646 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:46.764593 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zdpfk" event={"ID":"95e0f061-d036-4e7f-9fd6-e08dcb11ea48","Type":"ContainerStarted","Data":"aec9c897c3bd206feacda5f1f90acf7fa357adf114a9e19d4e327e9f49ba2f8e"} Apr 24 16:39:46.784656 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:46.784571 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4vz8k" podStartSLOduration=3.522900588 podStartE2EDuration="22.78455815s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="2026-04-24 16:39:27.255793419 +0000 UTC m=+3.188862602" lastFinishedPulling="2026-04-24 16:39:46.517450968 +0000 UTC m=+22.450520164" observedRunningTime="2026-04-24 16:39:46.784519188 +0000 UTC m=+22.717588387" watchObservedRunningTime="2026-04-24 16:39:46.78455815 +0000 UTC m=+22.717627349" Apr 24 16:39:46.804572 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:46.804527 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zdpfk" podStartSLOduration=5.533785773 podStartE2EDuration="22.804512217s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="2026-04-24 16:39:27.262633186 +0000 UTC m=+3.195702367" lastFinishedPulling="2026-04-24 16:39:44.533359631 +0000 UTC m=+20.466428811" observedRunningTime="2026-04-24 
16:39:46.803960393 +0000 UTC m=+22.737029591" watchObservedRunningTime="2026-04-24 16:39:46.804512217 +0000 UTC m=+22.737581415" Apr 24 16:39:47.476901 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:47.476813 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-b565t" Apr 24 16:39:47.477550 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:47.477527 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-b565t" Apr 24 16:39:47.615303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:47.615276 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:47.615509 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:47.615410 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3" Apr 24 16:39:47.769982 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:47.769941 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" event={"ID":"1e8f7734-976a-4f89-b273-9e519d952582","Type":"ContainerStarted","Data":"43c6c4ff4098270a053ba1579ddb66bbe9cbd04502f3e6d2973dc8e9aae80c57"} Apr 24 16:39:47.770613 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:47.770301 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-b565t" Apr 24 16:39:47.770748 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:47.770707 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-b565t" Apr 24 16:39:48.616721 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:48.616053 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:48.616721 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:48.616345 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hknkz" podUID="28a21434-57fa-420d-a7e4-3b011eb1dc0e" Apr 24 16:39:49.615071 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:49.614976 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:49.615552 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:49.615112 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3" Apr 24 16:39:50.615259 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:50.615016 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:50.615644 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:50.615389 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hknkz" podUID="28a21434-57fa-420d-a7e4-3b011eb1dc0e" Apr 24 16:39:50.779469 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:50.779437 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" event={"ID":"1e8f7734-976a-4f89-b273-9e519d952582","Type":"ContainerStarted","Data":"7af3ec96fc0c6f0ce7f37f71cc64f8f7850765ae486bc543b078e5e8471e59f0"} Apr 24 16:39:50.779718 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:50.779697 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:50.781149 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:50.781116 2579 generic.go:358] "Generic (PLEG): container finished" podID="54f6ae1b-e6ac-48cd-92cc-1c9ae843b609" containerID="999d3dff842bb1fe1e1c4943e6aea095c6603b29219000dc6eb50288f932f918" exitCode=0 Apr 24 16:39:50.781250 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:50.781158 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92bp8" event={"ID":"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609","Type":"ContainerDied","Data":"999d3dff842bb1fe1e1c4943e6aea095c6603b29219000dc6eb50288f932f918"} Apr 24 16:39:50.795945 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:50.795915 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:50.809993 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:50.809947 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" podStartSLOduration=9.413817284 podStartE2EDuration="26.809928642s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="2026-04-24 16:39:27.259443936 +0000 UTC m=+3.192513131" lastFinishedPulling="2026-04-24 16:39:44.655555295 +0000 UTC m=+20.588624489" observedRunningTime="2026-04-24 
16:39:50.809546829 +0000 UTC m=+26.742616031" watchObservedRunningTime="2026-04-24 16:39:50.809928642 +0000 UTC m=+26.742997845" Apr 24 16:39:51.615909 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:51.615867 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:51.616333 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:51.616024 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3" Apr 24 16:39:51.783776 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:51.783725 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:51.783776 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:51.783784 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:51.798878 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:51.798847 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht" Apr 24 16:39:51.977576 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:51.977549 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kmh29"] Apr 24 16:39:51.977703 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:51.977646 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:51.977794 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:51.977774 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3" Apr 24 16:39:51.989883 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:51.989852 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hknkz"] Apr 24 16:39:51.989997 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:51.989983 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:51.990103 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:51.990085 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hknkz" podUID="28a21434-57fa-420d-a7e4-3b011eb1dc0e" Apr 24 16:39:52.786712 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:52.786678 2579 generic.go:358] "Generic (PLEG): container finished" podID="54f6ae1b-e6ac-48cd-92cc-1c9ae843b609" containerID="934ac72eae1a14a772876580f5afc06bdb9c28638523af578eb621cef8bf63bb" exitCode=0 Apr 24 16:39:52.787423 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:52.786769 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92bp8" event={"ID":"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609","Type":"ContainerDied","Data":"934ac72eae1a14a772876580f5afc06bdb9c28638523af578eb621cef8bf63bb"} Apr 24 16:39:53.615396 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:53.615328 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:53.615518 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:53.615330 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:53.615518 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:53.615425 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hknkz" podUID="28a21434-57fa-420d-a7e4-3b011eb1dc0e" Apr 24 16:39:53.615518 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:53.615502 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3" Apr 24 16:39:54.793120 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:54.792911 2579 generic.go:358] "Generic (PLEG): container finished" podID="54f6ae1b-e6ac-48cd-92cc-1c9ae843b609" containerID="9d6dc241a12f91f66c44e475ce44a4e3c05c367126d9df844f11e40203897eb1" exitCode=0 Apr 24 16:39:54.793548 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:54.792995 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92bp8" event={"ID":"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609","Type":"ContainerDied","Data":"9d6dc241a12f91f66c44e475ce44a4e3c05c367126d9df844f11e40203897eb1"} Apr 24 16:39:55.615927 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:55.615892 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:55.616115 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:55.615892 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:55.616115 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:55.616063 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3" Apr 24 16:39:55.616115 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:55.616091 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hknkz" podUID="28a21434-57fa-420d-a7e4-3b011eb1dc0e" Apr 24 16:39:57.429298 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.429253 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-227.ec2.internal" event="NodeReady" Apr 24 16:39:57.429796 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.429418 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 16:39:57.480377 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.480286 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pkmpl"] Apr 24 16:39:57.503633 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.502916 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rgpmf"] Apr 24 16:39:57.522855 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.522818 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pkmpl"] Apr 24 16:39:57.522855 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.522855 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rgpmf"] Apr 24 16:39:57.523091 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.522965 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rgpmf" Apr 24 16:39:57.523091 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.522970 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pkmpl" Apr 24 16:39:57.525340 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.525310 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 16:39:57.525492 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.525368 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 16:39:57.525492 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.525442 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 16:39:57.525492 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.525381 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 16:39:57.525724 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.525702 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ljxll\"" Apr 24 16:39:57.525860 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.525775 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2lclw\"" Apr 24 16:39:57.525860 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.525835 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 16:39:57.571658 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.571623 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-config-volume\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl" Apr 24 16:39:57.571845 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.571671 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2brz6\" (UniqueName: \"kubernetes.io/projected/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-kube-api-access-2brz6\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl" Apr 24 16:39:57.571845 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.571705 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-tmp-dir\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl" Apr 24 16:39:57.571845 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.571792 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl" Apr 24 16:39:57.615242 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.615203 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz" Apr 24 16:39:57.615432 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.615204 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:39:57.618569 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.618539 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 16:39:57.618691 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.618585 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7ckwm\"" Apr 24 16:39:57.618691 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.618633 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 16:39:57.618806 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.618716 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 16:39:57.618847 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.618803 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-czswq\"" Apr 24 16:39:57.672827 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.672792 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl" Apr 24 16:39:57.673038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.672842 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svt6r\" (UniqueName: \"kubernetes.io/projected/4b53d555-6ece-4be6-a70d-30c64956654b-kube-api-access-svt6r\") pod \"ingress-canary-rgpmf\" (UID: \"4b53d555-6ece-4be6-a70d-30c64956654b\") " pod="openshift-ingress-canary/ingress-canary-rgpmf" Apr 
24 16:39:57.673038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.672889 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-config-volume\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl" Apr 24 16:39:57.673038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.672929 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2brz6\" (UniqueName: \"kubernetes.io/projected/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-kube-api-access-2brz6\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl" Apr 24 16:39:57.673038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.672952 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-tmp-dir\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl" Apr 24 16:39:57.673038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.672974 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert\") pod \"ingress-canary-rgpmf\" (UID: \"4b53d555-6ece-4be6-a70d-30c64956654b\") " pod="openshift-ingress-canary/ingress-canary-rgpmf" Apr 24 16:39:57.673038 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:57.672975 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:57.673234 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:57.673048 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls 
podName:fc3c012b-d60b-47fa-aa82-a2d3bb5649b3 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:58.173026512 +0000 UTC m=+34.106095729 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls") pod "dns-default-pkmpl" (UID: "fc3c012b-d60b-47fa-aa82-a2d3bb5649b3") : secret "dns-default-metrics-tls" not found Apr 24 16:39:57.673308 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.673292 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-tmp-dir\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl" Apr 24 16:39:57.673474 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.673455 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-config-volume\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl" Apr 24 16:39:57.688318 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.688286 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2brz6\" (UniqueName: \"kubernetes.io/projected/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-kube-api-access-2brz6\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl" Apr 24 16:39:57.774259 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.774219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert\") pod \"ingress-canary-rgpmf\" (UID: \"4b53d555-6ece-4be6-a70d-30c64956654b\") " pod="openshift-ingress-canary/ingress-canary-rgpmf" Apr 24 16:39:57.774465 
ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.774316 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svt6r\" (UniqueName: \"kubernetes.io/projected/4b53d555-6ece-4be6-a70d-30c64956654b-kube-api-access-svt6r\") pod \"ingress-canary-rgpmf\" (UID: \"4b53d555-6ece-4be6-a70d-30c64956654b\") " pod="openshift-ingress-canary/ingress-canary-rgpmf"
Apr 24 16:39:57.774465 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:57.774383 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:57.774465 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:57.774466 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert podName:4b53d555-6ece-4be6-a70d-30c64956654b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:58.274444074 +0000 UTC m=+34.207513264 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert") pod "ingress-canary-rgpmf" (UID: "4b53d555-6ece-4be6-a70d-30c64956654b") : secret "canary-serving-cert" not found
Apr 24 16:39:57.784315 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:57.784283 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svt6r\" (UniqueName: \"kubernetes.io/projected/4b53d555-6ece-4be6-a70d-30c64956654b-kube-api-access-svt6r\") pod \"ingress-canary-rgpmf\" (UID: \"4b53d555-6ece-4be6-a70d-30c64956654b\") " pod="openshift-ingress-canary/ingress-canary-rgpmf"
Apr 24 16:39:58.177343 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:58.177260 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl"
Apr 24 16:39:58.177535 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:58.177378 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:58.177535 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:58.177436 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls podName:fc3c012b-d60b-47fa-aa82-a2d3bb5649b3 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:59.177420497 +0000 UTC m=+35.110489678 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls") pod "dns-default-pkmpl" (UID: "fc3c012b-d60b-47fa-aa82-a2d3bb5649b3") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:58.277878 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:58.277844 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert\") pod \"ingress-canary-rgpmf\" (UID: \"4b53d555-6ece-4be6-a70d-30c64956654b\") " pod="openshift-ingress-canary/ingress-canary-rgpmf"
Apr 24 16:39:58.278050 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:58.277903 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tfc7\" (UniqueName: \"kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7\") pod \"network-check-target-hknkz\" (UID: \"28a21434-57fa-420d-a7e4-3b011eb1dc0e\") " pod="openshift-network-diagnostics/network-check-target-hknkz"
Apr 24 16:39:58.278050 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:58.277939 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:58.278050 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:58.278003 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert podName:4b53d555-6ece-4be6-a70d-30c64956654b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:59.277988717 +0000 UTC m=+35.211057894 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert") pod "ingress-canary-rgpmf" (UID: "4b53d555-6ece-4be6-a70d-30c64956654b") : secret "canary-serving-cert" not found
Apr 24 16:39:58.281054 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:58.281017 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tfc7\" (UniqueName: \"kubernetes.io/projected/28a21434-57fa-420d-a7e4-3b011eb1dc0e-kube-api-access-7tfc7\") pod \"network-check-target-hknkz\" (UID: \"28a21434-57fa-420d-a7e4-3b011eb1dc0e\") " pod="openshift-network-diagnostics/network-check-target-hknkz"
Apr 24 16:39:58.378745 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:58.378700 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs\") pod \"network-metrics-daemon-kmh29\" (UID: \"4af5b09f-ceea-413a-bec5-40a2b59c7ea3\") " pod="openshift-multus/network-metrics-daemon-kmh29"
Apr 24 16:39:58.378929 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:58.378878 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 16:39:58.378984 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:58.378959 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs podName:4af5b09f-ceea-413a-bec5-40a2b59c7ea3 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:30.378936453 +0000 UTC m=+66.312005630 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs") pod "network-metrics-daemon-kmh29" (UID: "4af5b09f-ceea-413a-bec5-40a2b59c7ea3") : secret "metrics-daemon-secret" not found
Apr 24 16:39:58.526606 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:58.526557 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hknkz"
Apr 24 16:39:59.184354 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:59.184319 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl"
Apr 24 16:39:59.184541 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:59.184479 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:59.184595 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:59.184543 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls podName:fc3c012b-d60b-47fa-aa82-a2d3bb5649b3 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:01.184525466 +0000 UTC m=+37.117594644 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls") pod "dns-default-pkmpl" (UID: "fc3c012b-d60b-47fa-aa82-a2d3bb5649b3") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:59.285227 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:39:59.285186 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert\") pod \"ingress-canary-rgpmf\" (UID: \"4b53d555-6ece-4be6-a70d-30c64956654b\") " pod="openshift-ingress-canary/ingress-canary-rgpmf"
Apr 24 16:39:59.285399 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:59.285346 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:59.285452 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:39:59.285415 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert podName:4b53d555-6ece-4be6-a70d-30c64956654b nodeName:}" failed. No retries permitted until 2026-04-24 16:40:01.285396939 +0000 UTC m=+37.218466132 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert") pod "ingress-canary-rgpmf" (UID: "4b53d555-6ece-4be6-a70d-30c64956654b") : secret "canary-serving-cert" not found
Apr 24 16:40:00.508559 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:00.508393 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hknkz"]
Apr 24 16:40:00.512040 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:40:00.512009 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28a21434_57fa_420d_a7e4_3b011eb1dc0e.slice/crio-18b0e075887ef4fc571945f09caabba552b8975a15f336d1cddf306e417c12b9 WatchSource:0}: Error finding container 18b0e075887ef4fc571945f09caabba552b8975a15f336d1cddf306e417c12b9: Status 404 returned error can't find the container with id 18b0e075887ef4fc571945f09caabba552b8975a15f336d1cddf306e417c12b9
Apr 24 16:40:00.812050 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:00.812009 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92bp8" event={"ID":"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609","Type":"ContainerStarted","Data":"36ce44204f75884ec212808cb8534e1e1688afac1bfd2c80d32826ac40e91f56"}
Apr 24 16:40:00.813099 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:00.813065 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hknkz" event={"ID":"28a21434-57fa-420d-a7e4-3b011eb1dc0e","Type":"ContainerStarted","Data":"18b0e075887ef4fc571945f09caabba552b8975a15f336d1cddf306e417c12b9"}
Apr 24 16:40:01.201615 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:01.201536 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl"
Apr 24 16:40:01.201778 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:01.201698 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:40:01.201827 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:01.201792 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls podName:fc3c012b-d60b-47fa-aa82-a2d3bb5649b3 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:05.2017741 +0000 UTC m=+41.134843277 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls") pod "dns-default-pkmpl" (UID: "fc3c012b-d60b-47fa-aa82-a2d3bb5649b3") : secret "dns-default-metrics-tls" not found
Apr 24 16:40:01.302884 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:01.302843 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert\") pod \"ingress-canary-rgpmf\" (UID: \"4b53d555-6ece-4be6-a70d-30c64956654b\") " pod="openshift-ingress-canary/ingress-canary-rgpmf"
Apr 24 16:40:01.303069 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:01.302955 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:40:01.303069 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:01.303013 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert podName:4b53d555-6ece-4be6-a70d-30c64956654b nodeName:}" failed. No retries permitted until 2026-04-24 16:40:05.302998711 +0000 UTC m=+41.236067893 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert") pod "ingress-canary-rgpmf" (UID: "4b53d555-6ece-4be6-a70d-30c64956654b") : secret "canary-serving-cert" not found
Apr 24 16:40:01.818206 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:01.818168 2579 generic.go:358] "Generic (PLEG): container finished" podID="54f6ae1b-e6ac-48cd-92cc-1c9ae843b609" containerID="36ce44204f75884ec212808cb8534e1e1688afac1bfd2c80d32826ac40e91f56" exitCode=0
Apr 24 16:40:01.818889 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:01.818236 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92bp8" event={"ID":"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609","Type":"ContainerDied","Data":"36ce44204f75884ec212808cb8534e1e1688afac1bfd2c80d32826ac40e91f56"}
Apr 24 16:40:02.823746 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:02.823691 2579 generic.go:358] "Generic (PLEG): container finished" podID="54f6ae1b-e6ac-48cd-92cc-1c9ae843b609" containerID="6809eafa6891d15c0f74ec90322992d8c313145ff81282bba9cec581efce67ee" exitCode=0
Apr 24 16:40:02.824595 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:02.823766 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92bp8" event={"ID":"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609","Type":"ContainerDied","Data":"6809eafa6891d15c0f74ec90322992d8c313145ff81282bba9cec581efce67ee"}
Apr 24 16:40:03.828246 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:03.828210 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-92bp8" event={"ID":"54f6ae1b-e6ac-48cd-92cc-1c9ae843b609","Type":"ContainerStarted","Data":"6e6c75af7c04e6a300890acb223ea2666582dfbe647932e4f5e01f4e0b2b27c7"}
Apr 24 16:40:03.829486 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:03.829458 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hknkz" event={"ID":"28a21434-57fa-420d-a7e4-3b011eb1dc0e","Type":"ContainerStarted","Data":"183f55330c02b1de00893b677b191ca3d1ebe17d5843102f9866de34c5ef6c57"}
Apr 24 16:40:03.829612 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:03.829579 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hknkz"
Apr 24 16:40:03.850006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:03.849918 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-92bp8" podStartSLOduration=6.437597943 podStartE2EDuration="39.84990375s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="2026-04-24 16:39:27.257134046 +0000 UTC m=+3.190203238" lastFinishedPulling="2026-04-24 16:40:00.669439868 +0000 UTC m=+36.602509045" observedRunningTime="2026-04-24 16:40:03.849004038 +0000 UTC m=+39.782073238" watchObservedRunningTime="2026-04-24 16:40:03.84990375 +0000 UTC m=+39.782972927"
Apr 24 16:40:03.863487 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:03.863440 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hknkz" podStartSLOduration=36.794703015 podStartE2EDuration="39.863427585s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="2026-04-24 16:40:00.514069005 +0000 UTC m=+36.447138182" lastFinishedPulling="2026-04-24 16:40:03.582793565 +0000 UTC m=+39.515862752" observedRunningTime="2026-04-24 16:40:03.863232123 +0000 UTC m=+39.796301323" watchObservedRunningTime="2026-04-24 16:40:03.863427585 +0000 UTC m=+39.796496784"
Apr 24 16:40:05.232323 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:05.232269 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl"
Apr 24 16:40:05.232725 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:05.232418 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:40:05.232725 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:05.232489 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls podName:fc3c012b-d60b-47fa-aa82-a2d3bb5649b3 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:13.232472535 +0000 UTC m=+49.165541712 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls") pod "dns-default-pkmpl" (UID: "fc3c012b-d60b-47fa-aa82-a2d3bb5649b3") : secret "dns-default-metrics-tls" not found
Apr 24 16:40:05.333579 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:05.333544 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert\") pod \"ingress-canary-rgpmf\" (UID: \"4b53d555-6ece-4be6-a70d-30c64956654b\") " pod="openshift-ingress-canary/ingress-canary-rgpmf"
Apr 24 16:40:05.333712 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:05.333690 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:40:05.333779 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:05.333772 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert podName:4b53d555-6ece-4be6-a70d-30c64956654b nodeName:}" failed. No retries permitted until 2026-04-24 16:40:13.333757357 +0000 UTC m=+49.266826534 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert") pod "ingress-canary-rgpmf" (UID: "4b53d555-6ece-4be6-a70d-30c64956654b") : secret "canary-serving-cert" not found
Apr 24 16:40:13.289782 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:13.289706 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl"
Apr 24 16:40:13.290187 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:13.289867 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:40:13.290187 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:13.289933 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls podName:fc3c012b-d60b-47fa-aa82-a2d3bb5649b3 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:29.289917961 +0000 UTC m=+65.222987138 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls") pod "dns-default-pkmpl" (UID: "fc3c012b-d60b-47fa-aa82-a2d3bb5649b3") : secret "dns-default-metrics-tls" not found
Apr 24 16:40:13.390817 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:13.390758 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert\") pod \"ingress-canary-rgpmf\" (UID: \"4b53d555-6ece-4be6-a70d-30c64956654b\") " pod="openshift-ingress-canary/ingress-canary-rgpmf"
Apr 24 16:40:13.391002 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:13.390930 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:40:13.391039 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:13.391008 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert podName:4b53d555-6ece-4be6-a70d-30c64956654b nodeName:}" failed. No retries permitted until 2026-04-24 16:40:29.390992018 +0000 UTC m=+65.324061195 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert") pod "ingress-canary-rgpmf" (UID: "4b53d555-6ece-4be6-a70d-30c64956654b") : secret "canary-serving-cert" not found
Apr 24 16:40:23.807947 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:23.807917 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7fpht"
Apr 24 16:40:29.304891 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:29.304849 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl"
Apr 24 16:40:29.305276 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:29.304965 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:40:29.305276 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:29.305019 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls podName:fc3c012b-d60b-47fa-aa82-a2d3bb5649b3 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:01.305004968 +0000 UTC m=+97.238074144 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls") pod "dns-default-pkmpl" (UID: "fc3c012b-d60b-47fa-aa82-a2d3bb5649b3") : secret "dns-default-metrics-tls" not found
Apr 24 16:40:29.406072 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:29.406038 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert\") pod \"ingress-canary-rgpmf\" (UID: \"4b53d555-6ece-4be6-a70d-30c64956654b\") " pod="openshift-ingress-canary/ingress-canary-rgpmf"
Apr 24 16:40:29.406210 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:29.406184 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:40:29.406267 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:29.406255 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert podName:4b53d555-6ece-4be6-a70d-30c64956654b nodeName:}" failed. No retries permitted until 2026-04-24 16:41:01.406239496 +0000 UTC m=+97.339308673 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert") pod "ingress-canary-rgpmf" (UID: "4b53d555-6ece-4be6-a70d-30c64956654b") : secret "canary-serving-cert" not found
Apr 24 16:40:30.412988 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:30.412927 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs\") pod \"network-metrics-daemon-kmh29\" (UID: \"4af5b09f-ceea-413a-bec5-40a2b59c7ea3\") " pod="openshift-multus/network-metrics-daemon-kmh29"
Apr 24 16:40:30.413380 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:30.413092 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 16:40:30.413380 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:40:30.413180 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs podName:4af5b09f-ceea-413a-bec5-40a2b59c7ea3 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:34.413160679 +0000 UTC m=+130.346229876 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs") pod "network-metrics-daemon-kmh29" (UID: "4af5b09f-ceea-413a-bec5-40a2b59c7ea3") : secret "metrics-daemon-secret" not found
Apr 24 16:40:34.834093 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:40:34.833989 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hknkz"
Apr 24 16:41:01.323566 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:01.323525 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl"
Apr 24 16:41:01.323986 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:01.323644 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:41:01.323986 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:01.323708 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls podName:fc3c012b-d60b-47fa-aa82-a2d3bb5649b3 nodeName:}" failed. No retries permitted until 2026-04-24 16:42:05.323694144 +0000 UTC m=+161.256763321 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls") pod "dns-default-pkmpl" (UID: "fc3c012b-d60b-47fa-aa82-a2d3bb5649b3") : secret "dns-default-metrics-tls" not found
Apr 24 16:41:01.423961 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:01.423921 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert\") pod \"ingress-canary-rgpmf\" (UID: \"4b53d555-6ece-4be6-a70d-30c64956654b\") " pod="openshift-ingress-canary/ingress-canary-rgpmf"
Apr 24 16:41:01.424119 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:01.424084 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:41:01.424169 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:01.424158 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert podName:4b53d555-6ece-4be6-a70d-30c64956654b nodeName:}" failed. No retries permitted until 2026-04-24 16:42:05.424141471 +0000 UTC m=+161.357210649 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert") pod "ingress-canary-rgpmf" (UID: "4b53d555-6ece-4be6-a70d-30c64956654b") : secret "canary-serving-cert" not found
Apr 24 16:41:30.451701 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.451668 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-84cbm"]
Apr 24 16:41:30.454408 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.454390 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-84cbm"
Apr 24 16:41:30.456171 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.456147 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-52mlp"]
Apr 24 16:41:30.456587 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.456565 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-5jn2g\""
Apr 24 16:41:30.457169 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.457146 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 24 16:41:30.457255 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.457173 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 24 16:41:30.459157 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.459138 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx"]
Apr 24 16:41:30.459275 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.459260 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp"
Apr 24 16:41:30.461307 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.461289 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 16:41:30.461649 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.461628 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 24 16:41:30.462106 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.461969 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx"
Apr 24 16:41:30.462702 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.462675 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 16:41:30.462834 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.462769 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 16:41:30.462988 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.462976 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-wv7qs\""
Apr 24 16:41:30.464888 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.464870 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-nkkqc\""
Apr 24 16:41:30.465384 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.465369 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 16:41:30.465482 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.465397 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 24 16:41:30.465482 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.465375 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 16:41:30.465604 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.465549 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 24 16:41:30.469494 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.469264 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-84cbm"]
Apr 24 16:41:30.474150 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.473163 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-52mlp"]
Apr 24 16:41:30.474593 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.474560 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 24 16:41:30.484545 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.484523 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx"]
Apr 24 16:41:30.518053 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.518022 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80-trusted-ca\") pod \"console-operator-9d4b6777b-52mlp\" (UID: \"9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80\") " pod="openshift-console-operator/console-operator-9d4b6777b-52mlp"
Apr 24 16:41:30.518237 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.518075 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a32d7d16-b384-47fc-a565-2b51f6f8c945-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5nnkx\" (UID: \"a32d7d16-b384-47fc-a565-2b51f6f8c945\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx"
Apr 24 16:41:30.518237 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.518158 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hjsx\" (UniqueName: \"kubernetes.io/projected/a32d7d16-b384-47fc-a565-2b51f6f8c945-kube-api-access-7hjsx\") pod \"cluster-monitoring-operator-75587bd455-5nnkx\" (UID: \"a32d7d16-b384-47fc-a565-2b51f6f8c945\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx"
Apr 24 16:41:30.518237 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.518198 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80-config\") pod \"console-operator-9d4b6777b-52mlp\" (UID: \"9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80\") " pod="openshift-console-operator/console-operator-9d4b6777b-52mlp"
Apr 24 16:41:30.518237 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.518218 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80-serving-cert\") pod \"console-operator-9d4b6777b-52mlp\" (UID: \"9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80\") " pod="openshift-console-operator/console-operator-9d4b6777b-52mlp"
Apr 24 16:41:30.518237 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.518234 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s96pp\" (UniqueName: \"kubernetes.io/projected/9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80-kube-api-access-s96pp\") pod \"console-operator-9d4b6777b-52mlp\" (UID: \"9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80\") " pod="openshift-console-operator/console-operator-9d4b6777b-52mlp"
Apr 24 16:41:30.518435 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.518300 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5nnkx\" (UID: \"a32d7d16-b384-47fc-a565-2b51f6f8c945\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx"
Apr 24 16:41:30.518435 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.518350 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng5wp\" (UniqueName: \"kubernetes.io/projected/90bd04ca-0a65-47d9-a182-69a15ab185c9-kube-api-access-ng5wp\") pod \"volume-data-source-validator-7c6cbb6c87-84cbm\" (UID: \"90bd04ca-0a65-47d9-a182-69a15ab185c9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-84cbm"
Apr 24 16:41:30.549276 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.549241 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj"]
Apr 24 16:41:30.552305 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.552284 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj"
Apr 24 16:41:30.553427 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.553405 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-nfd4n"]
Apr 24 16:41:30.555984 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.555970 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm"]
Apr 24 16:41:30.556130 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.556114 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nfd4n"
Apr 24 16:41:30.556690 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.556667 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 24 16:41:30.556889 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.556864 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 24 16:41:30.556955 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.556942 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 24 16:41:30.557073 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.557054 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-mkjsb\""
Apr 24 16:41:30.557505 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.557487 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 24 16:41:30.558775 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.558757 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-4wsnt"]
Apr 24 16:41:30.558837 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.558800 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm"
Apr 24 16:41:30.561054 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.561023 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 24 16:41:30.561054 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.561040 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 24 16:41:30.561054 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.561040 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 24 16:41:30.561268 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.561092 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-d7pg8\""
Apr 24 16:41:30.561268 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.561118 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-j4rhw\""
Apr 24 16:41:30.561984 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.561967 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.562632 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.562617 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 24 16:41:30.563845 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.563829 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 16:41:30.564050 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.564036 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 24 16:41:30.564306 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.564290 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 24 16:41:30.564525 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.564510 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 16:41:30.565109 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.565092 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-989b8\"" Apr 24 16:41:30.570288 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.570266 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 16:41:30.576457 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.576419 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj"] Apr 24 16:41:30.577251 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.577229 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-nfd4n"] Apr 24 16:41:30.577966 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.577918 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm"] Apr 24 16:41:30.578946 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.578921 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-4wsnt"] Apr 24 16:41:30.618745 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.618701 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ed4479-aced-4d35-9249-229096300dc7-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dghtm\" (UID: \"d6ed4479-aced-4d35-9249-229096300dc7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm" Apr 24 16:41:30.618745 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.618747 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hjsx\" (UniqueName: \"kubernetes.io/projected/a32d7d16-b384-47fc-a565-2b51f6f8c945-kube-api-access-7hjsx\") pod \"cluster-monitoring-operator-75587bd455-5nnkx\" (UID: \"a32d7d16-b384-47fc-a565-2b51f6f8c945\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx" Apr 24 16:41:30.618941 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.618767 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07d4b91a-a227-4b22-8f87-40c4e9c8139c-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.618941 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:41:30.618782 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07d4b91a-a227-4b22-8f87-40c4e9c8139c-service-ca-bundle\") pod \"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.618941 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.618805 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5w8d\" (UniqueName: \"kubernetes.io/projected/0e87d719-99e3-4983-8268-513fafb03033-kube-api-access-t5w8d\") pod \"network-check-source-8894fc9bd-nfd4n\" (UID: \"0e87d719-99e3-4983-8268-513fafb03033\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nfd4n" Apr 24 16:41:30.618941 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.618834 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80-config\") pod \"console-operator-9d4b6777b-52mlp\" (UID: \"9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80\") " pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" Apr 24 16:41:30.618941 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.618851 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80-serving-cert\") pod \"console-operator-9d4b6777b-52mlp\" (UID: \"9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80\") " pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" Apr 24 16:41:30.618941 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.618890 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s96pp\" (UniqueName: 
\"kubernetes.io/projected/9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80-kube-api-access-s96pp\") pod \"console-operator-9d4b6777b-52mlp\" (UID: \"9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80\") " pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" Apr 24 16:41:30.618941 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.618933 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5nnkx\" (UID: \"a32d7d16-b384-47fc-a565-2b51f6f8c945\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx" Apr 24 16:41:30.619266 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.618970 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/07d4b91a-a227-4b22-8f87-40c4e9c8139c-snapshots\") pod \"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.619266 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.619004 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng5wp\" (UniqueName: \"kubernetes.io/projected/90bd04ca-0a65-47d9-a182-69a15ab185c9-kube-api-access-ng5wp\") pod \"volume-data-source-validator-7c6cbb6c87-84cbm\" (UID: \"90bd04ca-0a65-47d9-a182-69a15ab185c9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-84cbm" Apr 24 16:41:30.619266 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.619046 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6ed4479-aced-4d35-9249-229096300dc7-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dghtm\" (UID: 
\"d6ed4479-aced-4d35-9249-229096300dc7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm" Apr 24 16:41:30.619266 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.619073 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07d4b91a-a227-4b22-8f87-40c4e9c8139c-tmp\") pod \"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.619266 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:30.619095 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:30.619266 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.619098 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfq4j\" (UniqueName: \"kubernetes.io/projected/07d4b91a-a227-4b22-8f87-40c4e9c8139c-kube-api-access-qfq4j\") pod \"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.619266 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:30.619170 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls podName:a32d7d16-b384-47fc-a565-2b51f6f8c945 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:31.119154478 +0000 UTC m=+127.052223655 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5nnkx" (UID: "a32d7d16-b384-47fc-a565-2b51f6f8c945") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:30.619266 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.619223 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80-trusted-ca\") pod \"console-operator-9d4b6777b-52mlp\" (UID: \"9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80\") " pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" Apr 24 16:41:30.619266 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.619244 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxqnx\" (UniqueName: \"kubernetes.io/projected/2a504e1c-5915-4d94-9934-e45731af60c7-kube-api-access-hxqnx\") pod \"service-ca-operator-d6fc45fc5-84fzj\" (UID: \"2a504e1c-5915-4d94-9934-e45731af60c7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj" Apr 24 16:41:30.619266 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.619267 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jgwk\" (UniqueName: \"kubernetes.io/projected/d6ed4479-aced-4d35-9249-229096300dc7-kube-api-access-6jgwk\") pod \"kube-storage-version-migrator-operator-6769c5d45-dghtm\" (UID: \"d6ed4479-aced-4d35-9249-229096300dc7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm" Apr 24 16:41:30.619859 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.619349 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: 
\"kubernetes.io/configmap/a32d7d16-b384-47fc-a565-2b51f6f8c945-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5nnkx\" (UID: \"a32d7d16-b384-47fc-a565-2b51f6f8c945\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx" Apr 24 16:41:30.619859 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.619375 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a504e1c-5915-4d94-9934-e45731af60c7-serving-cert\") pod \"service-ca-operator-d6fc45fc5-84fzj\" (UID: \"2a504e1c-5915-4d94-9934-e45731af60c7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj" Apr 24 16:41:30.619859 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.619391 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a504e1c-5915-4d94-9934-e45731af60c7-config\") pod \"service-ca-operator-d6fc45fc5-84fzj\" (UID: \"2a504e1c-5915-4d94-9934-e45731af60c7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj" Apr 24 16:41:30.619859 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.619413 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d4b91a-a227-4b22-8f87-40c4e9c8139c-serving-cert\") pod \"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.619859 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.619557 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80-config\") pod \"console-operator-9d4b6777b-52mlp\" (UID: \"9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" Apr 24 16:41:30.620084 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.620026 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a32d7d16-b384-47fc-a565-2b51f6f8c945-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5nnkx\" (UID: \"a32d7d16-b384-47fc-a565-2b51f6f8c945\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx" Apr 24 16:41:30.620222 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.620202 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80-trusted-ca\") pod \"console-operator-9d4b6777b-52mlp\" (UID: \"9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80\") " pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" Apr 24 16:41:30.622239 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.622221 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80-serving-cert\") pod \"console-operator-9d4b6777b-52mlp\" (UID: \"9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80\") " pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" Apr 24 16:41:30.635089 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.635057 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng5wp\" (UniqueName: \"kubernetes.io/projected/90bd04ca-0a65-47d9-a182-69a15ab185c9-kube-api-access-ng5wp\") pod \"volume-data-source-validator-7c6cbb6c87-84cbm\" (UID: \"90bd04ca-0a65-47d9-a182-69a15ab185c9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-84cbm" Apr 24 16:41:30.635232 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.635140 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7hjsx\" (UniqueName: \"kubernetes.io/projected/a32d7d16-b384-47fc-a565-2b51f6f8c945-kube-api-access-7hjsx\") pod \"cluster-monitoring-operator-75587bd455-5nnkx\" (UID: \"a32d7d16-b384-47fc-a565-2b51f6f8c945\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx" Apr 24 16:41:30.635232 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.635183 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s96pp\" (UniqueName: \"kubernetes.io/projected/9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80-kube-api-access-s96pp\") pod \"console-operator-9d4b6777b-52mlp\" (UID: \"9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80\") " pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" Apr 24 16:41:30.720535 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.720451 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxqnx\" (UniqueName: \"kubernetes.io/projected/2a504e1c-5915-4d94-9934-e45731af60c7-kube-api-access-hxqnx\") pod \"service-ca-operator-d6fc45fc5-84fzj\" (UID: \"2a504e1c-5915-4d94-9934-e45731af60c7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj" Apr 24 16:41:30.720535 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.720488 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jgwk\" (UniqueName: \"kubernetes.io/projected/d6ed4479-aced-4d35-9249-229096300dc7-kube-api-access-6jgwk\") pod \"kube-storage-version-migrator-operator-6769c5d45-dghtm\" (UID: \"d6ed4479-aced-4d35-9249-229096300dc7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm" Apr 24 16:41:30.720535 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.720524 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a504e1c-5915-4d94-9934-e45731af60c7-serving-cert\") pod 
\"service-ca-operator-d6fc45fc5-84fzj\" (UID: \"2a504e1c-5915-4d94-9934-e45731af60c7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj" Apr 24 16:41:30.720843 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.720540 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a504e1c-5915-4d94-9934-e45731af60c7-config\") pod \"service-ca-operator-d6fc45fc5-84fzj\" (UID: \"2a504e1c-5915-4d94-9934-e45731af60c7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj" Apr 24 16:41:30.720843 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.720557 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d4b91a-a227-4b22-8f87-40c4e9c8139c-serving-cert\") pod \"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.720843 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.720604 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ed4479-aced-4d35-9249-229096300dc7-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dghtm\" (UID: \"d6ed4479-aced-4d35-9249-229096300dc7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm" Apr 24 16:41:30.720843 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.720623 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07d4b91a-a227-4b22-8f87-40c4e9c8139c-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.720843 ip-10-0-129-227 kubenswrapper[2579]: 
I0424 16:41:30.720638 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07d4b91a-a227-4b22-8f87-40c4e9c8139c-service-ca-bundle\") pod \"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.720843 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.720660 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5w8d\" (UniqueName: \"kubernetes.io/projected/0e87d719-99e3-4983-8268-513fafb03033-kube-api-access-t5w8d\") pod \"network-check-source-8894fc9bd-nfd4n\" (UID: \"0e87d719-99e3-4983-8268-513fafb03033\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nfd4n" Apr 24 16:41:30.720843 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.720715 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/07d4b91a-a227-4b22-8f87-40c4e9c8139c-snapshots\") pod \"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.720843 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.720776 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6ed4479-aced-4d35-9249-229096300dc7-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dghtm\" (UID: \"d6ed4479-aced-4d35-9249-229096300dc7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm" Apr 24 16:41:30.720843 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.720807 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07d4b91a-a227-4b22-8f87-40c4e9c8139c-tmp\") pod 
\"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.720843 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.720832 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfq4j\" (UniqueName: \"kubernetes.io/projected/07d4b91a-a227-4b22-8f87-40c4e9c8139c-kube-api-access-qfq4j\") pod \"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.721305 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.721285 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a504e1c-5915-4d94-9934-e45731af60c7-config\") pod \"service-ca-operator-d6fc45fc5-84fzj\" (UID: \"2a504e1c-5915-4d94-9934-e45731af60c7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj" Apr 24 16:41:30.721529 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.721501 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/07d4b91a-a227-4b22-8f87-40c4e9c8139c-snapshots\") pod \"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.721623 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.721608 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07d4b91a-a227-4b22-8f87-40c4e9c8139c-tmp\") pod \"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.721685 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.721664 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/07d4b91a-a227-4b22-8f87-40c4e9c8139c-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.721781 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.721751 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6ed4479-aced-4d35-9249-229096300dc7-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dghtm\" (UID: \"d6ed4479-aced-4d35-9249-229096300dc7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm" Apr 24 16:41:30.721962 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.721942 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07d4b91a-a227-4b22-8f87-40c4e9c8139c-service-ca-bundle\") pod \"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.723062 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.723043 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a504e1c-5915-4d94-9934-e45731af60c7-serving-cert\") pod \"service-ca-operator-d6fc45fc5-84fzj\" (UID: \"2a504e1c-5915-4d94-9934-e45731af60c7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj" Apr 24 16:41:30.723147 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.723133 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ed4479-aced-4d35-9249-229096300dc7-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dghtm\" (UID: \"d6ed4479-aced-4d35-9249-229096300dc7\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm" Apr 24 16:41:30.723539 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.723519 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d4b91a-a227-4b22-8f87-40c4e9c8139c-serving-cert\") pod \"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.728593 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.728570 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jgwk\" (UniqueName: \"kubernetes.io/projected/d6ed4479-aced-4d35-9249-229096300dc7-kube-api-access-6jgwk\") pod \"kube-storage-version-migrator-operator-6769c5d45-dghtm\" (UID: \"d6ed4479-aced-4d35-9249-229096300dc7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm" Apr 24 16:41:30.728593 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.728583 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfq4j\" (UniqueName: \"kubernetes.io/projected/07d4b91a-a227-4b22-8f87-40c4e9c8139c-kube-api-access-qfq4j\") pod \"insights-operator-585dfdc468-4wsnt\" (UID: \"07d4b91a-a227-4b22-8f87-40c4e9c8139c\") " pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.729019 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.728997 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxqnx\" (UniqueName: \"kubernetes.io/projected/2a504e1c-5915-4d94-9934-e45731af60c7-kube-api-access-hxqnx\") pod \"service-ca-operator-d6fc45fc5-84fzj\" (UID: \"2a504e1c-5915-4d94-9934-e45731af60c7\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj" Apr 24 16:41:30.729247 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.729233 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5w8d\" (UniqueName: \"kubernetes.io/projected/0e87d719-99e3-4983-8268-513fafb03033-kube-api-access-t5w8d\") pod \"network-check-source-8894fc9bd-nfd4n\" (UID: \"0e87d719-99e3-4983-8268-513fafb03033\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nfd4n" Apr 24 16:41:30.765836 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.765789 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-84cbm" Apr 24 16:41:30.779161 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.775927 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" Apr 24 16:41:30.863040 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.862755 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj" Apr 24 16:41:30.869761 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.869655 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nfd4n" Apr 24 16:41:30.876527 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.876499 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm" Apr 24 16:41:30.882053 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.881635 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-4wsnt" Apr 24 16:41:30.893488 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.893454 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-84cbm"] Apr 24 16:41:30.921582 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.921519 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-52mlp"] Apr 24 16:41:30.928675 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:41:30.928621 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a9d0510_e8e5_4e24_bd5a_c4e88d66aa80.slice/crio-0dce496a79cbecd4b60e5ca5a4a2859240799f9b54b27188d959b697bc28569b WatchSource:0}: Error finding container 0dce496a79cbecd4b60e5ca5a4a2859240799f9b54b27188d959b697bc28569b: Status 404 returned error can't find the container with id 0dce496a79cbecd4b60e5ca5a4a2859240799f9b54b27188d959b697bc28569b Apr 24 16:41:30.999517 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:30.999416 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" event={"ID":"9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80","Type":"ContainerStarted","Data":"0dce496a79cbecd4b60e5ca5a4a2859240799f9b54b27188d959b697bc28569b"} Apr 24 16:41:31.000638 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:31.000569 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-84cbm" event={"ID":"90bd04ca-0a65-47d9-a182-69a15ab185c9","Type":"ContainerStarted","Data":"28bc525045c6477414815e6a9aca4bd5267dabb0f0b9c111472e0284b8c37fa0"} Apr 24 16:41:31.024092 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:31.024039 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj"] Apr 24 16:41:31.028807 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:41:31.028776 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a504e1c_5915_4d94_9934_e45731af60c7.slice/crio-54fb2cec46994254c96c8837de1922f346bca01864f185307c5e633916f11dd4 WatchSource:0}: Error finding container 54fb2cec46994254c96c8837de1922f346bca01864f185307c5e633916f11dd4: Status 404 returned error can't find the container with id 54fb2cec46994254c96c8837de1922f346bca01864f185307c5e633916f11dd4 Apr 24 16:41:31.125021 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:31.124983 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5nnkx\" (UID: \"a32d7d16-b384-47fc-a565-2b51f6f8c945\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx" Apr 24 16:41:31.125203 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:31.125126 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:31.125203 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:31.125195 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls podName:a32d7d16-b384-47fc-a565-2b51f6f8c945 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:32.125178078 +0000 UTC m=+128.058247256 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5nnkx" (UID: "a32d7d16-b384-47fc-a565-2b51f6f8c945") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:31.247286 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:31.247261 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-nfd4n"] Apr 24 16:41:31.247944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:31.247916 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm"] Apr 24 16:41:31.248332 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:41:31.248309 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6ed4479_aced_4d35_9249_229096300dc7.slice/crio-37a467a5776d01a274ff3d2020fdd8038e8b1969274a1677557df8bb2e6fefca WatchSource:0}: Error finding container 37a467a5776d01a274ff3d2020fdd8038e8b1969274a1677557df8bb2e6fefca: Status 404 returned error can't find the container with id 37a467a5776d01a274ff3d2020fdd8038e8b1969274a1677557df8bb2e6fefca Apr 24 16:41:31.249005 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:41:31.248931 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e87d719_99e3_4983_8268_513fafb03033.slice/crio-580354da45878eb9d7d19845e6e6a59a5364c6a75a9b31a22a038714d34e5c30 WatchSource:0}: Error finding container 580354da45878eb9d7d19845e6e6a59a5364c6a75a9b31a22a038714d34e5c30: Status 404 returned error can't find the container with id 580354da45878eb9d7d19845e6e6a59a5364c6a75a9b31a22a038714d34e5c30 Apr 24 16:41:31.251910 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:31.251887 2579 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-4wsnt"] Apr 24 16:41:31.254993 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:41:31.254967 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07d4b91a_a227_4b22_8f87_40c4e9c8139c.slice/crio-934d50a7b07160c1a2dc20a27d0589f97e75988dcb5d034e1c441f7f6499a5fa WatchSource:0}: Error finding container 934d50a7b07160c1a2dc20a27d0589f97e75988dcb5d034e1c441f7f6499a5fa: Status 404 returned error can't find the container with id 934d50a7b07160c1a2dc20a27d0589f97e75988dcb5d034e1c441f7f6499a5fa Apr 24 16:41:32.008605 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:32.008540 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-4wsnt" event={"ID":"07d4b91a-a227-4b22-8f87-40c4e9c8139c","Type":"ContainerStarted","Data":"934d50a7b07160c1a2dc20a27d0589f97e75988dcb5d034e1c441f7f6499a5fa"} Apr 24 16:41:32.011680 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:32.011645 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nfd4n" event={"ID":"0e87d719-99e3-4983-8268-513fafb03033","Type":"ContainerStarted","Data":"0be85ef6021ac1e6c8cb82f1aff4623a5f650e8b9b2fb88966614b807cb1ed10"} Apr 24 16:41:32.011827 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:32.011689 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nfd4n" event={"ID":"0e87d719-99e3-4983-8268-513fafb03033","Type":"ContainerStarted","Data":"580354da45878eb9d7d19845e6e6a59a5364c6a75a9b31a22a038714d34e5c30"} Apr 24 16:41:32.014120 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:32.014091 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm" 
event={"ID":"d6ed4479-aced-4d35-9249-229096300dc7","Type":"ContainerStarted","Data":"37a467a5776d01a274ff3d2020fdd8038e8b1969274a1677557df8bb2e6fefca"} Apr 24 16:41:32.016635 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:32.016610 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj" event={"ID":"2a504e1c-5915-4d94-9934-e45731af60c7","Type":"ContainerStarted","Data":"54fb2cec46994254c96c8837de1922f346bca01864f185307c5e633916f11dd4"} Apr 24 16:41:32.029153 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:32.029102 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nfd4n" podStartSLOduration=2.029084183 podStartE2EDuration="2.029084183s" podCreationTimestamp="2026-04-24 16:41:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:41:32.027095529 +0000 UTC m=+127.960164749" watchObservedRunningTime="2026-04-24 16:41:32.029084183 +0000 UTC m=+127.962153379" Apr 24 16:41:32.147539 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:32.135086 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5nnkx\" (UID: \"a32d7d16-b384-47fc-a565-2b51f6f8c945\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx" Apr 24 16:41:32.147539 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:32.135602 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:32.147539 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:32.135679 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls podName:a32d7d16-b384-47fc-a565-2b51f6f8c945 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:34.135658385 +0000 UTC m=+130.068727585 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5nnkx" (UID: "a32d7d16-b384-47fc-a565-2b51f6f8c945") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:33.020343 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:33.020299 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-84cbm" event={"ID":"90bd04ca-0a65-47d9-a182-69a15ab185c9","Type":"ContainerStarted","Data":"e0ccf6dbb10addadfe5e2b72419465f9bbf5f3bc872273ad93d7cf0dc040db5c"} Apr 24 16:41:33.044990 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:33.044930 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-84cbm" podStartSLOduration=1.106633194 podStartE2EDuration="3.044909756s" podCreationTimestamp="2026-04-24 16:41:30 +0000 UTC" firstStartedPulling="2026-04-24 16:41:30.901120067 +0000 UTC m=+126.834189248" lastFinishedPulling="2026-04-24 16:41:32.839396625 +0000 UTC m=+128.772465810" observedRunningTime="2026-04-24 16:41:33.04411267 +0000 UTC m=+128.977181874" watchObservedRunningTime="2026-04-24 16:41:33.044909756 +0000 UTC m=+128.977978961" Apr 24 16:41:34.154553 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:34.154514 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-5nnkx\" (UID: \"a32d7d16-b384-47fc-a565-2b51f6f8c945\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx" Apr 24 16:41:34.155004 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:34.154695 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:34.155004 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:34.154794 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls podName:a32d7d16-b384-47fc-a565-2b51f6f8c945 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:38.154773486 +0000 UTC m=+134.087842675 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5nnkx" (UID: "a32d7d16-b384-47fc-a565-2b51f6f8c945") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:34.457947 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:34.457860 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs\") pod \"network-metrics-daemon-kmh29\" (UID: \"4af5b09f-ceea-413a-bec5-40a2b59c7ea3\") " pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:41:34.458076 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:34.458016 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 16:41:34.458114 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:34.458083 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs 
podName:4af5b09f-ceea-413a-bec5-40a2b59c7ea3 nodeName:}" failed. No retries permitted until 2026-04-24 16:43:36.458067883 +0000 UTC m=+252.391137064 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs") pod "network-metrics-daemon-kmh29" (UID: "4af5b09f-ceea-413a-bec5-40a2b59c7ea3") : secret "metrics-daemon-secret" not found Apr 24 16:41:36.030608 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:36.030580 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/0.log" Apr 24 16:41:36.031234 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:36.030625 2579 generic.go:358] "Generic (PLEG): container finished" podID="9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80" containerID="7cdfd4c55298aedb6fe81c52cfac3a6667e13336759eae603f9a6fa420fe1816" exitCode=255 Apr 24 16:41:36.031234 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:36.030715 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" event={"ID":"9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80","Type":"ContainerDied","Data":"7cdfd4c55298aedb6fe81c52cfac3a6667e13336759eae603f9a6fa420fe1816"} Apr 24 16:41:36.031234 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:36.030983 2579 scope.go:117] "RemoveContainer" containerID="7cdfd4c55298aedb6fe81c52cfac3a6667e13336759eae603f9a6fa420fe1816" Apr 24 16:41:36.032235 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:36.032202 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm" event={"ID":"d6ed4479-aced-4d35-9249-229096300dc7","Type":"ContainerStarted","Data":"7f834f42f241f76b01e2bcb754e6cb1a4d222ce3bf90684c79cd389ab066482e"} Apr 24 16:41:36.033842 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:41:36.033809 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj" event={"ID":"2a504e1c-5915-4d94-9934-e45731af60c7","Type":"ContainerStarted","Data":"ef8ddaf60e168dc86e5cc0f73a4212cffef4a13de94f1b2d0557837d77fc1c17"} Apr 24 16:41:36.035371 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:36.035347 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-4wsnt" event={"ID":"07d4b91a-a227-4b22-8f87-40c4e9c8139c","Type":"ContainerStarted","Data":"55895bf4fbc7baa1b4a5e9a36fe1debbfd2112185a4bb0b75bce30c0a4257966"} Apr 24 16:41:36.064064 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:36.064016 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj" podStartSLOduration=2.100691487 podStartE2EDuration="6.064000229s" podCreationTimestamp="2026-04-24 16:41:30 +0000 UTC" firstStartedPulling="2026-04-24 16:41:31.031005249 +0000 UTC m=+126.964074425" lastFinishedPulling="2026-04-24 16:41:34.99431399 +0000 UTC m=+130.927383167" observedRunningTime="2026-04-24 16:41:36.062911734 +0000 UTC m=+131.995980929" watchObservedRunningTime="2026-04-24 16:41:36.064000229 +0000 UTC m=+131.997069466" Apr 24 16:41:36.081195 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:36.081142 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-4wsnt" podStartSLOduration=2.3376454669999998 podStartE2EDuration="6.081125239s" podCreationTimestamp="2026-04-24 16:41:30 +0000 UTC" firstStartedPulling="2026-04-24 16:41:31.256798623 +0000 UTC m=+127.189867799" lastFinishedPulling="2026-04-24 16:41:35.000278373 +0000 UTC m=+130.933347571" observedRunningTime="2026-04-24 16:41:36.079329894 +0000 UTC m=+132.012399095" watchObservedRunningTime="2026-04-24 16:41:36.081125239 +0000 UTC 
m=+132.014194439" Apr 24 16:41:36.097406 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:36.097038 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm" podStartSLOduration=2.349001455 podStartE2EDuration="6.097020782s" podCreationTimestamp="2026-04-24 16:41:30 +0000 UTC" firstStartedPulling="2026-04-24 16:41:31.250327219 +0000 UTC m=+127.183396400" lastFinishedPulling="2026-04-24 16:41:34.998346534 +0000 UTC m=+130.931415727" observedRunningTime="2026-04-24 16:41:36.096592294 +0000 UTC m=+132.029661505" watchObservedRunningTime="2026-04-24 16:41:36.097020782 +0000 UTC m=+132.030089984" Apr 24 16:41:36.883052 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:36.883016 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-c59xq"] Apr 24 16:41:36.886358 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:36.886337 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c59xq" Apr 24 16:41:36.888450 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:36.888425 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 16:41:36.889014 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:36.888995 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 16:41:36.889119 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:36.888998 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-498p6\"" Apr 24 16:41:36.895038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:36.895010 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-c59xq"] Apr 24 16:41:36.977276 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:36.977242 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfp52\" (UniqueName: \"kubernetes.io/projected/16107c19-f352-4fb7-b6d8-6604b6f5ceb1-kube-api-access-xfp52\") pod \"migrator-74bb7799d9-c59xq\" (UID: \"16107c19-f352-4fb7-b6d8-6604b6f5ceb1\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c59xq" Apr 24 16:41:37.039317 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:37.039287 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/1.log" Apr 24 16:41:37.039705 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:37.039679 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/0.log" Apr 24 16:41:37.039785 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:37.039712 2579 generic.go:358] "Generic (PLEG): container finished" podID="9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80" containerID="44fe86a33b0dc2910c2eb18b1375deb9cc95e2773f127b8e7d8ccbebf1464a3c" exitCode=255 Apr 24 16:41:37.039844 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:37.039818 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" event={"ID":"9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80","Type":"ContainerDied","Data":"44fe86a33b0dc2910c2eb18b1375deb9cc95e2773f127b8e7d8ccbebf1464a3c"} Apr 24 16:41:37.039884 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:37.039870 2579 scope.go:117] "RemoveContainer" containerID="7cdfd4c55298aedb6fe81c52cfac3a6667e13336759eae603f9a6fa420fe1816" Apr 24 16:41:37.040109 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:37.040085 2579 scope.go:117] "RemoveContainer" containerID="44fe86a33b0dc2910c2eb18b1375deb9cc95e2773f127b8e7d8ccbebf1464a3c" Apr 24 16:41:37.040324 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:37.040297 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-52mlp_openshift-console-operator(9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80)\"" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" podUID="9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80" Apr 24 16:41:37.078287 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:37.078257 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfp52\" (UniqueName: \"kubernetes.io/projected/16107c19-f352-4fb7-b6d8-6604b6f5ceb1-kube-api-access-xfp52\") pod \"migrator-74bb7799d9-c59xq\" (UID: 
\"16107c19-f352-4fb7-b6d8-6604b6f5ceb1\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c59xq" Apr 24 16:41:37.086390 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:37.086364 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfp52\" (UniqueName: \"kubernetes.io/projected/16107c19-f352-4fb7-b6d8-6604b6f5ceb1-kube-api-access-xfp52\") pod \"migrator-74bb7799d9-c59xq\" (UID: \"16107c19-f352-4fb7-b6d8-6604b6f5ceb1\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c59xq" Apr 24 16:41:37.196453 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:37.196354 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c59xq" Apr 24 16:41:37.318454 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:37.318426 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-c59xq"] Apr 24 16:41:37.321408 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:41:37.321370 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16107c19_f352_4fb7_b6d8_6604b6f5ceb1.slice/crio-665b01a1b98330214de702cae31c86224a7839eb19d6fa072738c6d9d442a574 WatchSource:0}: Error finding container 665b01a1b98330214de702cae31c86224a7839eb19d6fa072738c6d9d442a574: Status 404 returned error can't find the container with id 665b01a1b98330214de702cae31c86224a7839eb19d6fa072738c6d9d442a574 Apr 24 16:41:38.044082 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:38.044050 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/1.log" Apr 24 16:41:38.044538 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:38.044470 2579 scope.go:117] "RemoveContainer" 
containerID="44fe86a33b0dc2910c2eb18b1375deb9cc95e2773f127b8e7d8ccbebf1464a3c" Apr 24 16:41:38.044711 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:38.044677 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-52mlp_openshift-console-operator(9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80)\"" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" podUID="9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80" Apr 24 16:41:38.045468 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:38.045441 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c59xq" event={"ID":"16107c19-f352-4fb7-b6d8-6604b6f5ceb1","Type":"ContainerStarted","Data":"665b01a1b98330214de702cae31c86224a7839eb19d6fa072738c6d9d442a574"} Apr 24 16:41:38.187213 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:38.187171 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5nnkx\" (UID: \"a32d7d16-b384-47fc-a565-2b51f6f8c945\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx" Apr 24 16:41:38.187405 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:38.187308 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:38.187484 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:38.187385 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls podName:a32d7d16-b384-47fc-a565-2b51f6f8c945 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:41:46.187364409 +0000 UTC m=+142.120433599 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5nnkx" (UID: "a32d7d16-b384-47fc-a565-2b51f6f8c945") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:38.444257 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:38.444226 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7grsg_53e5c63d-2459-4e1a-be74-d81070bbbeff/dns-node-resolver/0.log" Apr 24 16:41:39.049892 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.049855 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c59xq" event={"ID":"16107c19-f352-4fb7-b6d8-6604b6f5ceb1","Type":"ContainerStarted","Data":"0ed762a807c229cff17997816cbb8540fee6bb781a178e27d5adab347294fff8"} Apr 24 16:41:39.049892 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.049893 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c59xq" event={"ID":"16107c19-f352-4fb7-b6d8-6604b6f5ceb1","Type":"ContainerStarted","Data":"008690589dfa327e395f9669b7ddf256b7c46bb1ab02d2e539e963f545db31ef"} Apr 24 16:41:39.073572 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.073512 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-c59xq" podStartSLOduration=2.130506748 podStartE2EDuration="3.073495725s" podCreationTimestamp="2026-04-24 16:41:36 +0000 UTC" firstStartedPulling="2026-04-24 16:41:37.323364112 +0000 UTC m=+133.256433289" lastFinishedPulling="2026-04-24 16:41:38.266353088 +0000 UTC m=+134.199422266" observedRunningTime="2026-04-24 16:41:39.072494916 +0000 UTC m=+135.005564117" 
watchObservedRunningTime="2026-04-24 16:41:39.073495725 +0000 UTC m=+135.006564924" Apr 24 16:41:39.438391 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.438308 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xknth"] Apr 24 16:41:39.441361 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.441342 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-xknth" Apr 24 16:41:39.445304 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.445281 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 16:41:39.445304 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.445285 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 16:41:39.445577 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.445565 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 16:41:39.445621 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.445575 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 16:41:39.446056 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.446042 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-4zgwk\"" Apr 24 16:41:39.453317 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.453291 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xknth"] Apr 24 16:41:39.597199 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.597159 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c-signing-cabundle\") pod \"service-ca-865cb79987-xknth\" (UID: \"dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c\") " pod="openshift-service-ca/service-ca-865cb79987-xknth" Apr 24 16:41:39.597392 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.597258 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c-signing-key\") pod \"service-ca-865cb79987-xknth\" (UID: \"dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c\") " pod="openshift-service-ca/service-ca-865cb79987-xknth" Apr 24 16:41:39.597392 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.597288 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x42rr\" (UniqueName: \"kubernetes.io/projected/dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c-kube-api-access-x42rr\") pod \"service-ca-865cb79987-xknth\" (UID: \"dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c\") " pod="openshift-service-ca/service-ca-865cb79987-xknth" Apr 24 16:41:39.645946 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.645918 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kdlh8_801dc599-67b9-400f-a14c-835900dba24e/node-ca/0.log" Apr 24 16:41:39.698508 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.698402 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c-signing-key\") pod \"service-ca-865cb79987-xknth\" (UID: \"dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c\") " pod="openshift-service-ca/service-ca-865cb79987-xknth" Apr 24 16:41:39.698508 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.698454 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x42rr\" (UniqueName: 
\"kubernetes.io/projected/dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c-kube-api-access-x42rr\") pod \"service-ca-865cb79987-xknth\" (UID: \"dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c\") " pod="openshift-service-ca/service-ca-865cb79987-xknth" Apr 24 16:41:39.698770 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.698517 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c-signing-cabundle\") pod \"service-ca-865cb79987-xknth\" (UID: \"dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c\") " pod="openshift-service-ca/service-ca-865cb79987-xknth" Apr 24 16:41:39.699248 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.699222 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c-signing-cabundle\") pod \"service-ca-865cb79987-xknth\" (UID: \"dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c\") " pod="openshift-service-ca/service-ca-865cb79987-xknth" Apr 24 16:41:39.701095 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.701078 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c-signing-key\") pod \"service-ca-865cb79987-xknth\" (UID: \"dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c\") " pod="openshift-service-ca/service-ca-865cb79987-xknth" Apr 24 16:41:39.706703 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.706679 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x42rr\" (UniqueName: \"kubernetes.io/projected/dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c-kube-api-access-x42rr\") pod \"service-ca-865cb79987-xknth\" (UID: \"dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c\") " pod="openshift-service-ca/service-ca-865cb79987-xknth" Apr 24 16:41:39.750697 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.750662 2579 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-xknth" Apr 24 16:41:39.872013 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:39.871976 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xknth"] Apr 24 16:41:39.875131 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:41:39.875090 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbdd7d9e_7e7c_4d21_af8f_eb048dba3b5c.slice/crio-0ed9d8fd28f75e343f3c266cae1146a3f59022d889fa5793317f7fa025ffc737 WatchSource:0}: Error finding container 0ed9d8fd28f75e343f3c266cae1146a3f59022d889fa5793317f7fa025ffc737: Status 404 returned error can't find the container with id 0ed9d8fd28f75e343f3c266cae1146a3f59022d889fa5793317f7fa025ffc737 Apr 24 16:41:40.053298 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:40.053263 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-xknth" event={"ID":"dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c","Type":"ContainerStarted","Data":"9e77c952021b487323acd58aa885c30a55cf9c96e6c212bdb345674e159fcb2a"} Apr 24 16:41:40.053298 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:40.053302 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-xknth" event={"ID":"dbdd7d9e-7e7c-4d21-af8f-eb048dba3b5c","Type":"ContainerStarted","Data":"0ed9d8fd28f75e343f3c266cae1146a3f59022d889fa5793317f7fa025ffc737"} Apr 24 16:41:40.073542 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:40.073489 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-xknth" podStartSLOduration=1.073474341 podStartE2EDuration="1.073474341s" podCreationTimestamp="2026-04-24 16:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-24 16:41:40.07220505 +0000 UTC m=+136.005274248" watchObservedRunningTime="2026-04-24 16:41:40.073474341 +0000 UTC m=+136.006543539" Apr 24 16:41:40.777034 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:40.776991 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" Apr 24 16:41:40.777034 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:40.777040 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" Apr 24 16:41:40.777513 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:40.777499 2579 scope.go:117] "RemoveContainer" containerID="44fe86a33b0dc2910c2eb18b1375deb9cc95e2773f127b8e7d8ccbebf1464a3c" Apr 24 16:41:40.777752 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:40.777708 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-52mlp_openshift-console-operator(9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80)\"" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" podUID="9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80" Apr 24 16:41:46.254000 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:46.253958 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5nnkx\" (UID: \"a32d7d16-b384-47fc-a565-2b51f6f8c945\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx" Apr 24 16:41:46.254376 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:46.254102 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret 
"cluster-monitoring-operator-tls" not found Apr 24 16:41:46.254376 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:46.254168 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls podName:a32d7d16-b384-47fc-a565-2b51f6f8c945 nodeName:}" failed. No retries permitted until 2026-04-24 16:42:02.254150626 +0000 UTC m=+158.187219823 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5nnkx" (UID: "a32d7d16-b384-47fc-a565-2b51f6f8c945") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:55.616277 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:55.616245 2579 scope.go:117] "RemoveContainer" containerID="44fe86a33b0dc2910c2eb18b1375deb9cc95e2773f127b8e7d8ccbebf1464a3c" Apr 24 16:41:56.096438 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:56.096408 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log" Apr 24 16:41:56.096821 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:56.096805 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/1.log" Apr 24 16:41:56.096884 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:56.096838 2579 generic.go:358] "Generic (PLEG): container finished" podID="9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80" containerID="93d6c3ce2ad1197e27f605ad1f8cb8c37d942a83c58c0e48e6e6a432044c92e0" exitCode=255 Apr 24 16:41:56.096919 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:56.096895 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" event={"ID":"9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80","Type":"ContainerDied","Data":"93d6c3ce2ad1197e27f605ad1f8cb8c37d942a83c58c0e48e6e6a432044c92e0"} Apr 24 16:41:56.096952 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:56.096927 2579 scope.go:117] "RemoveContainer" containerID="44fe86a33b0dc2910c2eb18b1375deb9cc95e2773f127b8e7d8ccbebf1464a3c" Apr 24 16:41:56.100944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:56.100915 2579 scope.go:117] "RemoveContainer" containerID="93d6c3ce2ad1197e27f605ad1f8cb8c37d942a83c58c0e48e6e6a432044c92e0" Apr 24 16:41:56.101187 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:41:56.101169 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-52mlp_openshift-console-operator(9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80)\"" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" podUID="9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80" Apr 24 16:41:57.100131 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:41:57.100100 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log" Apr 24 16:42:00.536443 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:42:00.536398 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rgpmf" podUID="4b53d555-6ece-4be6-a70d-30c64956654b" Apr 24 16:42:00.542574 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:42:00.542535 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-dns/dns-default-pkmpl" podUID="fc3c012b-d60b-47fa-aa82-a2d3bb5649b3" Apr 24 16:42:00.632932 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:42:00.632889 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-kmh29" podUID="4af5b09f-ceea-413a-bec5-40a2b59c7ea3" Apr 24 16:42:00.776214 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:00.776180 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" Apr 24 16:42:00.776214 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:00.776221 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" Apr 24 16:42:00.776559 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:00.776546 2579 scope.go:117] "RemoveContainer" containerID="93d6c3ce2ad1197e27f605ad1f8cb8c37d942a83c58c0e48e6e6a432044c92e0" Apr 24 16:42:00.776763 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:42:00.776719 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-52mlp_openshift-console-operator(9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80)\"" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" podUID="9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80" Apr 24 16:42:01.110290 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.110254 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-pkmpl" Apr 24 16:42:01.416685 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.416652 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-d7n6q"] Apr 24 16:42:01.419876 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.419853 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d7n6q" Apr 24 16:42:01.422802 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.422773 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 16:42:01.422802 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.422788 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 16:42:01.422802 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.422777 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jgncl\"" Apr 24 16:42:01.429111 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.429086 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d7n6q"] Apr 24 16:42:01.474543 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.474499 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/15b2786f-0566-4570-a7f7-f09fd69c6f54-data-volume\") pod \"insights-runtime-extractor-d7n6q\" (UID: \"15b2786f-0566-4570-a7f7-f09fd69c6f54\") " pod="openshift-insights/insights-runtime-extractor-d7n6q" Apr 24 16:42:01.474769 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.474615 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwbx4\" (UniqueName: 
\"kubernetes.io/projected/15b2786f-0566-4570-a7f7-f09fd69c6f54-kube-api-access-zwbx4\") pod \"insights-runtime-extractor-d7n6q\" (UID: \"15b2786f-0566-4570-a7f7-f09fd69c6f54\") " pod="openshift-insights/insights-runtime-extractor-d7n6q" Apr 24 16:42:01.474769 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.474648 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/15b2786f-0566-4570-a7f7-f09fd69c6f54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d7n6q\" (UID: \"15b2786f-0566-4570-a7f7-f09fd69c6f54\") " pod="openshift-insights/insights-runtime-extractor-d7n6q" Apr 24 16:42:01.474769 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.474709 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/15b2786f-0566-4570-a7f7-f09fd69c6f54-crio-socket\") pod \"insights-runtime-extractor-d7n6q\" (UID: \"15b2786f-0566-4570-a7f7-f09fd69c6f54\") " pod="openshift-insights/insights-runtime-extractor-d7n6q" Apr 24 16:42:01.474927 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.474792 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/15b2786f-0566-4570-a7f7-f09fd69c6f54-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d7n6q\" (UID: \"15b2786f-0566-4570-a7f7-f09fd69c6f54\") " pod="openshift-insights/insights-runtime-extractor-d7n6q" Apr 24 16:42:01.575935 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.575894 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/15b2786f-0566-4570-a7f7-f09fd69c6f54-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d7n6q\" (UID: \"15b2786f-0566-4570-a7f7-f09fd69c6f54\") " 
pod="openshift-insights/insights-runtime-extractor-d7n6q" Apr 24 16:42:01.576399 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.575970 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/15b2786f-0566-4570-a7f7-f09fd69c6f54-data-volume\") pod \"insights-runtime-extractor-d7n6q\" (UID: \"15b2786f-0566-4570-a7f7-f09fd69c6f54\") " pod="openshift-insights/insights-runtime-extractor-d7n6q" Apr 24 16:42:01.576399 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.576000 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwbx4\" (UniqueName: \"kubernetes.io/projected/15b2786f-0566-4570-a7f7-f09fd69c6f54-kube-api-access-zwbx4\") pod \"insights-runtime-extractor-d7n6q\" (UID: \"15b2786f-0566-4570-a7f7-f09fd69c6f54\") " pod="openshift-insights/insights-runtime-extractor-d7n6q" Apr 24 16:42:01.576399 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.576020 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/15b2786f-0566-4570-a7f7-f09fd69c6f54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d7n6q\" (UID: \"15b2786f-0566-4570-a7f7-f09fd69c6f54\") " pod="openshift-insights/insights-runtime-extractor-d7n6q" Apr 24 16:42:01.576399 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.576051 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/15b2786f-0566-4570-a7f7-f09fd69c6f54-crio-socket\") pod \"insights-runtime-extractor-d7n6q\" (UID: \"15b2786f-0566-4570-a7f7-f09fd69c6f54\") " pod="openshift-insights/insights-runtime-extractor-d7n6q" Apr 24 16:42:01.576399 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.576120 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/15b2786f-0566-4570-a7f7-f09fd69c6f54-crio-socket\") pod \"insights-runtime-extractor-d7n6q\" (UID: \"15b2786f-0566-4570-a7f7-f09fd69c6f54\") " pod="openshift-insights/insights-runtime-extractor-d7n6q" Apr 24 16:42:01.576591 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.576417 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/15b2786f-0566-4570-a7f7-f09fd69c6f54-data-volume\") pod \"insights-runtime-extractor-d7n6q\" (UID: \"15b2786f-0566-4570-a7f7-f09fd69c6f54\") " pod="openshift-insights/insights-runtime-extractor-d7n6q" Apr 24 16:42:01.576591 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.576436 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/15b2786f-0566-4570-a7f7-f09fd69c6f54-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d7n6q\" (UID: \"15b2786f-0566-4570-a7f7-f09fd69c6f54\") " pod="openshift-insights/insights-runtime-extractor-d7n6q" Apr 24 16:42:01.578514 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.578495 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/15b2786f-0566-4570-a7f7-f09fd69c6f54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d7n6q\" (UID: \"15b2786f-0566-4570-a7f7-f09fd69c6f54\") " pod="openshift-insights/insights-runtime-extractor-d7n6q" Apr 24 16:42:01.590145 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.590122 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwbx4\" (UniqueName: \"kubernetes.io/projected/15b2786f-0566-4570-a7f7-f09fd69c6f54-kube-api-access-zwbx4\") pod \"insights-runtime-extractor-d7n6q\" (UID: \"15b2786f-0566-4570-a7f7-f09fd69c6f54\") " pod="openshift-insights/insights-runtime-extractor-d7n6q" Apr 24 16:42:01.729397 ip-10-0-129-227 kubenswrapper[2579]: 
I0424 16:42:01.729307 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d7n6q" Apr 24 16:42:01.865130 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:01.865091 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d7n6q"] Apr 24 16:42:01.871097 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:42:01.871057 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15b2786f_0566_4570_a7f7_f09fd69c6f54.slice/crio-e0d80a266420c00a0ae878979ea21793a2297f689ade136069af403c63cd9a02 WatchSource:0}: Error finding container e0d80a266420c00a0ae878979ea21793a2297f689ade136069af403c63cd9a02: Status 404 returned error can't find the container with id e0d80a266420c00a0ae878979ea21793a2297f689ade136069af403c63cd9a02 Apr 24 16:42:02.113986 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:02.113952 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d7n6q" event={"ID":"15b2786f-0566-4570-a7f7-f09fd69c6f54","Type":"ContainerStarted","Data":"2ad2199873003e67e3ddbe93c24cd8c061c4031f998ee8bb130681760373dd83"} Apr 24 16:42:02.113986 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:02.113988 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d7n6q" event={"ID":"15b2786f-0566-4570-a7f7-f09fd69c6f54","Type":"ContainerStarted","Data":"e0d80a266420c00a0ae878979ea21793a2297f689ade136069af403c63cd9a02"} Apr 24 16:42:02.280849 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:02.280805 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5nnkx\" (UID: 
\"a32d7d16-b384-47fc-a565-2b51f6f8c945\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx" Apr 24 16:42:02.283327 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:02.283299 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a32d7d16-b384-47fc-a565-2b51f6f8c945-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5nnkx\" (UID: \"a32d7d16-b384-47fc-a565-2b51f6f8c945\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx" Apr 24 16:42:02.579686 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:02.579664 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx" Apr 24 16:42:02.699176 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:02.699146 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx"] Apr 24 16:42:02.702499 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:42:02.702469 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda32d7d16_b384_47fc_a565_2b51f6f8c945.slice/crio-4c07c8bd6232e2ec4b1dde7fefecd916236d08109c0d02412783799b368bdab6 WatchSource:0}: Error finding container 4c07c8bd6232e2ec4b1dde7fefecd916236d08109c0d02412783799b368bdab6: Status 404 returned error can't find the container with id 4c07c8bd6232e2ec4b1dde7fefecd916236d08109c0d02412783799b368bdab6 Apr 24 16:42:03.118024 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:03.117977 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d7n6q" event={"ID":"15b2786f-0566-4570-a7f7-f09fd69c6f54","Type":"ContainerStarted","Data":"c997edb2ed45342b4f1d62b0ba72a475a480cd4e2d97dd642d39d2eab55249b7"} Apr 24 16:42:03.119265 ip-10-0-129-227 kubenswrapper[2579]: 
I0424 16:42:03.119222 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx" event={"ID":"a32d7d16-b384-47fc-a565-2b51f6f8c945","Type":"ContainerStarted","Data":"4c07c8bd6232e2ec4b1dde7fefecd916236d08109c0d02412783799b368bdab6"} Apr 24 16:42:05.126677 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:05.126640 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d7n6q" event={"ID":"15b2786f-0566-4570-a7f7-f09fd69c6f54","Type":"ContainerStarted","Data":"19e0585e0bbe373defc49b67c4b5b3193fd5aeabd2fe549295b070d7e9c318e2"} Apr 24 16:42:05.128151 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:05.128117 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx" event={"ID":"a32d7d16-b384-47fc-a565-2b51f6f8c945","Type":"ContainerStarted","Data":"5292d141695847bb161afaf7ec22dde8c0d13eefc988493c89119c8e6bfbbbcd"} Apr 24 16:42:05.149796 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:05.149717 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-d7n6q" podStartSLOduration=1.663951445 podStartE2EDuration="4.149704107s" podCreationTimestamp="2026-04-24 16:42:01 +0000 UTC" firstStartedPulling="2026-04-24 16:42:01.931215623 +0000 UTC m=+157.864284800" lastFinishedPulling="2026-04-24 16:42:04.416968283 +0000 UTC m=+160.350037462" observedRunningTime="2026-04-24 16:42:05.149161205 +0000 UTC m=+161.082230405" watchObservedRunningTime="2026-04-24 16:42:05.149704107 +0000 UTC m=+161.082773305" Apr 24 16:42:05.164852 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:05.164802 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5nnkx" podStartSLOduration=33.448402207 podStartE2EDuration="35.164787531s" podCreationTimestamp="2026-04-24 
16:41:30 +0000 UTC" firstStartedPulling="2026-04-24 16:42:02.704320306 +0000 UTC m=+158.637389483" lastFinishedPulling="2026-04-24 16:42:04.42070563 +0000 UTC m=+160.353774807" observedRunningTime="2026-04-24 16:42:05.164247893 +0000 UTC m=+161.097317136" watchObservedRunningTime="2026-04-24 16:42:05.164787531 +0000 UTC m=+161.097856729" Apr 24 16:42:05.406938 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:05.406840 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl" Apr 24 16:42:05.409230 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:05.409211 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc3c012b-d60b-47fa-aa82-a2d3bb5649b3-metrics-tls\") pod \"dns-default-pkmpl\" (UID: \"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3\") " pod="openshift-dns/dns-default-pkmpl" Apr 24 16:42:05.507913 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:05.507872 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert\") pod \"ingress-canary-rgpmf\" (UID: \"4b53d555-6ece-4be6-a70d-30c64956654b\") " pod="openshift-ingress-canary/ingress-canary-rgpmf" Apr 24 16:42:05.510389 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:05.510369 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b53d555-6ece-4be6-a70d-30c64956654b-cert\") pod \"ingress-canary-rgpmf\" (UID: \"4b53d555-6ece-4be6-a70d-30c64956654b\") " pod="openshift-ingress-canary/ingress-canary-rgpmf" Apr 24 16:42:05.613874 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:05.613843 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-dns\"/\"dns-dockercfg-ljxll\"" Apr 24 16:42:05.622395 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:05.622358 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pkmpl" Apr 24 16:42:05.762598 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:05.762564 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pkmpl"] Apr 24 16:42:05.765683 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:42:05.765653 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc3c012b_d60b_47fa_aa82_a2d3bb5649b3.slice/crio-85be51c53b760bdbf3edf7a5fe78fe51c33aa10f291e7027feaaf84ab24c4cee WatchSource:0}: Error finding container 85be51c53b760bdbf3edf7a5fe78fe51c33aa10f291e7027feaaf84ab24c4cee: Status 404 returned error can't find the container with id 85be51c53b760bdbf3edf7a5fe78fe51c33aa10f291e7027feaaf84ab24c4cee Apr 24 16:42:06.131595 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:06.131559 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pkmpl" event={"ID":"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3","Type":"ContainerStarted","Data":"85be51c53b760bdbf3edf7a5fe78fe51c33aa10f291e7027feaaf84ab24c4cee"} Apr 24 16:42:08.018780 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.018726 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-chvg7"] Apr 24 16:42:08.020675 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.020660 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" Apr 24 16:42:08.024879 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.024851 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-8vq94\"" Apr 24 16:42:08.024879 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.024851 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 16:42:08.025042 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.024905 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 16:42:08.025042 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.024853 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 16:42:08.032247 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.032212 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-chvg7"] Apr 24 16:42:08.127468 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.127430 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fe961a5e-d33f-4c72-a253-e83a37663457-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-chvg7\" (UID: \"fe961a5e-d33f-4c72-a253-e83a37663457\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" Apr 24 16:42:08.127631 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.127476 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe961a5e-d33f-4c72-a253-e83a37663457-prometheus-operator-tls\") pod 
\"prometheus-operator-5676c8c784-chvg7\" (UID: \"fe961a5e-d33f-4c72-a253-e83a37663457\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" Apr 24 16:42:08.127631 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.127546 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fe961a5e-d33f-4c72-a253-e83a37663457-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-chvg7\" (UID: \"fe961a5e-d33f-4c72-a253-e83a37663457\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" Apr 24 16:42:08.127631 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.127599 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7ghh\" (UniqueName: \"kubernetes.io/projected/fe961a5e-d33f-4c72-a253-e83a37663457-kube-api-access-s7ghh\") pod \"prometheus-operator-5676c8c784-chvg7\" (UID: \"fe961a5e-d33f-4c72-a253-e83a37663457\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" Apr 24 16:42:08.139091 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.139062 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pkmpl" event={"ID":"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3","Type":"ContainerStarted","Data":"dc6f2b7e23c1031039c3c0f5eea7f01e0e8b50a60ec4fa8d16c5dd104bfdcca4"} Apr 24 16:42:08.139240 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.139097 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pkmpl" event={"ID":"fc3c012b-d60b-47fa-aa82-a2d3bb5649b3","Type":"ContainerStarted","Data":"d7f119af108b02f7586e69c486f3211aa9c04ba1562b33cc7ebcf3c4c1ee7c31"} Apr 24 16:42:08.139240 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.139174 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-pkmpl" Apr 24 16:42:08.157460 ip-10-0-129-227 kubenswrapper[2579]: 
I0424 16:42:08.157414 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pkmpl" podStartSLOduration=129.640012562 podStartE2EDuration="2m11.157401464s" podCreationTimestamp="2026-04-24 16:39:57 +0000 UTC" firstStartedPulling="2026-04-24 16:42:05.767619065 +0000 UTC m=+161.700688243" lastFinishedPulling="2026-04-24 16:42:07.285007967 +0000 UTC m=+163.218077145" observedRunningTime="2026-04-24 16:42:08.156462292 +0000 UTC m=+164.089531491" watchObservedRunningTime="2026-04-24 16:42:08.157401464 +0000 UTC m=+164.090470662" Apr 24 16:42:08.228903 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.228867 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fe961a5e-d33f-4c72-a253-e83a37663457-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-chvg7\" (UID: \"fe961a5e-d33f-4c72-a253-e83a37663457\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" Apr 24 16:42:08.228903 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.228903 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe961a5e-d33f-4c72-a253-e83a37663457-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-chvg7\" (UID: \"fe961a5e-d33f-4c72-a253-e83a37663457\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" Apr 24 16:42:08.229107 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:42:08.229018 2579 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 24 16:42:08.229107 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:42:08.229081 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe961a5e-d33f-4c72-a253-e83a37663457-prometheus-operator-tls 
podName:fe961a5e-d33f-4c72-a253-e83a37663457 nodeName:}" failed. No retries permitted until 2026-04-24 16:42:08.729062273 +0000 UTC m=+164.662131451 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/fe961a5e-d33f-4c72-a253-e83a37663457-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-chvg7" (UID: "fe961a5e-d33f-4c72-a253-e83a37663457") : secret "prometheus-operator-tls" not found Apr 24 16:42:08.229107 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.229098 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fe961a5e-d33f-4c72-a253-e83a37663457-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-chvg7\" (UID: \"fe961a5e-d33f-4c72-a253-e83a37663457\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" Apr 24 16:42:08.229279 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.229168 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7ghh\" (UniqueName: \"kubernetes.io/projected/fe961a5e-d33f-4c72-a253-e83a37663457-kube-api-access-s7ghh\") pod \"prometheus-operator-5676c8c784-chvg7\" (UID: \"fe961a5e-d33f-4c72-a253-e83a37663457\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" Apr 24 16:42:08.229855 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.229832 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fe961a5e-d33f-4c72-a253-e83a37663457-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-chvg7\" (UID: \"fe961a5e-d33f-4c72-a253-e83a37663457\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" Apr 24 16:42:08.231491 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.231462 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fe961a5e-d33f-4c72-a253-e83a37663457-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-chvg7\" (UID: \"fe961a5e-d33f-4c72-a253-e83a37663457\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" Apr 24 16:42:08.238106 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.238083 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7ghh\" (UniqueName: \"kubernetes.io/projected/fe961a5e-d33f-4c72-a253-e83a37663457-kube-api-access-s7ghh\") pod \"prometheus-operator-5676c8c784-chvg7\" (UID: \"fe961a5e-d33f-4c72-a253-e83a37663457\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" Apr 24 16:42:08.732550 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.732493 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe961a5e-d33f-4c72-a253-e83a37663457-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-chvg7\" (UID: \"fe961a5e-d33f-4c72-a253-e83a37663457\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" Apr 24 16:42:08.735060 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.735028 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe961a5e-d33f-4c72-a253-e83a37663457-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-chvg7\" (UID: \"fe961a5e-d33f-4c72-a253-e83a37663457\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" Apr 24 16:42:08.930454 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:08.930414 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" Apr 24 16:42:09.049708 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:09.049679 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-chvg7"] Apr 24 16:42:09.052433 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:42:09.052404 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe961a5e_d33f_4c72_a253_e83a37663457.slice/crio-53265254f3c18feb577d2e343f4a747e2fc7b8d6c3aa05f5832eda857b1a9681 WatchSource:0}: Error finding container 53265254f3c18feb577d2e343f4a747e2fc7b8d6c3aa05f5832eda857b1a9681: Status 404 returned error can't find the container with id 53265254f3c18feb577d2e343f4a747e2fc7b8d6c3aa05f5832eda857b1a9681 Apr 24 16:42:09.142143 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:09.142099 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" event={"ID":"fe961a5e-d33f-4c72-a253-e83a37663457","Type":"ContainerStarted","Data":"53265254f3c18feb577d2e343f4a747e2fc7b8d6c3aa05f5832eda857b1a9681"} Apr 24 16:42:11.149880 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:11.149847 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" event={"ID":"fe961a5e-d33f-4c72-a253-e83a37663457","Type":"ContainerStarted","Data":"e45e980b5cad60dc76e98ae9e329e26f0ce4ff7a82041a1957658580923ae59c"} Apr 24 16:42:11.149880 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:11.149882 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" event={"ID":"fe961a5e-d33f-4c72-a253-e83a37663457","Type":"ContainerStarted","Data":"300aa6b743875a51ec6890e4c024051c9aa7d1a76526c7d553309c2775b9241d"} Apr 24 16:42:11.169884 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:11.169839 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-chvg7" podStartSLOduration=2.100099771 podStartE2EDuration="3.16982386s" podCreationTimestamp="2026-04-24 16:42:08 +0000 UTC" firstStartedPulling="2026-04-24 16:42:09.054193698 +0000 UTC m=+164.987262875" lastFinishedPulling="2026-04-24 16:42:10.123917786 +0000 UTC m=+166.056986964" observedRunningTime="2026-04-24 16:42:11.168752509 +0000 UTC m=+167.101821712" watchObservedRunningTime="2026-04-24 16:42:11.16982386 +0000 UTC m=+167.102893057" Apr 24 16:42:13.362253 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.362219 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k"] Apr 24 16:42:13.365996 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.365977 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" Apr 24 16:42:13.369990 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.369966 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 16:42:13.370124 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.369971 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 16:42:13.370397 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.370381 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-4l6wx\"" Apr 24 16:42:13.376594 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.376573 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k"] Apr 24 16:42:13.394156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.394131 2579 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2tmrq"] Apr 24 16:42:13.397858 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.397838 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.400511 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.400489 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 16:42:13.400618 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.400522 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 16:42:13.400618 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.400574 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-4464q\"" Apr 24 16:42:13.400755 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.400640 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 16:42:13.469055 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.469020 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e68c601e-ace9-461f-8270-139af643bb24-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.469055 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.469054 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e68c601e-ace9-461f-8270-139af643bb24-node-exporter-accelerators-collector-config\") pod 
\"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.469289 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.469075 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08d16e0f-be67-4d98-93a9-25a13038cbb7-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-cfv6k\" (UID: \"08d16e0f-be67-4d98-93a9-25a13038cbb7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" Apr 24 16:42:13.469289 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.469098 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e68c601e-ace9-461f-8270-139af643bb24-node-exporter-wtmp\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.469289 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.469163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvcj9\" (UniqueName: \"kubernetes.io/projected/e68c601e-ace9-461f-8270-139af643bb24-kube-api-access-vvcj9\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.469289 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.469204 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b797\" (UniqueName: \"kubernetes.io/projected/08d16e0f-be67-4d98-93a9-25a13038cbb7-kube-api-access-9b797\") pod \"openshift-state-metrics-9d44df66c-cfv6k\" (UID: \"08d16e0f-be67-4d98-93a9-25a13038cbb7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" Apr 24 16:42:13.469289 ip-10-0-129-227 kubenswrapper[2579]: I0424 
16:42:13.469267 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e68c601e-ace9-461f-8270-139af643bb24-sys\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.469692 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.469327 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e68c601e-ace9-461f-8270-139af643bb24-node-exporter-tls\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.469692 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.469389 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/08d16e0f-be67-4d98-93a9-25a13038cbb7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-cfv6k\" (UID: \"08d16e0f-be67-4d98-93a9-25a13038cbb7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" Apr 24 16:42:13.469692 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.469430 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e68c601e-ace9-461f-8270-139af643bb24-root\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.469692 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.469460 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/08d16e0f-be67-4d98-93a9-25a13038cbb7-openshift-state-metrics-kube-rbac-proxy-config\") pod 
\"openshift-state-metrics-9d44df66c-cfv6k\" (UID: \"08d16e0f-be67-4d98-93a9-25a13038cbb7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" Apr 24 16:42:13.469692 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.469490 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e68c601e-ace9-461f-8270-139af643bb24-metrics-client-ca\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.469692 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.469528 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e68c601e-ace9-461f-8270-139af643bb24-node-exporter-textfile\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.570372 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.570335 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e68c601e-ace9-461f-8270-139af643bb24-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.570372 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.570376 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e68c601e-ace9-461f-8270-139af643bb24-node-exporter-accelerators-collector-config\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.570605 ip-10-0-129-227 kubenswrapper[2579]: I0424 
16:42:13.570393 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08d16e0f-be67-4d98-93a9-25a13038cbb7-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-cfv6k\" (UID: \"08d16e0f-be67-4d98-93a9-25a13038cbb7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" Apr 24 16:42:13.570605 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.570484 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e68c601e-ace9-461f-8270-139af643bb24-node-exporter-wtmp\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.570605 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.570531 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vvcj9\" (UniqueName: \"kubernetes.io/projected/e68c601e-ace9-461f-8270-139af643bb24-kube-api-access-vvcj9\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.570605 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.570558 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9b797\" (UniqueName: \"kubernetes.io/projected/08d16e0f-be67-4d98-93a9-25a13038cbb7-kube-api-access-9b797\") pod \"openshift-state-metrics-9d44df66c-cfv6k\" (UID: \"08d16e0f-be67-4d98-93a9-25a13038cbb7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" Apr 24 16:42:13.570605 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.570588 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e68c601e-ace9-461f-8270-139af643bb24-sys\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " 
pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.570933 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.570621 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e68c601e-ace9-461f-8270-139af643bb24-node-exporter-tls\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.570933 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.570679 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/08d16e0f-be67-4d98-93a9-25a13038cbb7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-cfv6k\" (UID: \"08d16e0f-be67-4d98-93a9-25a13038cbb7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" Apr 24 16:42:13.570933 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.570689 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e68c601e-ace9-461f-8270-139af643bb24-node-exporter-wtmp\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.570933 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.570719 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e68c601e-ace9-461f-8270-139af643bb24-root\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.570933 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.570774 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/08d16e0f-be67-4d98-93a9-25a13038cbb7-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-cfv6k\" (UID: \"08d16e0f-be67-4d98-93a9-25a13038cbb7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" Apr 24 16:42:13.570933 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.570779 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e68c601e-ace9-461f-8270-139af643bb24-sys\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.570933 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.570805 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e68c601e-ace9-461f-8270-139af643bb24-metrics-client-ca\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.570933 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.570841 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e68c601e-ace9-461f-8270-139af643bb24-node-exporter-textfile\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.571391 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.570960 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e68c601e-ace9-461f-8270-139af643bb24-root\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.571391 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.571076 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e68c601e-ace9-461f-8270-139af643bb24-node-exporter-accelerators-collector-config\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.571391 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.571143 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e68c601e-ace9-461f-8270-139af643bb24-node-exporter-textfile\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.571391 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.571162 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08d16e0f-be67-4d98-93a9-25a13038cbb7-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-cfv6k\" (UID: \"08d16e0f-be67-4d98-93a9-25a13038cbb7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" Apr 24 16:42:13.573476 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.573453 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e68c601e-ace9-461f-8270-139af643bb24-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.573804 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.573788 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e68c601e-ace9-461f-8270-139af643bb24-metrics-client-ca\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.573990 
ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.573968 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/08d16e0f-be67-4d98-93a9-25a13038cbb7-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-cfv6k\" (UID: \"08d16e0f-be67-4d98-93a9-25a13038cbb7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" Apr 24 16:42:13.574315 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.574299 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e68c601e-ace9-461f-8270-139af643bb24-node-exporter-tls\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.574465 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.574445 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/08d16e0f-be67-4d98-93a9-25a13038cbb7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-cfv6k\" (UID: \"08d16e0f-be67-4d98-93a9-25a13038cbb7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" Apr 24 16:42:13.581576 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.581551 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvcj9\" (UniqueName: \"kubernetes.io/projected/e68c601e-ace9-461f-8270-139af643bb24-kube-api-access-vvcj9\") pod \"node-exporter-2tmrq\" (UID: \"e68c601e-ace9-461f-8270-139af643bb24\") " pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.582193 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.582162 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b797\" (UniqueName: 
\"kubernetes.io/projected/08d16e0f-be67-4d98-93a9-25a13038cbb7-kube-api-access-9b797\") pod \"openshift-state-metrics-9d44df66c-cfv6k\" (UID: \"08d16e0f-be67-4d98-93a9-25a13038cbb7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" Apr 24 16:42:13.616204 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.616124 2579 scope.go:117] "RemoveContainer" containerID="93d6c3ce2ad1197e27f605ad1f8cb8c37d942a83c58c0e48e6e6a432044c92e0" Apr 24 16:42:13.616433 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:42:13.616402 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-52mlp_openshift-console-operator(9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80)\"" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" podUID="9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80" Apr 24 16:42:13.677395 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.677367 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" Apr 24 16:42:13.708206 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.708173 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-2tmrq" Apr 24 16:42:13.718574 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:42:13.718539 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode68c601e_ace9_461f_8270_139af643bb24.slice/crio-2a250d5987c63cab5e09825907b2a690266eb0026a097827bec5e13391446084 WatchSource:0}: Error finding container 2a250d5987c63cab5e09825907b2a690266eb0026a097827bec5e13391446084: Status 404 returned error can't find the container with id 2a250d5987c63cab5e09825907b2a690266eb0026a097827bec5e13391446084 Apr 24 16:42:13.805688 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:13.805660 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k"] Apr 24 16:42:13.808583 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:42:13.808554 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08d16e0f_be67_4d98_93a9_25a13038cbb7.slice/crio-699342bed3106bd83ee919c5990f5fa8c23de1cf6a151bf2dfb1f443405213ea WatchSource:0}: Error finding container 699342bed3106bd83ee919c5990f5fa8c23de1cf6a151bf2dfb1f443405213ea: Status 404 returned error can't find the container with id 699342bed3106bd83ee919c5990f5fa8c23de1cf6a151bf2dfb1f443405213ea Apr 24 16:42:14.161177 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:14.161089 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2tmrq" event={"ID":"e68c601e-ace9-461f-8270-139af643bb24","Type":"ContainerStarted","Data":"2a250d5987c63cab5e09825907b2a690266eb0026a097827bec5e13391446084"} Apr 24 16:42:14.163074 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:14.163008 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" 
event={"ID":"08d16e0f-be67-4d98-93a9-25a13038cbb7","Type":"ContainerStarted","Data":"76fe6755b9614ddd2bca4895d7283560fdbec1cfe0b01a6c96fbd21c59378624"} Apr 24 16:42:14.163074 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:14.163078 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" event={"ID":"08d16e0f-be67-4d98-93a9-25a13038cbb7","Type":"ContainerStarted","Data":"865e783b5972f0213949837d07275e6bacc4926aa49e21b09d44661b8806f8d5"} Apr 24 16:42:14.163263 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:14.163093 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" event={"ID":"08d16e0f-be67-4d98-93a9-25a13038cbb7","Type":"ContainerStarted","Data":"699342bed3106bd83ee919c5990f5fa8c23de1cf6a151bf2dfb1f443405213ea"} Apr 24 16:42:14.619489 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:14.619456 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rgpmf" Apr 24 16:42:14.621983 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:14.621800 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2lclw\"" Apr 24 16:42:14.630225 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:14.630200 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rgpmf" Apr 24 16:42:14.794861 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:14.794817 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rgpmf"] Apr 24 16:42:14.799276 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:42:14.799241 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b53d555_6ece_4be6_a70d_30c64956654b.slice/crio-04609ac4d879b1d93855fc5c687c1c03d90ded2d12b024b3279ebf1fb6a88ba0 WatchSource:0}: Error finding container 04609ac4d879b1d93855fc5c687c1c03d90ded2d12b024b3279ebf1fb6a88ba0: Status 404 returned error can't find the container with id 04609ac4d879b1d93855fc5c687c1c03d90ded2d12b024b3279ebf1fb6a88ba0 Apr 24 16:42:15.166654 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:15.166614 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rgpmf" event={"ID":"4b53d555-6ece-4be6-a70d-30c64956654b","Type":"ContainerStarted","Data":"04609ac4d879b1d93855fc5c687c1c03d90ded2d12b024b3279ebf1fb6a88ba0"} Apr 24 16:42:15.168155 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:15.168124 2579 generic.go:358] "Generic (PLEG): container finished" podID="e68c601e-ace9-461f-8270-139af643bb24" containerID="4f0d0dce4eb4fc8f4933a2cc3e3c6a794dea5ac5278210394538427c8951e4e2" exitCode=0 Apr 24 16:42:15.168289 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:15.168207 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2tmrq" event={"ID":"e68c601e-ace9-461f-8270-139af643bb24","Type":"ContainerDied","Data":"4f0d0dce4eb4fc8f4933a2cc3e3c6a794dea5ac5278210394538427c8951e4e2"} Apr 24 16:42:15.615944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:15.615907 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:42:16.174792 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:16.174726 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2tmrq" event={"ID":"e68c601e-ace9-461f-8270-139af643bb24","Type":"ContainerStarted","Data":"910b193c80be725d2a6264cd36f5fad278ba7283706dce03250b65d3dc779356"} Apr 24 16:42:16.174792 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:16.174798 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2tmrq" event={"ID":"e68c601e-ace9-461f-8270-139af643bb24","Type":"ContainerStarted","Data":"f0631e50ada1ab2271c2744d66af24243432547f6de1ec88d96e21c7de1a69d1"} Apr 24 16:42:16.176918 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:16.176880 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" event={"ID":"08d16e0f-be67-4d98-93a9-25a13038cbb7","Type":"ContainerStarted","Data":"19434889fede35ef6ba4c8a52b48f35fcd3c665eefae2919404bdf2cf1d512a9"} Apr 24 16:42:16.195402 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:16.195341 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2tmrq" podStartSLOduration=2.452353826 podStartE2EDuration="3.195323818s" podCreationTimestamp="2026-04-24 16:42:13 +0000 UTC" firstStartedPulling="2026-04-24 16:42:13.720318897 +0000 UTC m=+169.653388084" lastFinishedPulling="2026-04-24 16:42:14.463288896 +0000 UTC m=+170.396358076" observedRunningTime="2026-04-24 16:42:16.193177224 +0000 UTC m=+172.126246438" watchObservedRunningTime="2026-04-24 16:42:16.195323818 +0000 UTC m=+172.128393022" Apr 24 16:42:16.209930 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:16.209886 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-cfv6k" 
podStartSLOduration=2.017818605 podStartE2EDuration="3.209871116s" podCreationTimestamp="2026-04-24 16:42:13 +0000 UTC" firstStartedPulling="2026-04-24 16:42:13.928120343 +0000 UTC m=+169.861189520" lastFinishedPulling="2026-04-24 16:42:15.12017284 +0000 UTC m=+171.053242031" observedRunningTime="2026-04-24 16:42:16.209463474 +0000 UTC m=+172.142532674" watchObservedRunningTime="2026-04-24 16:42:16.209871116 +0000 UTC m=+172.142940316" Apr 24 16:42:17.180810 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.180757 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rgpmf" event={"ID":"4b53d555-6ece-4be6-a70d-30c64956654b","Type":"ContainerStarted","Data":"bdd4a714c12966e9e8fc3a1812ad6d212adb741531fe9e9093a3a810498ed068"} Apr 24 16:42:17.198585 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.198529 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rgpmf" podStartSLOduration=138.561761829 podStartE2EDuration="2m20.198510671s" podCreationTimestamp="2026-04-24 16:39:57 +0000 UTC" firstStartedPulling="2026-04-24 16:42:14.802217398 +0000 UTC m=+170.735286583" lastFinishedPulling="2026-04-24 16:42:16.438966233 +0000 UTC m=+172.372035425" observedRunningTime="2026-04-24 16:42:17.197350109 +0000 UTC m=+173.130419307" watchObservedRunningTime="2026-04-24 16:42:17.198510671 +0000 UTC m=+173.131579872" Apr 24 16:42:17.471245 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.471163 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-65f49765b8-dzndp"] Apr 24 16:42:17.474689 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.474673 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.477748 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.477714 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 24 16:42:17.477884 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.477761 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-5574x\"" Apr 24 16:42:17.477974 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.477898 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 24 16:42:17.477974 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.477909 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 24 16:42:17.477974 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.477899 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 24 16:42:17.478463 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.478443 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 24 16:42:17.478519 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.478450 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-6ca35ou1tkalq\"" Apr 24 16:42:17.486283 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.486261 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-65f49765b8-dzndp"] Apr 24 16:42:17.505464 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.505433 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.505583 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.505473 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwnlj\" (UniqueName: \"kubernetes.io/projected/6c27e1fe-fd53-4300-9942-8c630cdcafe5-kube-api-access-vwnlj\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.505583 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.505511 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.505583 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.505536 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.505583 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.505559 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-thanos-querier-tls\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.505705 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.505584 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.505705 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.505612 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c27e1fe-fd53-4300-9942-8c630cdcafe5-metrics-client-ca\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.505705 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.505666 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-grpc-tls\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.606954 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.606897 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.606954 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.606962 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwnlj\" (UniqueName: \"kubernetes.io/projected/6c27e1fe-fd53-4300-9942-8c630cdcafe5-kube-api-access-vwnlj\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.607210 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.607012 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.607210 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.607047 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.607210 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.607079 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-thanos-querier-tls\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: 
\"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.607210 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.607105 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.607398 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.607201 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c27e1fe-fd53-4300-9942-8c630cdcafe5-metrics-client-ca\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.607398 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.607242 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-grpc-tls\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.608058 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.608031 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c27e1fe-fd53-4300-9942-8c630cdcafe5-metrics-client-ca\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.609855 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.609831 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.609954 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.609923 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.610246 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.610221 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-thanos-querier-tls\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.610338 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.610248 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.610395 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.610382 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.610450 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.610403 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6c27e1fe-fd53-4300-9942-8c630cdcafe5-secret-grpc-tls\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.615975 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.615953 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwnlj\" (UniqueName: \"kubernetes.io/projected/6c27e1fe-fd53-4300-9942-8c630cdcafe5-kube-api-access-vwnlj\") pod \"thanos-querier-65f49765b8-dzndp\" (UID: \"6c27e1fe-fd53-4300-9942-8c630cdcafe5\") " pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.691377 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.691328 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7468bc468d-h9m7f"] Apr 24 16:42:17.697081 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.697054 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.699536 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.699510 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 24 16:42:17.699665 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.699561 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 16:42:17.699665 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.699599 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 24 16:42:17.699665 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.699561 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 24 16:42:17.699665 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.699658 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-88gdt\"" Apr 24 16:42:17.699936 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.699919 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-5bi8v6soeneij\"" Apr 24 16:42:17.707611 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.707576 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7468bc468d-h9m7f"] Apr 24 16:42:17.784360 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.784314 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:17.809257 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.809223 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1892d90d-bef5-49d7-87af-a352bf60c2a3-secret-metrics-server-client-certs\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.809448 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.809274 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1892d90d-bef5-49d7-87af-a352bf60c2a3-audit-log\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.809448 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.809369 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1892d90d-bef5-49d7-87af-a352bf60c2a3-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.809448 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.809403 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrjwj\" (UniqueName: \"kubernetes.io/projected/1892d90d-bef5-49d7-87af-a352bf60c2a3-kube-api-access-xrjwj\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.809448 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:42:17.809433 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1892d90d-bef5-49d7-87af-a352bf60c2a3-secret-metrics-server-tls\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.809597 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.809506 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1892d90d-bef5-49d7-87af-a352bf60c2a3-client-ca-bundle\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.809597 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.809558 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1892d90d-bef5-49d7-87af-a352bf60c2a3-metrics-server-audit-profiles\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.907762 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.907701 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-65f49765b8-dzndp"] Apr 24 16:42:17.910037 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.910015 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1892d90d-bef5-49d7-87af-a352bf60c2a3-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " 
pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.910109 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.910054 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrjwj\" (UniqueName: \"kubernetes.io/projected/1892d90d-bef5-49d7-87af-a352bf60c2a3-kube-api-access-xrjwj\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.910109 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.910076 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1892d90d-bef5-49d7-87af-a352bf60c2a3-secret-metrics-server-tls\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.910258 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.910209 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1892d90d-bef5-49d7-87af-a352bf60c2a3-client-ca-bundle\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.910394 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.910268 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1892d90d-bef5-49d7-87af-a352bf60c2a3-metrics-server-audit-profiles\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.910394 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.910324 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1892d90d-bef5-49d7-87af-a352bf60c2a3-secret-metrics-server-client-certs\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.910394 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.910362 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1892d90d-bef5-49d7-87af-a352bf60c2a3-audit-log\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.910865 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.910752 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1892d90d-bef5-49d7-87af-a352bf60c2a3-audit-log\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.910865 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.910846 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1892d90d-bef5-49d7-87af-a352bf60c2a3-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.911358 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.911312 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1892d90d-bef5-49d7-87af-a352bf60c2a3-metrics-server-audit-profiles\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " 
pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.911676 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:42:17.911637 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c27e1fe_fd53_4300_9942_8c630cdcafe5.slice/crio-68bd6ca35dc104325d9304f40a559f0b8a88f72475b03dccce862a3a33ccb180 WatchSource:0}: Error finding container 68bd6ca35dc104325d9304f40a559f0b8a88f72475b03dccce862a3a33ccb180: Status 404 returned error can't find the container with id 68bd6ca35dc104325d9304f40a559f0b8a88f72475b03dccce862a3a33ccb180 Apr 24 16:42:17.913012 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.912992 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1892d90d-bef5-49d7-87af-a352bf60c2a3-secret-metrics-server-tls\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.913112 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.913096 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1892d90d-bef5-49d7-87af-a352bf60c2a3-client-ca-bundle\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.913230 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:17.913211 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1892d90d-bef5-49d7-87af-a352bf60c2a3-secret-metrics-server-client-certs\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:17.917551 ip-10-0-129-227 kubenswrapper[2579]: I0424 
16:42:17.917531 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrjwj\" (UniqueName: \"kubernetes.io/projected/1892d90d-bef5-49d7-87af-a352bf60c2a3-kube-api-access-xrjwj\") pod \"metrics-server-7468bc468d-h9m7f\" (UID: \"1892d90d-bef5-49d7-87af-a352bf60c2a3\") " pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:18.007405 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:18.007361 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:18.137401 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:18.137374 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7468bc468d-h9m7f"] Apr 24 16:42:18.144370 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:18.144342 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pkmpl" Apr 24 16:42:18.179579 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:18.179547 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rpv8t"] Apr 24 16:42:18.184548 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:18.184522 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rpv8t" Apr 24 16:42:18.185456 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:18.185429 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" event={"ID":"1892d90d-bef5-49d7-87af-a352bf60c2a3","Type":"ContainerStarted","Data":"9d02c5da5016c8499d265a9ab22bac4791092874835053bce6b530b535b5631c"} Apr 24 16:42:18.186832 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:18.186771 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" event={"ID":"6c27e1fe-fd53-4300-9942-8c630cdcafe5","Type":"ContainerStarted","Data":"68bd6ca35dc104325d9304f40a559f0b8a88f72475b03dccce862a3a33ccb180"} Apr 24 16:42:18.186945 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:18.186898 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-gb4t2\"" Apr 24 16:42:18.187002 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:18.186941 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 24 16:42:18.193108 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:18.193083 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rpv8t"] Apr 24 16:42:18.213284 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:18.213251 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d732ca6b-ee1c-4dac-8c6b-17d88a89520a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rpv8t\" (UID: \"d732ca6b-ee1c-4dac-8c6b-17d88a89520a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rpv8t" Apr 24 16:42:18.314172 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:18.314077 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d732ca6b-ee1c-4dac-8c6b-17d88a89520a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rpv8t\" (UID: \"d732ca6b-ee1c-4dac-8c6b-17d88a89520a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rpv8t" Apr 24 16:42:18.314312 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:42:18.314226 2579 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 24 16:42:18.314312 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:42:18.314297 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d732ca6b-ee1c-4dac-8c6b-17d88a89520a-monitoring-plugin-cert podName:d732ca6b-ee1c-4dac-8c6b-17d88a89520a nodeName:}" failed. No retries permitted until 2026-04-24 16:42:18.814281947 +0000 UTC m=+174.747351125 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/d732ca6b-ee1c-4dac-8c6b-17d88a89520a-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-rpv8t" (UID: "d732ca6b-ee1c-4dac-8c6b-17d88a89520a") : secret "monitoring-plugin-cert" not found Apr 24 16:42:18.819284 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:18.819243 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d732ca6b-ee1c-4dac-8c6b-17d88a89520a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rpv8t\" (UID: \"d732ca6b-ee1c-4dac-8c6b-17d88a89520a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rpv8t" Apr 24 16:42:18.822978 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:18.822943 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d732ca6b-ee1c-4dac-8c6b-17d88a89520a-monitoring-plugin-cert\") pod 
\"monitoring-plugin-7dccd58f55-rpv8t\" (UID: \"d732ca6b-ee1c-4dac-8c6b-17d88a89520a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rpv8t" Apr 24 16:42:19.095003 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:19.094924 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rpv8t" Apr 24 16:42:20.304243 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:20.304223 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rpv8t"] Apr 24 16:42:20.307126 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:42:20.307099 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd732ca6b_ee1c_4dac_8c6b_17d88a89520a.slice/crio-175eb55307f79042d5b6861936a45c305c7291c63c3f9e353414155e1f92523b WatchSource:0}: Error finding container 175eb55307f79042d5b6861936a45c305c7291c63c3f9e353414155e1f92523b: Status 404 returned error can't find the container with id 175eb55307f79042d5b6861936a45c305c7291c63c3f9e353414155e1f92523b Apr 24 16:42:21.200188 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:21.200142 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" event={"ID":"6c27e1fe-fd53-4300-9942-8c630cdcafe5","Type":"ContainerStarted","Data":"83663a8ed7e909187d6d769f2b0d1063756bdfcf296a6f5b467962afbb0a0382"} Apr 24 16:42:21.200188 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:21.200188 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" event={"ID":"6c27e1fe-fd53-4300-9942-8c630cdcafe5","Type":"ContainerStarted","Data":"54bd0389d6bb94a63f6598b7e30be0ba9cca7aa2ba3cf367064e53d024ff496f"} Apr 24 16:42:21.200418 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:21.200202 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" event={"ID":"6c27e1fe-fd53-4300-9942-8c630cdcafe5","Type":"ContainerStarted","Data":"7db28ee66c6196ae63a5a5f1bdf281553539e3cbc552f8c60c44023701aee458"} Apr 24 16:42:21.201478 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:21.201448 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rpv8t" event={"ID":"d732ca6b-ee1c-4dac-8c6b-17d88a89520a","Type":"ContainerStarted","Data":"175eb55307f79042d5b6861936a45c305c7291c63c3f9e353414155e1f92523b"} Apr 24 16:42:21.202993 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:21.202967 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" event={"ID":"1892d90d-bef5-49d7-87af-a352bf60c2a3","Type":"ContainerStarted","Data":"26dd94da99df722e4dbf6d00b719145c92ed489dd5e7045c930759ad3090388e"} Apr 24 16:42:21.221234 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:21.221182 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" podStartSLOduration=2.186036394 podStartE2EDuration="4.221158763s" podCreationTimestamp="2026-04-24 16:42:17 +0000 UTC" firstStartedPulling="2026-04-24 16:42:18.145147511 +0000 UTC m=+174.078216688" lastFinishedPulling="2026-04-24 16:42:20.180269873 +0000 UTC m=+176.113339057" observedRunningTime="2026-04-24 16:42:21.219851955 +0000 UTC m=+177.152921155" watchObservedRunningTime="2026-04-24 16:42:21.221158763 +0000 UTC m=+177.154227965" Apr 24 16:42:22.208209 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:22.208174 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" event={"ID":"6c27e1fe-fd53-4300-9942-8c630cdcafe5","Type":"ContainerStarted","Data":"ec3be136e977c2b4590c6cdaac5e79b4f5c4e3f1f449dc705a103e88f92883a8"} Apr 24 16:42:22.208209 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:22.208212 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" event={"ID":"6c27e1fe-fd53-4300-9942-8c630cdcafe5","Type":"ContainerStarted","Data":"ffc69850d2ff3330b71f284fd7f910cbb283e86bb2832782418f042113049611"} Apr 24 16:42:22.208669 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:22.208227 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" event={"ID":"6c27e1fe-fd53-4300-9942-8c630cdcafe5","Type":"ContainerStarted","Data":"29d36cd3779491b715d18869ea0dd8bdec8a5a75603fcf199dd9f7315178abff"} Apr 24 16:42:22.208669 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:22.208367 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:22.209476 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:22.209452 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rpv8t" event={"ID":"d732ca6b-ee1c-4dac-8c6b-17d88a89520a","Type":"ContainerStarted","Data":"85e19dc791edbbc287a7b003336bd9bb0dadf427451333a9a031bb0fb4727001"} Apr 24 16:42:22.231593 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:22.231544 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" podStartSLOduration=1.686275485 podStartE2EDuration="5.231529432s" podCreationTimestamp="2026-04-24 16:42:17 +0000 UTC" firstStartedPulling="2026-04-24 16:42:17.913788713 +0000 UTC m=+173.846857891" lastFinishedPulling="2026-04-24 16:42:21.459042652 +0000 UTC m=+177.392111838" observedRunningTime="2026-04-24 16:42:22.229670746 +0000 UTC m=+178.162739947" watchObservedRunningTime="2026-04-24 16:42:22.231529432 +0000 UTC m=+178.164598631" Apr 24 16:42:22.244193 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:22.244133 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rpv8t" podStartSLOduration=3.045054284 podStartE2EDuration="4.244118096s" podCreationTimestamp="2026-04-24 16:42:18 +0000 UTC" firstStartedPulling="2026-04-24 16:42:20.310139683 +0000 UTC m=+176.243208860" lastFinishedPulling="2026-04-24 16:42:21.50920349 +0000 UTC m=+177.442272672" observedRunningTime="2026-04-24 16:42:22.243020566 +0000 UTC m=+178.176089766" watchObservedRunningTime="2026-04-24 16:42:22.244118096 +0000 UTC m=+178.177187295" Apr 24 16:42:23.212537 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:23.212501 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rpv8t" Apr 24 16:42:23.217352 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:23.217330 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rpv8t" Apr 24 16:42:27.616226 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:27.616190 2579 scope.go:117] "RemoveContainer" containerID="93d6c3ce2ad1197e27f605ad1f8cb8c37d942a83c58c0e48e6e6a432044c92e0" Apr 24 16:42:28.180826 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:28.180790 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-n8jmb"] Apr 24 16:42:28.184140 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:28.184119 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-n8jmb" Apr 24 16:42:28.186423 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:28.186400 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 16:42:28.186668 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:28.186648 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-6b9s8\"" Apr 24 16:42:28.186755 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:28.186648 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 16:42:28.196296 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:28.196276 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-n8jmb"] Apr 24 16:42:28.219553 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:28.219531 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-65f49765b8-dzndp" Apr 24 16:42:28.229609 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:28.229589 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log" Apr 24 16:42:28.229757 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:28.229661 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" event={"ID":"9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80","Type":"ContainerStarted","Data":"578cde8fa5b96144167c9c5dc115ec6be3691eadf892192d660ad8db74367731"} Apr 24 16:42:28.229957 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:28.229940 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" Apr 24 16:42:28.234399 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:42:28.234375 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" Apr 24 16:42:28.302823 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:28.302790 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvc7m\" (UniqueName: \"kubernetes.io/projected/9d205aa0-a444-4516-a2f2-f9d5e15b0a24-kube-api-access-jvc7m\") pod \"downloads-6bcc868b7-n8jmb\" (UID: \"9d205aa0-a444-4516-a2f2-f9d5e15b0a24\") " pod="openshift-console/downloads-6bcc868b7-n8jmb" Apr 24 16:42:28.307080 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:28.307036 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-52mlp" podStartSLOduration=54.245805562 podStartE2EDuration="58.307021808s" podCreationTimestamp="2026-04-24 16:41:30 +0000 UTC" firstStartedPulling="2026-04-24 16:41:30.932717485 +0000 UTC m=+126.865786680" lastFinishedPulling="2026-04-24 16:41:34.993933735 +0000 UTC m=+130.927002926" observedRunningTime="2026-04-24 16:42:28.305719563 +0000 UTC m=+184.238788764" watchObservedRunningTime="2026-04-24 16:42:28.307021808 +0000 UTC m=+184.240091047" Apr 24 16:42:28.404384 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:28.404346 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvc7m\" (UniqueName: \"kubernetes.io/projected/9d205aa0-a444-4516-a2f2-f9d5e15b0a24-kube-api-access-jvc7m\") pod \"downloads-6bcc868b7-n8jmb\" (UID: \"9d205aa0-a444-4516-a2f2-f9d5e15b0a24\") " pod="openshift-console/downloads-6bcc868b7-n8jmb" Apr 24 16:42:28.415216 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:28.415187 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvc7m\" (UniqueName: \"kubernetes.io/projected/9d205aa0-a444-4516-a2f2-f9d5e15b0a24-kube-api-access-jvc7m\") pod 
\"downloads-6bcc868b7-n8jmb\" (UID: \"9d205aa0-a444-4516-a2f2-f9d5e15b0a24\") " pod="openshift-console/downloads-6bcc868b7-n8jmb" Apr 24 16:42:28.493170 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:28.493075 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-n8jmb" Apr 24 16:42:28.634101 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:28.634073 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-n8jmb"] Apr 24 16:42:28.635701 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:42:28.635672 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d205aa0_a444_4516_a2f2_f9d5e15b0a24.slice/crio-5de0961fb9e116d8db9244431832b46ec63b5c36b0b5cd0e8e1f6f5cb18f62b5 WatchSource:0}: Error finding container 5de0961fb9e116d8db9244431832b46ec63b5c36b0b5cd0e8e1f6f5cb18f62b5: Status 404 returned error can't find the container with id 5de0961fb9e116d8db9244431832b46ec63b5c36b0b5cd0e8e1f6f5cb18f62b5 Apr 24 16:42:29.234475 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:29.234428 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-n8jmb" event={"ID":"9d205aa0-a444-4516-a2f2-f9d5e15b0a24","Type":"ContainerStarted","Data":"5de0961fb9e116d8db9244431832b46ec63b5c36b0b5cd0e8e1f6f5cb18f62b5"} Apr 24 16:42:38.007625 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.007590 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:38.008107 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.007680 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:38.332349 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.332266 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-6df7cf8bd5-ljcpn"] Apr 24 16:42:38.336148 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.336122 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.342074 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.342043 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 16:42:38.342266 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.342048 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 16:42:38.342583 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.342562 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 16:42:38.342823 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.342583 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 16:42:38.342913 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.342894 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-ctfdb\"" Apr 24 16:42:38.342961 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.342905 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 16:42:38.350204 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.350179 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6df7cf8bd5-ljcpn"] Apr 24 16:42:38.503101 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.503065 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-oauth-config\") pod 
\"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.503285 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.503119 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-service-ca\") pod \"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.503285 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.503219 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-oauth-serving-cert\") pod \"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.503285 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.503257 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-serving-cert\") pod \"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.503417 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.503287 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-config\") pod \"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.503417 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.503340 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgqfk\" (UniqueName: \"kubernetes.io/projected/ca64442b-d234-464e-8df4-4fcd6ff1aca7-kube-api-access-cgqfk\") pod \"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.604762 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.604649 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-config\") pod \"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.604949 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.604756 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cgqfk\" (UniqueName: \"kubernetes.io/projected/ca64442b-d234-464e-8df4-4fcd6ff1aca7-kube-api-access-cgqfk\") pod \"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.604949 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.604799 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-oauth-config\") pod \"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.604949 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.604847 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-service-ca\") pod \"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.604949 
ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.604897 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-oauth-serving-cert\") pod \"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.604949 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.604944 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-serving-cert\") pod \"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.605556 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.605527 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-config\") pod \"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.605898 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.605769 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-service-ca\") pod \"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.606154 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.606125 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-oauth-serving-cert\") pod \"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " 
pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.607866 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.607842 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-serving-cert\") pod \"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.608044 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.608019 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-oauth-config\") pod \"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.624492 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.624466 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgqfk\" (UniqueName: \"kubernetes.io/projected/ca64442b-d234-464e-8df4-4fcd6ff1aca7-kube-api-access-cgqfk\") pod \"console-6df7cf8bd5-ljcpn\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:38.647398 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:38.647359 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:42:44.543343 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:44.543317 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6df7cf8bd5-ljcpn"] Apr 24 16:42:44.557978 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:42:44.557946 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca64442b_d234_464e_8df4_4fcd6ff1aca7.slice/crio-74956b91b165250b7f4c6d616701093a465897764ba7afe5baded131c75911fb WatchSource:0}: Error finding container 74956b91b165250b7f4c6d616701093a465897764ba7afe5baded131c75911fb: Status 404 returned error can't find the container with id 74956b91b165250b7f4c6d616701093a465897764ba7afe5baded131c75911fb Apr 24 16:42:45.289460 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:45.289419 2579 generic.go:358] "Generic (PLEG): container finished" podID="d6ed4479-aced-4d35-9249-229096300dc7" containerID="7f834f42f241f76b01e2bcb754e6cb1a4d222ce3bf90684c79cd389ab066482e" exitCode=0 Apr 24 16:42:45.289990 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:45.289500 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm" event={"ID":"d6ed4479-aced-4d35-9249-229096300dc7","Type":"ContainerDied","Data":"7f834f42f241f76b01e2bcb754e6cb1a4d222ce3bf90684c79cd389ab066482e"} Apr 24 16:42:45.289990 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:45.289961 2579 scope.go:117] "RemoveContainer" containerID="7f834f42f241f76b01e2bcb754e6cb1a4d222ce3bf90684c79cd389ab066482e" Apr 24 16:42:45.292480 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:45.292411 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-n8jmb" 
event={"ID":"9d205aa0-a444-4516-a2f2-f9d5e15b0a24","Type":"ContainerStarted","Data":"edc14488ddceafa0c4283a98b084d86bf9df64dfa5f8b44b77bbd21260adf157"} Apr 24 16:42:45.293577 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:45.293551 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-n8jmb" Apr 24 16:42:45.295826 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:45.295800 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6df7cf8bd5-ljcpn" event={"ID":"ca64442b-d234-464e-8df4-4fcd6ff1aca7","Type":"ContainerStarted","Data":"74956b91b165250b7f4c6d616701093a465897764ba7afe5baded131c75911fb"} Apr 24 16:42:45.305239 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:45.305210 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-n8jmb" Apr 24 16:42:45.337660 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:45.337518 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-n8jmb" podStartSLOduration=1.459004728 podStartE2EDuration="17.337497182s" podCreationTimestamp="2026-04-24 16:42:28 +0000 UTC" firstStartedPulling="2026-04-24 16:42:28.63803987 +0000 UTC m=+184.571109061" lastFinishedPulling="2026-04-24 16:42:44.516532326 +0000 UTC m=+200.449601515" observedRunningTime="2026-04-24 16:42:45.33130534 +0000 UTC m=+201.264374541" watchObservedRunningTime="2026-04-24 16:42:45.337497182 +0000 UTC m=+201.270566382" Apr 24 16:42:46.305395 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:46.305301 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dghtm" event={"ID":"d6ed4479-aced-4d35-9249-229096300dc7","Type":"ContainerStarted","Data":"d815efc11232af24193a7658897dbb0a8f7a9040c46f33c7cab985bda7b76a87"} Apr 24 16:42:46.308318 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:42:46.307816 2579 generic.go:358] "Generic (PLEG): container finished" podID="2a504e1c-5915-4d94-9934-e45731af60c7" containerID="ef8ddaf60e168dc86e5cc0f73a4212cffef4a13de94f1b2d0557837d77fc1c17" exitCode=0 Apr 24 16:42:46.308318 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:46.308084 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj" event={"ID":"2a504e1c-5915-4d94-9934-e45731af60c7","Type":"ContainerDied","Data":"ef8ddaf60e168dc86e5cc0f73a4212cffef4a13de94f1b2d0557837d77fc1c17"} Apr 24 16:42:46.308489 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:46.308422 2579 scope.go:117] "RemoveContainer" containerID="ef8ddaf60e168dc86e5cc0f73a4212cffef4a13de94f1b2d0557837d77fc1c17" Apr 24 16:42:47.316108 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:47.316056 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-84fzj" event={"ID":"2a504e1c-5915-4d94-9934-e45731af60c7","Type":"ContainerStarted","Data":"df1fa34e1d2d8ab264283a763509664f5332b65efc4007ffef4a65a58b53a3cd"} Apr 24 16:42:49.323768 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:49.323706 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6df7cf8bd5-ljcpn" event={"ID":"ca64442b-d234-464e-8df4-4fcd6ff1aca7","Type":"ContainerStarted","Data":"a2c57554ef56050aed955a309bd2c6d65253dea190bbb91e1bff9d0586f8553b"} Apr 24 16:42:49.343706 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:49.343645 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6df7cf8bd5-ljcpn" podStartSLOduration=7.644339174 podStartE2EDuration="11.343624384s" podCreationTimestamp="2026-04-24 16:42:38 +0000 UTC" firstStartedPulling="2026-04-24 16:42:44.560149377 +0000 UTC m=+200.493218557" lastFinishedPulling="2026-04-24 16:42:48.259434588 +0000 UTC m=+204.192503767" 
observedRunningTime="2026-04-24 16:42:49.342949404 +0000 UTC m=+205.276018603" watchObservedRunningTime="2026-04-24 16:42:49.343624384 +0000 UTC m=+205.276693584" Apr 24 16:42:52.606758 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:52.606697 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6df7cf8bd5-ljcpn"] Apr 24 16:42:58.013446 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:58.013416 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:58.017482 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:58.017452 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7468bc468d-h9m7f" Apr 24 16:42:58.647803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:42:58.647770 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:43:06.376289 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:06.376258 2579 generic.go:358] "Generic (PLEG): container finished" podID="07d4b91a-a227-4b22-8f87-40c4e9c8139c" containerID="55895bf4fbc7baa1b4a5e9a36fe1debbfd2112185a4bb0b75bce30c0a4257966" exitCode=0 Apr 24 16:43:06.376709 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:06.376332 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-4wsnt" event={"ID":"07d4b91a-a227-4b22-8f87-40c4e9c8139c","Type":"ContainerDied","Data":"55895bf4fbc7baa1b4a5e9a36fe1debbfd2112185a4bb0b75bce30c0a4257966"} Apr 24 16:43:06.376709 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:06.376672 2579 scope.go:117] "RemoveContainer" containerID="55895bf4fbc7baa1b4a5e9a36fe1debbfd2112185a4bb0b75bce30c0a4257966" Apr 24 16:43:07.381147 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:07.381120 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-585dfdc468-4wsnt" event={"ID":"07d4b91a-a227-4b22-8f87-40c4e9c8139c","Type":"ContainerStarted","Data":"3154e66c8e9112f7173f515211abc9c4f27f73bc21c17675a3f734fcfa003524"} Apr 24 16:43:08.300217 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:08.300181 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-5nnkx_a32d7d16-b384-47fc-a565-2b51f6f8c945/cluster-monitoring-operator/0.log" Apr 24 16:43:09.095786 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:09.095757 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7468bc468d-h9m7f_1892d90d-bef5-49d7-87af-a352bf60c2a3/metrics-server/0.log" Apr 24 16:43:09.294999 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:09.294949 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-rpv8t_d732ca6b-ee1c-4dac-8c6b-17d88a89520a/monitoring-plugin/0.log" Apr 24 16:43:09.495920 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:09.495838 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2tmrq_e68c601e-ace9-461f-8270-139af643bb24/init-textfile/0.log" Apr 24 16:43:09.696166 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:09.696124 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2tmrq_e68c601e-ace9-461f-8270-139af643bb24/node-exporter/0.log" Apr 24 16:43:09.896632 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:09.896607 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2tmrq_e68c601e-ace9-461f-8270-139af643bb24/kube-rbac-proxy/0.log" Apr 24 16:43:11.295831 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:11.295802 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-cfv6k_08d16e0f-be67-4d98-93a9-25a13038cbb7/kube-rbac-proxy-main/0.log" Apr 24 16:43:11.496029 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:11.495980 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-cfv6k_08d16e0f-be67-4d98-93a9-25a13038cbb7/kube-rbac-proxy-self/0.log" Apr 24 16:43:11.695358 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:11.695255 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-cfv6k_08d16e0f-be67-4d98-93a9-25a13038cbb7/openshift-state-metrics/0.log" Apr 24 16:43:13.296753 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:13.296710 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-chvg7_fe961a5e-d33f-4c72-a253-e83a37663457/prometheus-operator/0.log" Apr 24 16:43:13.495800 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:13.495773 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-chvg7_fe961a5e-d33f-4c72-a253-e83a37663457/kube-rbac-proxy/0.log" Apr 24 16:43:13.896655 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:13.896627 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65f49765b8-dzndp_6c27e1fe-fd53-4300-9942-8c630cdcafe5/thanos-query/0.log" Apr 24 16:43:14.095349 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:14.095327 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65f49765b8-dzndp_6c27e1fe-fd53-4300-9942-8c630cdcafe5/kube-rbac-proxy-web/0.log" Apr 24 16:43:14.296546 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:14.296493 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-65f49765b8-dzndp_6c27e1fe-fd53-4300-9942-8c630cdcafe5/kube-rbac-proxy/0.log" Apr 24 16:43:14.495750 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:14.495706 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65f49765b8-dzndp_6c27e1fe-fd53-4300-9942-8c630cdcafe5/prom-label-proxy/0.log" Apr 24 16:43:14.695319 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:14.695224 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65f49765b8-dzndp_6c27e1fe-fd53-4300-9942-8c630cdcafe5/kube-rbac-proxy-rules/0.log" Apr 24 16:43:14.896225 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:14.896184 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65f49765b8-dzndp_6c27e1fe-fd53-4300-9942-8c630cdcafe5/kube-rbac-proxy-metrics/0.log" Apr 24 16:43:15.295394 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:15.295364 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log" Apr 24 16:43:15.498526 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:15.498496 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/3.log" Apr 24 16:43:15.895553 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:15.895509 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6df7cf8bd5-ljcpn_ca64442b-d234-464e-8df4-4fcd6ff1aca7/console/0.log" Apr 24 16:43:16.097525 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:16.097496 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-n8jmb_9d205aa0-a444-4516-a2f2-f9d5e15b0a24/download-server/0.log" Apr 24 16:43:17.631243 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:43:17.631183 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6df7cf8bd5-ljcpn" podUID="ca64442b-d234-464e-8df4-4fcd6ff1aca7" containerName="console" containerID="cri-o://a2c57554ef56050aed955a309bd2c6d65253dea190bbb91e1bff9d0586f8553b" gracePeriod=15 Apr 24 16:43:17.906744 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:17.906703 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6df7cf8bd5-ljcpn_ca64442b-d234-464e-8df4-4fcd6ff1aca7/console/0.log" Apr 24 16:43:17.906918 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:17.906794 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:43:17.963624 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:17.963589 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-oauth-serving-cert\") pod \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " Apr 24 16:43:17.963822 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:17.963677 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-oauth-config\") pod \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " Apr 24 16:43:17.963822 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:17.963710 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-config\") pod \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " Apr 24 16:43:17.963822 ip-10-0-129-227 kubenswrapper[2579]: I0424 
16:43:17.963758 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgqfk\" (UniqueName: \"kubernetes.io/projected/ca64442b-d234-464e-8df4-4fcd6ff1aca7-kube-api-access-cgqfk\") pod \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " Apr 24 16:43:17.963822 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:17.963792 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-service-ca\") pod \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " Apr 24 16:43:17.964024 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:17.963832 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-serving-cert\") pod \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\" (UID: \"ca64442b-d234-464e-8df4-4fcd6ff1aca7\") " Apr 24 16:43:17.964103 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:17.964072 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ca64442b-d234-464e-8df4-4fcd6ff1aca7" (UID: "ca64442b-d234-464e-8df4-4fcd6ff1aca7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:43:17.964184 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:17.964152 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-config" (OuterVolumeSpecName: "console-config") pod "ca64442b-d234-464e-8df4-4fcd6ff1aca7" (UID: "ca64442b-d234-464e-8df4-4fcd6ff1aca7"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:43:17.964564 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:17.964501 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-service-ca" (OuterVolumeSpecName: "service-ca") pod "ca64442b-d234-464e-8df4-4fcd6ff1aca7" (UID: "ca64442b-d234-464e-8df4-4fcd6ff1aca7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:43:17.966075 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:17.966045 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ca64442b-d234-464e-8df4-4fcd6ff1aca7" (UID: "ca64442b-d234-464e-8df4-4fcd6ff1aca7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:17.966194 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:17.966160 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca64442b-d234-464e-8df4-4fcd6ff1aca7-kube-api-access-cgqfk" (OuterVolumeSpecName: "kube-api-access-cgqfk") pod "ca64442b-d234-464e-8df4-4fcd6ff1aca7" (UID: "ca64442b-d234-464e-8df4-4fcd6ff1aca7"). InnerVolumeSpecName "kube-api-access-cgqfk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:43:17.966250 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:17.966216 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ca64442b-d234-464e-8df4-4fcd6ff1aca7" (UID: "ca64442b-d234-464e-8df4-4fcd6ff1aca7"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:18.064779 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:18.064750 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-serving-cert\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:43:18.064779 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:18.064777 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-oauth-serving-cert\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:43:18.064779 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:18.064786 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-oauth-config\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:43:18.065034 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:18.064796 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-console-config\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:43:18.065034 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:18.064804 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cgqfk\" (UniqueName: \"kubernetes.io/projected/ca64442b-d234-464e-8df4-4fcd6ff1aca7-kube-api-access-cgqfk\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:43:18.065034 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:18.064814 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca64442b-d234-464e-8df4-4fcd6ff1aca7-service-ca\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:43:18.413261 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:43:18.413225 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6df7cf8bd5-ljcpn_ca64442b-d234-464e-8df4-4fcd6ff1aca7/console/0.log" Apr 24 16:43:18.413457 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:18.413271 2579 generic.go:358] "Generic (PLEG): container finished" podID="ca64442b-d234-464e-8df4-4fcd6ff1aca7" containerID="a2c57554ef56050aed955a309bd2c6d65253dea190bbb91e1bff9d0586f8553b" exitCode=2 Apr 24 16:43:18.413457 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:18.413360 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6df7cf8bd5-ljcpn" Apr 24 16:43:18.413457 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:18.413388 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6df7cf8bd5-ljcpn" event={"ID":"ca64442b-d234-464e-8df4-4fcd6ff1aca7","Type":"ContainerDied","Data":"a2c57554ef56050aed955a309bd2c6d65253dea190bbb91e1bff9d0586f8553b"} Apr 24 16:43:18.413457 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:18.413435 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6df7cf8bd5-ljcpn" event={"ID":"ca64442b-d234-464e-8df4-4fcd6ff1aca7","Type":"ContainerDied","Data":"74956b91b165250b7f4c6d616701093a465897764ba7afe5baded131c75911fb"} Apr 24 16:43:18.413457 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:18.413455 2579 scope.go:117] "RemoveContainer" containerID="a2c57554ef56050aed955a309bd2c6d65253dea190bbb91e1bff9d0586f8553b" Apr 24 16:43:18.422178 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:18.422159 2579 scope.go:117] "RemoveContainer" containerID="a2c57554ef56050aed955a309bd2c6d65253dea190bbb91e1bff9d0586f8553b" Apr 24 16:43:18.422447 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:43:18.422423 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a2c57554ef56050aed955a309bd2c6d65253dea190bbb91e1bff9d0586f8553b\": container with ID starting with a2c57554ef56050aed955a309bd2c6d65253dea190bbb91e1bff9d0586f8553b not found: ID does not exist" containerID="a2c57554ef56050aed955a309bd2c6d65253dea190bbb91e1bff9d0586f8553b" Apr 24 16:43:18.422520 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:18.422460 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2c57554ef56050aed955a309bd2c6d65253dea190bbb91e1bff9d0586f8553b"} err="failed to get container status \"a2c57554ef56050aed955a309bd2c6d65253dea190bbb91e1bff9d0586f8553b\": rpc error: code = NotFound desc = could not find container \"a2c57554ef56050aed955a309bd2c6d65253dea190bbb91e1bff9d0586f8553b\": container with ID starting with a2c57554ef56050aed955a309bd2c6d65253dea190bbb91e1bff9d0586f8553b not found: ID does not exist" Apr 24 16:43:18.435588 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:18.435548 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6df7cf8bd5-ljcpn"] Apr 24 16:43:18.440968 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:18.440944 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6df7cf8bd5-ljcpn"] Apr 24 16:43:18.619821 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:18.619791 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca64442b-d234-464e-8df4-4fcd6ff1aca7" path="/var/lib/kubelet/pods/ca64442b-d234-464e-8df4-4fcd6ff1aca7/volumes" Apr 24 16:43:36.534322 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:36.534221 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs\") pod \"network-metrics-daemon-kmh29\" (UID: \"4af5b09f-ceea-413a-bec5-40a2b59c7ea3\") " pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:43:36.536762 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:43:36.536724 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4af5b09f-ceea-413a-bec5-40a2b59c7ea3-metrics-certs\") pod \"network-metrics-daemon-kmh29\" (UID: \"4af5b09f-ceea-413a-bec5-40a2b59c7ea3\") " pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:43:36.618554 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:36.618526 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-czswq\"" Apr 24 16:43:36.627173 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:36.627148 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kmh29" Apr 24 16:43:36.750938 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:36.750908 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kmh29"] Apr 24 16:43:36.753901 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:43:36.753867 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4af5b09f_ceea_413a_bec5_40a2b59c7ea3.slice/crio-4db6269e79c5224a3e6c341613ddf53b2892d79c626e6381579e7c35609c9a22 WatchSource:0}: Error finding container 4db6269e79c5224a3e6c341613ddf53b2892d79c626e6381579e7c35609c9a22: Status 404 returned error can't find the container with id 4db6269e79c5224a3e6c341613ddf53b2892d79c626e6381579e7c35609c9a22 Apr 24 16:43:37.472387 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:37.472352 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kmh29" event={"ID":"4af5b09f-ceea-413a-bec5-40a2b59c7ea3","Type":"ContainerStarted","Data":"4db6269e79c5224a3e6c341613ddf53b2892d79c626e6381579e7c35609c9a22"} Apr 24 16:43:38.480324 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:38.480227 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kmh29" event={"ID":"4af5b09f-ceea-413a-bec5-40a2b59c7ea3","Type":"ContainerStarted","Data":"078552f37f4c27e1107a9fa59d2979d66fd9bef05d476394fe7d130c3223d5ac"} Apr 24 16:43:38.480324 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:38.480267 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kmh29" event={"ID":"4af5b09f-ceea-413a-bec5-40a2b59c7ea3","Type":"ContainerStarted","Data":"a751ff6eace3df194961801a853c135a322a412b84d4afeaef4f35275e89f9f7"} Apr 24 16:43:38.503628 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:38.503579 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kmh29" podStartSLOduration=253.466204681 podStartE2EDuration="4m14.503565635s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="2026-04-24 16:43:36.755923126 +0000 UTC m=+252.688992309" lastFinishedPulling="2026-04-24 16:43:37.793284073 +0000 UTC m=+253.726353263" observedRunningTime="2026-04-24 16:43:38.502263867 +0000 UTC m=+254.435333093" watchObservedRunningTime="2026-04-24 16:43:38.503565635 +0000 UTC m=+254.436634834" Apr 24 16:43:41.522395 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.522364 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-87cf6d8c4-65svq"] Apr 24 16:43:41.522769 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.522671 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca64442b-d234-464e-8df4-4fcd6ff1aca7" containerName="console" Apr 24 16:43:41.522769 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.522682 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca64442b-d234-464e-8df4-4fcd6ff1aca7" containerName="console" Apr 24 16:43:41.522769 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.522754 2579 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="ca64442b-d234-464e-8df4-4fcd6ff1aca7" containerName="console" Apr 24 16:43:41.525764 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.525721 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.528140 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.528112 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 16:43:41.528140 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.528136 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 16:43:41.528471 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.528456 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 16:43:41.529250 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.529232 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-ctfdb\"" Apr 24 16:43:41.529973 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.529956 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 16:43:41.530425 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.530409 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 16:43:41.539607 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.539579 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 16:43:41.545122 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.545100 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-87cf6d8c4-65svq"] Apr 24 16:43:41.572774 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.572742 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-oauth-config\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.572774 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.572774 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-service-ca\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.572986 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.572831 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-serving-cert\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.572986 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.572877 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-config\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.572986 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.572925 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd2h6\" (UniqueName: \"kubernetes.io/projected/882de6e6-1a62-4896-b9f2-8589b8081bd3-kube-api-access-wd2h6\") pod \"console-87cf6d8c4-65svq\" (UID: 
\"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.573091 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.572997 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-trusted-ca-bundle\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.573091 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.573026 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-oauth-serving-cert\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.673995 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.673951 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-trusted-ca-bundle\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.673995 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.673995 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-oauth-serving-cert\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.674338 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.674052 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-oauth-config\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.674338 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.674079 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-service-ca\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.674338 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.674116 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-serving-cert\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.674338 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.674145 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-config\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.674338 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.674195 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wd2h6\" (UniqueName: \"kubernetes.io/projected/882de6e6-1a62-4896-b9f2-8589b8081bd3-kube-api-access-wd2h6\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.675010 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.674901 2579 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-service-ca\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.675010 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.674901 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-oauth-serving-cert\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.675010 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.674988 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-config\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.675010 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.674993 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-trusted-ca-bundle\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.676726 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.676698 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-oauth-config\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.677093 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.677071 2579 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-serving-cert\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.692995 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.692968 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd2h6\" (UniqueName: \"kubernetes.io/projected/882de6e6-1a62-4896-b9f2-8589b8081bd3-kube-api-access-wd2h6\") pod \"console-87cf6d8c4-65svq\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:41.840471 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:41.840370 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:42.006067 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:42.006040 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-87cf6d8c4-65svq"] Apr 24 16:43:42.008098 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:43:42.008066 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod882de6e6_1a62_4896_b9f2_8589b8081bd3.slice/crio-5e36a3602211116e90bd58dab52cdd383f5b7c95ae1d6fa4b48a8fe8ce589bab WatchSource:0}: Error finding container 5e36a3602211116e90bd58dab52cdd383f5b7c95ae1d6fa4b48a8fe8ce589bab: Status 404 returned error can't find the container with id 5e36a3602211116e90bd58dab52cdd383f5b7c95ae1d6fa4b48a8fe8ce589bab Apr 24 16:43:42.493177 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:42.493135 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-87cf6d8c4-65svq" event={"ID":"882de6e6-1a62-4896-b9f2-8589b8081bd3","Type":"ContainerStarted","Data":"0990a1fbfc07b1f5a372372d5ebd9f1524348b5f1f10f693d47b6b747c3c9df7"} Apr 24 16:43:42.493177 
ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:42.493178 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-87cf6d8c4-65svq" event={"ID":"882de6e6-1a62-4896-b9f2-8589b8081bd3","Type":"ContainerStarted","Data":"5e36a3602211116e90bd58dab52cdd383f5b7c95ae1d6fa4b48a8fe8ce589bab"} Apr 24 16:43:42.524232 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:42.524184 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-87cf6d8c4-65svq" podStartSLOduration=1.524169232 podStartE2EDuration="1.524169232s" podCreationTimestamp="2026-04-24 16:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:43:42.523796178 +0000 UTC m=+258.456865377" watchObservedRunningTime="2026-04-24 16:43:42.524169232 +0000 UTC m=+258.457238431" Apr 24 16:43:51.841384 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:51.841341 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:51.841384 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:51.841395 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:51.847125 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:51.847104 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:43:52.531644 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:43:52.531620 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:44:24.529992 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:44:24.529957 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log" Apr 24 
16:44:24.532264 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:44:24.532236 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log" Apr 24 16:44:24.541539 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:44:24.541517 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 16:45:23.053006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:23.052904 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-dpptg"] Apr 24 16:45:23.056340 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:23.056321 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dpptg" Apr 24 16:45:23.061233 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:23.061204 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 16:45:23.066725 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:23.066697 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dpptg"] Apr 24 16:45:23.201770 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:23.201698 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/756e5676-cd3e-4b45-abfe-777889ee4865-dbus\") pod \"global-pull-secret-syncer-dpptg\" (UID: \"756e5676-cd3e-4b45-abfe-777889ee4865\") " pod="kube-system/global-pull-secret-syncer-dpptg" Apr 24 16:45:23.201955 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:23.201832 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/756e5676-cd3e-4b45-abfe-777889ee4865-kubelet-config\") pod \"global-pull-secret-syncer-dpptg\" (UID: \"756e5676-cd3e-4b45-abfe-777889ee4865\") " 
pod="kube-system/global-pull-secret-syncer-dpptg" Apr 24 16:45:23.201955 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:23.201859 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/756e5676-cd3e-4b45-abfe-777889ee4865-original-pull-secret\") pod \"global-pull-secret-syncer-dpptg\" (UID: \"756e5676-cd3e-4b45-abfe-777889ee4865\") " pod="kube-system/global-pull-secret-syncer-dpptg" Apr 24 16:45:23.302988 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:23.302950 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/756e5676-cd3e-4b45-abfe-777889ee4865-kubelet-config\") pod \"global-pull-secret-syncer-dpptg\" (UID: \"756e5676-cd3e-4b45-abfe-777889ee4865\") " pod="kube-system/global-pull-secret-syncer-dpptg" Apr 24 16:45:23.302988 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:23.302991 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/756e5676-cd3e-4b45-abfe-777889ee4865-original-pull-secret\") pod \"global-pull-secret-syncer-dpptg\" (UID: \"756e5676-cd3e-4b45-abfe-777889ee4865\") " pod="kube-system/global-pull-secret-syncer-dpptg" Apr 24 16:45:23.303282 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:23.303070 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/756e5676-cd3e-4b45-abfe-777889ee4865-dbus\") pod \"global-pull-secret-syncer-dpptg\" (UID: \"756e5676-cd3e-4b45-abfe-777889ee4865\") " pod="kube-system/global-pull-secret-syncer-dpptg" Apr 24 16:45:23.303282 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:23.303124 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/756e5676-cd3e-4b45-abfe-777889ee4865-kubelet-config\") 
pod \"global-pull-secret-syncer-dpptg\" (UID: \"756e5676-cd3e-4b45-abfe-777889ee4865\") " pod="kube-system/global-pull-secret-syncer-dpptg" Apr 24 16:45:23.303379 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:23.303286 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/756e5676-cd3e-4b45-abfe-777889ee4865-dbus\") pod \"global-pull-secret-syncer-dpptg\" (UID: \"756e5676-cd3e-4b45-abfe-777889ee4865\") " pod="kube-system/global-pull-secret-syncer-dpptg" Apr 24 16:45:23.305562 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:23.305527 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/756e5676-cd3e-4b45-abfe-777889ee4865-original-pull-secret\") pod \"global-pull-secret-syncer-dpptg\" (UID: \"756e5676-cd3e-4b45-abfe-777889ee4865\") " pod="kube-system/global-pull-secret-syncer-dpptg" Apr 24 16:45:23.366653 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:23.366611 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dpptg" Apr 24 16:45:23.500496 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:23.500459 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dpptg"] Apr 24 16:45:23.503651 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:45:23.503618 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod756e5676_cd3e_4b45_abfe_777889ee4865.slice/crio-e662d742663d0d9dbac633510fcf81e4845895a372a315802b83ccf863eb0c29 WatchSource:0}: Error finding container e662d742663d0d9dbac633510fcf81e4845895a372a315802b83ccf863eb0c29: Status 404 returned error can't find the container with id e662d742663d0d9dbac633510fcf81e4845895a372a315802b83ccf863eb0c29 Apr 24 16:45:23.505381 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:23.505364 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:45:23.795492 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:23.795457 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dpptg" event={"ID":"756e5676-cd3e-4b45-abfe-777889ee4865","Type":"ContainerStarted","Data":"e662d742663d0d9dbac633510fcf81e4845895a372a315802b83ccf863eb0c29"} Apr 24 16:45:27.816385 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:27.816342 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dpptg" event={"ID":"756e5676-cd3e-4b45-abfe-777889ee4865","Type":"ContainerStarted","Data":"36c94aa734978b22a1fe86fd99efff727638dbf35d33fb1caf27a144f1d587a0"} Apr 24 16:45:27.832296 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:27.832236 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dpptg" podStartSLOduration=1.068001499 podStartE2EDuration="4.832217601s" podCreationTimestamp="2026-04-24 16:45:23 
+0000 UTC" firstStartedPulling="2026-04-24 16:45:23.505917025 +0000 UTC m=+359.438986202" lastFinishedPulling="2026-04-24 16:45:27.270133125 +0000 UTC m=+363.203202304" observedRunningTime="2026-04-24 16:45:27.830527675 +0000 UTC m=+363.763596879" watchObservedRunningTime="2026-04-24 16:45:27.832217601 +0000 UTC m=+363.765286799" Apr 24 16:45:31.050962 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:31.050925 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-87cf6d8c4-65svq"] Apr 24 16:45:42.777136 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:42.777093 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh"] Apr 24 16:45:42.780662 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:42.780641 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" Apr 24 16:45:42.783638 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:42.783605 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-4gnn8\"" Apr 24 16:45:42.783833 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:42.783813 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 16:45:42.783932 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:42.783916 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 16:45:42.793105 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:42.793072 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh"] Apr 24 16:45:42.860163 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:42.860102 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4nn7\" (UniqueName: \"kubernetes.io/projected/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-kube-api-access-k4nn7\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh\" (UID: \"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" Apr 24 16:45:42.860390 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:42.860195 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh\" (UID: \"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" Apr 24 16:45:42.860390 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:42.860224 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh\" (UID: \"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" Apr 24 16:45:42.960592 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:42.960550 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4nn7\" (UniqueName: \"kubernetes.io/projected/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-kube-api-access-k4nn7\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh\" (UID: \"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" Apr 24 16:45:42.960801 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:42.960607 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh\" (UID: \"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" Apr 24 16:45:42.960801 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:42.960631 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh\" (UID: \"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" Apr 24 16:45:42.961021 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:42.961004 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh\" (UID: \"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" Apr 24 16:45:42.961058 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:42.961043 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh\" (UID: \"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" Apr 24 16:45:42.970889 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:42.970858 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4nn7\" (UniqueName: 
\"kubernetes.io/projected/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-kube-api-access-k4nn7\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh\" (UID: \"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" Apr 24 16:45:43.090151 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:43.090040 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" Apr 24 16:45:43.223533 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:43.223481 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh"] Apr 24 16:45:43.227693 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:45:43.227661 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f5eac17_d7fb_4d0a_a0e8_58fee1a9788e.slice/crio-a55296b931719e3bd71b3665a63e805ee507f29033f70f52d893a9f3aa67939f WatchSource:0}: Error finding container a55296b931719e3bd71b3665a63e805ee507f29033f70f52d893a9f3aa67939f: Status 404 returned error can't find the container with id a55296b931719e3bd71b3665a63e805ee507f29033f70f52d893a9f3aa67939f Apr 24 16:45:43.866800 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:43.866758 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" event={"ID":"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e","Type":"ContainerStarted","Data":"a55296b931719e3bd71b3665a63e805ee507f29033f70f52d893a9f3aa67939f"} Apr 24 16:45:49.886255 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:49.886217 2579 generic.go:358] "Generic (PLEG): container finished" podID="2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e" containerID="f4c8f013ff3435492fdafa932e279275dcfa032a987b3fcaadec9bff7b0e393f" exitCode=0 Apr 
24 16:45:49.886702 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:49.886259 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" event={"ID":"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e","Type":"ContainerDied","Data":"f4c8f013ff3435492fdafa932e279275dcfa032a987b3fcaadec9bff7b0e393f"} Apr 24 16:45:52.897103 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:52.897062 2579 generic.go:358] "Generic (PLEG): container finished" podID="2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e" containerID="9b5252c9a91171a0fdcfffd9392d50873c3c0ca1a86297e4fb57b71e5ca7e44f" exitCode=0 Apr 24 16:45:52.897485 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:52.897142 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" event={"ID":"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e","Type":"ContainerDied","Data":"9b5252c9a91171a0fdcfffd9392d50873c3c0ca1a86297e4fb57b71e5ca7e44f"} Apr 24 16:45:56.070404 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.070343 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-87cf6d8c4-65svq" podUID="882de6e6-1a62-4896-b9f2-8589b8081bd3" containerName="console" containerID="cri-o://0990a1fbfc07b1f5a372372d5ebd9f1524348b5f1f10f693d47b6b747c3c9df7" gracePeriod=15 Apr 24 16:45:56.314217 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.314192 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-87cf6d8c4-65svq_882de6e6-1a62-4896-b9f2-8589b8081bd3/console/0.log" Apr 24 16:45:56.314353 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.314268 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:45:56.364219 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.364129 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-config\") pod \"882de6e6-1a62-4896-b9f2-8589b8081bd3\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " Apr 24 16:45:56.364219 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.364200 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-oauth-serving-cert\") pod \"882de6e6-1a62-4896-b9f2-8589b8081bd3\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " Apr 24 16:45:56.364426 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.364234 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-trusted-ca-bundle\") pod \"882de6e6-1a62-4896-b9f2-8589b8081bd3\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " Apr 24 16:45:56.364426 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.364282 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd2h6\" (UniqueName: \"kubernetes.io/projected/882de6e6-1a62-4896-b9f2-8589b8081bd3-kube-api-access-wd2h6\") pod \"882de6e6-1a62-4896-b9f2-8589b8081bd3\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " Apr 24 16:45:56.364426 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.364327 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-oauth-config\") pod \"882de6e6-1a62-4896-b9f2-8589b8081bd3\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " Apr 24 16:45:56.364426 
ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.364346 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-service-ca\") pod \"882de6e6-1a62-4896-b9f2-8589b8081bd3\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " Apr 24 16:45:56.364426 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.364367 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-serving-cert\") pod \"882de6e6-1a62-4896-b9f2-8589b8081bd3\" (UID: \"882de6e6-1a62-4896-b9f2-8589b8081bd3\") " Apr 24 16:45:56.364637 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.364553 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-config" (OuterVolumeSpecName: "console-config") pod "882de6e6-1a62-4896-b9f2-8589b8081bd3" (UID: "882de6e6-1a62-4896-b9f2-8589b8081bd3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:45:56.365005 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.364860 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "882de6e6-1a62-4896-b9f2-8589b8081bd3" (UID: "882de6e6-1a62-4896-b9f2-8589b8081bd3"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:45:56.365005 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.364988 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "882de6e6-1a62-4896-b9f2-8589b8081bd3" (UID: "882de6e6-1a62-4896-b9f2-8589b8081bd3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:45:56.365174 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.365027 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-service-ca" (OuterVolumeSpecName: "service-ca") pod "882de6e6-1a62-4896-b9f2-8589b8081bd3" (UID: "882de6e6-1a62-4896-b9f2-8589b8081bd3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:45:56.367098 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.367072 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "882de6e6-1a62-4896-b9f2-8589b8081bd3" (UID: "882de6e6-1a62-4896-b9f2-8589b8081bd3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:45:56.367218 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.367196 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "882de6e6-1a62-4896-b9f2-8589b8081bd3" (UID: "882de6e6-1a62-4896-b9f2-8589b8081bd3"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:45:56.367368 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.367338 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/882de6e6-1a62-4896-b9f2-8589b8081bd3-kube-api-access-wd2h6" (OuterVolumeSpecName: "kube-api-access-wd2h6") pod "882de6e6-1a62-4896-b9f2-8589b8081bd3" (UID: "882de6e6-1a62-4896-b9f2-8589b8081bd3"). InnerVolumeSpecName "kube-api-access-wd2h6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:45:56.465109 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.465067 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-trusted-ca-bundle\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:45:56.465109 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.465108 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wd2h6\" (UniqueName: \"kubernetes.io/projected/882de6e6-1a62-4896-b9f2-8589b8081bd3-kube-api-access-wd2h6\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:45:56.465346 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.465124 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-oauth-config\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:45:56.465346 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.465139 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-service-ca\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:45:56.465346 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.465153 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-serving-cert\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:45:56.465346 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.465168 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-console-config\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:45:56.465346 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.465181 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/882de6e6-1a62-4896-b9f2-8589b8081bd3-oauth-serving-cert\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:45:56.912554 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.912485 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-87cf6d8c4-65svq_882de6e6-1a62-4896-b9f2-8589b8081bd3/console/0.log" Apr 24 16:45:56.912554 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.912538 2579 generic.go:358] "Generic (PLEG): container finished" podID="882de6e6-1a62-4896-b9f2-8589b8081bd3" containerID="0990a1fbfc07b1f5a372372d5ebd9f1524348b5f1f10f693d47b6b747c3c9df7" exitCode=2 Apr 24 16:45:56.912809 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.912624 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-87cf6d8c4-65svq" Apr 24 16:45:56.912809 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.912633 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-87cf6d8c4-65svq" event={"ID":"882de6e6-1a62-4896-b9f2-8589b8081bd3","Type":"ContainerDied","Data":"0990a1fbfc07b1f5a372372d5ebd9f1524348b5f1f10f693d47b6b747c3c9df7"} Apr 24 16:45:56.912809 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.912762 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-87cf6d8c4-65svq" event={"ID":"882de6e6-1a62-4896-b9f2-8589b8081bd3","Type":"ContainerDied","Data":"5e36a3602211116e90bd58dab52cdd383f5b7c95ae1d6fa4b48a8fe8ce589bab"} Apr 24 16:45:56.912809 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.912780 2579 scope.go:117] "RemoveContainer" containerID="0990a1fbfc07b1f5a372372d5ebd9f1524348b5f1f10f693d47b6b747c3c9df7" Apr 24 16:45:56.923000 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.922980 2579 scope.go:117] "RemoveContainer" containerID="0990a1fbfc07b1f5a372372d5ebd9f1524348b5f1f10f693d47b6b747c3c9df7" Apr 24 16:45:56.923287 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:45:56.923258 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0990a1fbfc07b1f5a372372d5ebd9f1524348b5f1f10f693d47b6b747c3c9df7\": container with ID starting with 0990a1fbfc07b1f5a372372d5ebd9f1524348b5f1f10f693d47b6b747c3c9df7 not found: ID does not exist" containerID="0990a1fbfc07b1f5a372372d5ebd9f1524348b5f1f10f693d47b6b747c3c9df7" Apr 24 16:45:56.923365 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.923299 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0990a1fbfc07b1f5a372372d5ebd9f1524348b5f1f10f693d47b6b747c3c9df7"} err="failed to get container status \"0990a1fbfc07b1f5a372372d5ebd9f1524348b5f1f10f693d47b6b747c3c9df7\": rpc error: code = 
NotFound desc = could not find container \"0990a1fbfc07b1f5a372372d5ebd9f1524348b5f1f10f693d47b6b747c3c9df7\": container with ID starting with 0990a1fbfc07b1f5a372372d5ebd9f1524348b5f1f10f693d47b6b747c3c9df7 not found: ID does not exist" Apr 24 16:45:56.938175 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.938146 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-87cf6d8c4-65svq"] Apr 24 16:45:56.942006 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:56.941979 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-87cf6d8c4-65svq"] Apr 24 16:45:58.620234 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:58.620193 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="882de6e6-1a62-4896-b9f2-8589b8081bd3" path="/var/lib/kubelet/pods/882de6e6-1a62-4896-b9f2-8589b8081bd3/volumes" Apr 24 16:45:59.926712 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:59.926671 2579 generic.go:358] "Generic (PLEG): container finished" podID="2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e" containerID="6584c3dcf38e4ebfa5e733433f2dd5aba78fa0084f4092bd97fecfcfd71c2981" exitCode=0 Apr 24 16:45:59.927134 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:45:59.926759 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" event={"ID":"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e","Type":"ContainerDied","Data":"6584c3dcf38e4ebfa5e733433f2dd5aba78fa0084f4092bd97fecfcfd71c2981"} Apr 24 16:46:01.050440 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:01.050416 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" Apr 24 16:46:01.109124 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:01.109086 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-bundle\") pod \"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e\" (UID: \"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e\") " Apr 24 16:46:01.109124 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:01.109132 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-util\") pod \"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e\" (UID: \"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e\") " Apr 24 16:46:01.109328 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:01.109191 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4nn7\" (UniqueName: \"kubernetes.io/projected/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-kube-api-access-k4nn7\") pod \"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e\" (UID: \"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e\") " Apr 24 16:46:01.109665 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:01.109642 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-bundle" (OuterVolumeSpecName: "bundle") pod "2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e" (UID: "2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:46:01.111542 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:01.111513 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-kube-api-access-k4nn7" (OuterVolumeSpecName: "kube-api-access-k4nn7") pod "2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e" (UID: "2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e"). InnerVolumeSpecName "kube-api-access-k4nn7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:46:01.113517 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:01.113498 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-util" (OuterVolumeSpecName: "util") pod "2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e" (UID: "2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:46:01.210617 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:01.210521 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k4nn7\" (UniqueName: \"kubernetes.io/projected/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-kube-api-access-k4nn7\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:46:01.210617 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:01.210566 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-bundle\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:46:01.210617 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:01.210580 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e-util\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:46:01.933928 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:01.933896 2579 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" Apr 24 16:46:01.933928 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:01.933909 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfmjhh" event={"ID":"2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e","Type":"ContainerDied","Data":"a55296b931719e3bd71b3665a63e805ee507f29033f70f52d893a9f3aa67939f"} Apr 24 16:46:01.934136 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:01.933947 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a55296b931719e3bd71b3665a63e805ee507f29033f70f52d893a9f3aa67939f" Apr 24 16:46:06.141193 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.141160 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lrlql"] Apr 24 16:46:06.141606 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.141520 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e" containerName="extract" Apr 24 16:46:06.141606 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.141533 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e" containerName="extract" Apr 24 16:46:06.141606 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.141548 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e" containerName="pull" Apr 24 16:46:06.141606 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.141553 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e" containerName="pull" Apr 24 16:46:06.141606 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.141564 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="882de6e6-1a62-4896-b9f2-8589b8081bd3" containerName="console" Apr 24 16:46:06.141606 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.141570 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="882de6e6-1a62-4896-b9f2-8589b8081bd3" containerName="console" Apr 24 16:46:06.141606 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.141576 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e" containerName="util" Apr 24 16:46:06.141606 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.141581 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e" containerName="util" Apr 24 16:46:06.141881 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.141637 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="882de6e6-1a62-4896-b9f2-8589b8081bd3" containerName="console" Apr 24 16:46:06.141881 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.141650 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f5eac17-d7fb-4d0a-a0e8-58fee1a9788e" containerName="extract" Apr 24 16:46:06.144505 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.144486 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lrlql" Apr 24 16:46:06.149655 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.149621 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 16:46:06.149804 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.149714 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 16:46:06.149804 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.149772 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 16:46:06.149804 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.149794 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-l877m\"" Apr 24 16:46:06.171302 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.171262 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lrlql"] Apr 24 16:46:06.256170 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.256126 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/bf5705a2-d260-4c2d-be6d-18bf1d6251ca-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lrlql\" (UID: \"bf5705a2-d260-4c2d-be6d-18bf1d6251ca\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lrlql" Apr 24 16:46:06.256354 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.256209 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7scsm\" (UniqueName: \"kubernetes.io/projected/bf5705a2-d260-4c2d-be6d-18bf1d6251ca-kube-api-access-7scsm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lrlql\" (UID: 
\"bf5705a2-d260-4c2d-be6d-18bf1d6251ca\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lrlql" Apr 24 16:46:06.357264 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.357222 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/bf5705a2-d260-4c2d-be6d-18bf1d6251ca-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lrlql\" (UID: \"bf5705a2-d260-4c2d-be6d-18bf1d6251ca\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lrlql" Apr 24 16:46:06.357414 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.357293 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7scsm\" (UniqueName: \"kubernetes.io/projected/bf5705a2-d260-4c2d-be6d-18bf1d6251ca-kube-api-access-7scsm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lrlql\" (UID: \"bf5705a2-d260-4c2d-be6d-18bf1d6251ca\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lrlql" Apr 24 16:46:06.359935 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.359897 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/bf5705a2-d260-4c2d-be6d-18bf1d6251ca-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lrlql\" (UID: \"bf5705a2-d260-4c2d-be6d-18bf1d6251ca\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lrlql" Apr 24 16:46:06.369944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.369908 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7scsm\" (UniqueName: \"kubernetes.io/projected/bf5705a2-d260-4c2d-be6d-18bf1d6251ca-kube-api-access-7scsm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lrlql\" (UID: \"bf5705a2-d260-4c2d-be6d-18bf1d6251ca\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lrlql" Apr 24 16:46:06.454968 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:46:06.454877 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lrlql" Apr 24 16:46:06.611004 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.610970 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lrlql"] Apr 24 16:46:06.614699 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:46:06.614666 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf5705a2_d260_4c2d_be6d_18bf1d6251ca.slice/crio-c586082e29e59ebdbb9c1befb7311be1287840f402df9cdc8affce28de9c087a WatchSource:0}: Error finding container c586082e29e59ebdbb9c1befb7311be1287840f402df9cdc8affce28de9c087a: Status 404 returned error can't find the container with id c586082e29e59ebdbb9c1befb7311be1287840f402df9cdc8affce28de9c087a Apr 24 16:46:06.952642 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:06.952604 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lrlql" event={"ID":"bf5705a2-d260-4c2d-be6d-18bf1d6251ca","Type":"ContainerStarted","Data":"c586082e29e59ebdbb9c1befb7311be1287840f402df9cdc8affce28de9c087a"} Apr 24 16:46:10.968239 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:10.968196 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lrlql" event={"ID":"bf5705a2-d260-4c2d-be6d-18bf1d6251ca","Type":"ContainerStarted","Data":"a6aa36cba7a2661e199c01a8079f4ef6094c71d7fe5289af04b0789bbc32d6e7"} Apr 24 16:46:10.968712 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:10.968309 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lrlql" Apr 24 16:46:11.001467 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.001390 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lrlql" podStartSLOduration=0.765343952 podStartE2EDuration="5.001368364s" podCreationTimestamp="2026-04-24 16:46:06 +0000 UTC" firstStartedPulling="2026-04-24 16:46:06.616522744 +0000 UTC m=+402.549591926" lastFinishedPulling="2026-04-24 16:46:10.852547151 +0000 UTC m=+406.785616338" observedRunningTime="2026-04-24 16:46:10.997590452 +0000 UTC m=+406.930659656" watchObservedRunningTime="2026-04-24 16:46:11.001368364 +0000 UTC m=+406.934437564" Apr 24 16:46:11.462660 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.462620 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jp5xr"] Apr 24 16:46:11.466350 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.466326 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" Apr 24 16:46:11.469572 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.469541 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-6jw28\"" Apr 24 16:46:11.469772 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.469602 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 16:46:11.469878 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.469864 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 24 16:46:11.483675 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.483641 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jp5xr"] Apr 24 16:46:11.605045 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.605004 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: 
\"kubernetes.io/configmap/e372ad1a-e6db-4e83-be58-e8b10f0f167f-cabundle0\") pod \"keda-operator-ffbb595cb-jp5xr\" (UID: \"e372ad1a-e6db-4e83-be58-e8b10f0f167f\") " pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" Apr 24 16:46:11.605045 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.605050 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shgtj\" (UniqueName: \"kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-kube-api-access-shgtj\") pod \"keda-operator-ffbb595cb-jp5xr\" (UID: \"e372ad1a-e6db-4e83-be58-e8b10f0f167f\") " pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" Apr 24 16:46:11.605297 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.605072 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-certificates\") pod \"keda-operator-ffbb595cb-jp5xr\" (UID: \"e372ad1a-e6db-4e83-be58-e8b10f0f167f\") " pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" Apr 24 16:46:11.706170 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.706124 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/e372ad1a-e6db-4e83-be58-e8b10f0f167f-cabundle0\") pod \"keda-operator-ffbb595cb-jp5xr\" (UID: \"e372ad1a-e6db-4e83-be58-e8b10f0f167f\") " pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" Apr 24 16:46:11.706170 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.706173 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shgtj\" (UniqueName: \"kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-kube-api-access-shgtj\") pod \"keda-operator-ffbb595cb-jp5xr\" (UID: \"e372ad1a-e6db-4e83-be58-e8b10f0f167f\") " pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" Apr 24 16:46:11.706450 ip-10-0-129-227 kubenswrapper[2579]: I0424 
16:46:11.706205 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-certificates\") pod \"keda-operator-ffbb595cb-jp5xr\" (UID: \"e372ad1a-e6db-4e83-be58-e8b10f0f167f\") " pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" Apr 24 16:46:11.706450 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:11.706350 2579 secret.go:281] references non-existent secret key: ca.crt Apr 24 16:46:11.706450 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:11.706367 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 16:46:11.706450 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:11.706378 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jp5xr: references non-existent secret key: ca.crt Apr 24 16:46:11.706450 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:11.706449 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-certificates podName:e372ad1a-e6db-4e83-be58-e8b10f0f167f nodeName:}" failed. No retries permitted until 2026-04-24 16:46:12.206425635 +0000 UTC m=+408.139494833 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-certificates") pod "keda-operator-ffbb595cb-jp5xr" (UID: "e372ad1a-e6db-4e83-be58-e8b10f0f167f") : references non-existent secret key: ca.crt Apr 24 16:46:11.706856 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.706832 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/e372ad1a-e6db-4e83-be58-e8b10f0f167f-cabundle0\") pod \"keda-operator-ffbb595cb-jp5xr\" (UID: \"e372ad1a-e6db-4e83-be58-e8b10f0f167f\") " pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" Apr 24 16:46:11.722630 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.722554 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shgtj\" (UniqueName: \"kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-kube-api-access-shgtj\") pod \"keda-operator-ffbb595cb-jp5xr\" (UID: \"e372ad1a-e6db-4e83-be58-e8b10f0f167f\") " pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" Apr 24 16:46:11.930411 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.930365 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8"] Apr 24 16:46:11.934150 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.934120 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" Apr 24 16:46:11.936874 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.936846 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 24 16:46:11.944745 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:11.944694 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8"] Apr 24 16:46:12.008687 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:12.008644 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qzcw8\" (UID: \"8ac911e4-1e35-4da3-8341-89d4777b80bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" Apr 24 16:46:12.009115 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:12.008699 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/8ac911e4-1e35-4da3-8341-89d4777b80bf-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qzcw8\" (UID: \"8ac911e4-1e35-4da3-8341-89d4777b80bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" Apr 24 16:46:12.009115 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:12.008757 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9wlf\" (UniqueName: \"kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-kube-api-access-n9wlf\") pod \"keda-metrics-apiserver-7c9f485588-qzcw8\" (UID: \"8ac911e4-1e35-4da3-8341-89d4777b80bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" Apr 24 16:46:12.110024 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:12.109987 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qzcw8\" (UID: \"8ac911e4-1e35-4da3-8341-89d4777b80bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" Apr 24 16:46:12.110241 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:12.110045 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/8ac911e4-1e35-4da3-8341-89d4777b80bf-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qzcw8\" (UID: \"8ac911e4-1e35-4da3-8341-89d4777b80bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" Apr 24 16:46:12.110241 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:12.110085 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9wlf\" (UniqueName: \"kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-kube-api-access-n9wlf\") pod \"keda-metrics-apiserver-7c9f485588-qzcw8\" (UID: \"8ac911e4-1e35-4da3-8341-89d4777b80bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" Apr 24 16:46:12.110241 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:12.110165 2579 secret.go:281] references non-existent secret key: tls.crt Apr 24 16:46:12.110241 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:12.110186 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 16:46:12.110241 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:12.110205 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8: references non-existent secret key: tls.crt Apr 24 16:46:12.110495 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:12.110263 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-certificates podName:8ac911e4-1e35-4da3-8341-89d4777b80bf nodeName:}" failed. No retries permitted until 2026-04-24 16:46:12.610242548 +0000 UTC m=+408.543311746 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-certificates") pod "keda-metrics-apiserver-7c9f485588-qzcw8" (UID: "8ac911e4-1e35-4da3-8341-89d4777b80bf") : references non-existent secret key: tls.crt Apr 24 16:46:12.110495 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:12.110476 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/8ac911e4-1e35-4da3-8341-89d4777b80bf-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qzcw8\" (UID: \"8ac911e4-1e35-4da3-8341-89d4777b80bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" Apr 24 16:46:12.120696 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:12.120664 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9wlf\" (UniqueName: \"kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-kube-api-access-n9wlf\") pod \"keda-metrics-apiserver-7c9f485588-qzcw8\" (UID: \"8ac911e4-1e35-4da3-8341-89d4777b80bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" Apr 24 16:46:12.211682 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:12.211630 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-certificates\") pod \"keda-operator-ffbb595cb-jp5xr\" (UID: \"e372ad1a-e6db-4e83-be58-e8b10f0f167f\") " pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" Apr 24 16:46:12.211930 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:12.211815 2579 secret.go:281] references non-existent secret key: ca.crt Apr 24 16:46:12.211930 
ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:12.211837 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 16:46:12.211930 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:12.211847 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jp5xr: references non-existent secret key: ca.crt Apr 24 16:46:12.211930 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:12.211912 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-certificates podName:e372ad1a-e6db-4e83-be58-e8b10f0f167f nodeName:}" failed. No retries permitted until 2026-04-24 16:46:13.211896661 +0000 UTC m=+409.144965851 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-certificates") pod "keda-operator-ffbb595cb-jp5xr" (UID: "e372ad1a-e6db-4e83-be58-e8b10f0f167f") : references non-existent secret key: ca.crt Apr 24 16:46:12.616390 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:12.615855 2579 secret.go:281] references non-existent secret key: tls.crt Apr 24 16:46:12.616390 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:12.615886 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 16:46:12.616390 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:12.615909 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8: references non-existent secret key: tls.crt Apr 24 16:46:12.616390 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:12.615971 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-certificates podName:8ac911e4-1e35-4da3-8341-89d4777b80bf 
nodeName:}" failed. No retries permitted until 2026-04-24 16:46:13.615951732 +0000 UTC m=+409.549020931 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-certificates") pod "keda-metrics-apiserver-7c9f485588-qzcw8" (UID: "8ac911e4-1e35-4da3-8341-89d4777b80bf") : references non-existent secret key: tls.crt Apr 24 16:46:12.616851 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:12.615718 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qzcw8\" (UID: \"8ac911e4-1e35-4da3-8341-89d4777b80bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" Apr 24 16:46:13.223056 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:13.223016 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-certificates\") pod \"keda-operator-ffbb595cb-jp5xr\" (UID: \"e372ad1a-e6db-4e83-be58-e8b10f0f167f\") " pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" Apr 24 16:46:13.223503 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:13.223137 2579 secret.go:281] references non-existent secret key: ca.crt Apr 24 16:46:13.223503 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:13.223150 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 16:46:13.223503 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:13.223159 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jp5xr: references non-existent secret key: ca.crt Apr 24 16:46:13.223503 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:13.223205 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-certificates podName:e372ad1a-e6db-4e83-be58-e8b10f0f167f nodeName:}" failed. No retries permitted until 2026-04-24 16:46:15.223193251 +0000 UTC m=+411.156262427 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-certificates") pod "keda-operator-ffbb595cb-jp5xr" (UID: "e372ad1a-e6db-4e83-be58-e8b10f0f167f") : references non-existent secret key: ca.crt Apr 24 16:46:13.627871 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:13.627836 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qzcw8\" (UID: \"8ac911e4-1e35-4da3-8341-89d4777b80bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" Apr 24 16:46:13.628043 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:13.627990 2579 secret.go:281] references non-existent secret key: tls.crt Apr 24 16:46:13.628043 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:13.628008 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 16:46:13.628043 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:13.628028 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8: references non-existent secret key: tls.crt Apr 24 16:46:13.628146 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:13.628084 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-certificates podName:8ac911e4-1e35-4da3-8341-89d4777b80bf nodeName:}" failed. No retries permitted until 2026-04-24 16:46:15.628067316 +0000 UTC m=+411.561136493 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-certificates") pod "keda-metrics-apiserver-7c9f485588-qzcw8" (UID: "8ac911e4-1e35-4da3-8341-89d4777b80bf") : references non-existent secret key: tls.crt Apr 24 16:46:15.244495 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:15.244446 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-certificates\") pod \"keda-operator-ffbb595cb-jp5xr\" (UID: \"e372ad1a-e6db-4e83-be58-e8b10f0f167f\") " pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" Apr 24 16:46:15.244912 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:15.244620 2579 secret.go:281] references non-existent secret key: ca.crt Apr 24 16:46:15.244912 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:15.244637 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 16:46:15.244912 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:15.244646 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jp5xr: references non-existent secret key: ca.crt Apr 24 16:46:15.244912 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:15.244711 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-certificates podName:e372ad1a-e6db-4e83-be58-e8b10f0f167f nodeName:}" failed. No retries permitted until 2026-04-24 16:46:19.244695204 +0000 UTC m=+415.177764384 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-certificates") pod "keda-operator-ffbb595cb-jp5xr" (UID: "e372ad1a-e6db-4e83-be58-e8b10f0f167f") : references non-existent secret key: ca.crt Apr 24 16:46:15.648024 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:15.647979 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qzcw8\" (UID: \"8ac911e4-1e35-4da3-8341-89d4777b80bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" Apr 24 16:46:15.648220 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:15.648113 2579 secret.go:281] references non-existent secret key: tls.crt Apr 24 16:46:15.648220 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:15.648132 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 16:46:15.648220 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:15.648152 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8: references non-existent secret key: tls.crt Apr 24 16:46:15.648220 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:46:15.648211 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-certificates podName:8ac911e4-1e35-4da3-8341-89d4777b80bf nodeName:}" failed. No retries permitted until 2026-04-24 16:46:19.64819767 +0000 UTC m=+415.581266848 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-certificates") pod "keda-metrics-apiserver-7c9f485588-qzcw8" (UID: "8ac911e4-1e35-4da3-8341-89d4777b80bf") : references non-existent secret key: tls.crt Apr 24 16:46:19.283873 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:19.283824 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-certificates\") pod \"keda-operator-ffbb595cb-jp5xr\" (UID: \"e372ad1a-e6db-4e83-be58-e8b10f0f167f\") " pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" Apr 24 16:46:19.286654 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:19.286623 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e372ad1a-e6db-4e83-be58-e8b10f0f167f-certificates\") pod \"keda-operator-ffbb595cb-jp5xr\" (UID: \"e372ad1a-e6db-4e83-be58-e8b10f0f167f\") " pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" Apr 24 16:46:19.578316 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:19.578224 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" Apr 24 16:46:19.687194 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:19.687157 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qzcw8\" (UID: \"8ac911e4-1e35-4da3-8341-89d4777b80bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" Apr 24 16:46:19.690370 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:19.690342 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8ac911e4-1e35-4da3-8341-89d4777b80bf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qzcw8\" (UID: \"8ac911e4-1e35-4da3-8341-89d4777b80bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" Apr 24 16:46:19.711785 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:19.711575 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jp5xr"] Apr 24 16:46:19.714672 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:46:19.714632 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode372ad1a_e6db_4e83_be58_e8b10f0f167f.slice/crio-a9af766b5e61e66b113aa5c05cf4ba8d2115f88e419e8f80e25a505a407af4d0 WatchSource:0}: Error finding container a9af766b5e61e66b113aa5c05cf4ba8d2115f88e419e8f80e25a505a407af4d0: Status 404 returned error can't find the container with id a9af766b5e61e66b113aa5c05cf4ba8d2115f88e419e8f80e25a505a407af4d0 Apr 24 16:46:19.747406 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:19.747363 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" Apr 24 16:46:19.879859 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:19.879833 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8"] Apr 24 16:46:19.882198 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:46:19.882168 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ac911e4_1e35_4da3_8341_89d4777b80bf.slice/crio-6947a2045d1593a7abbdaffdedf7a8cd8329fe744063f4e70bf62595a715acdf WatchSource:0}: Error finding container 6947a2045d1593a7abbdaffdedf7a8cd8329fe744063f4e70bf62595a715acdf: Status 404 returned error can't find the container with id 6947a2045d1593a7abbdaffdedf7a8cd8329fe744063f4e70bf62595a715acdf Apr 24 16:46:20.001655 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:20.001614 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" event={"ID":"e372ad1a-e6db-4e83-be58-e8b10f0f167f","Type":"ContainerStarted","Data":"a9af766b5e61e66b113aa5c05cf4ba8d2115f88e419e8f80e25a505a407af4d0"} Apr 24 16:46:20.002875 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:20.002843 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" event={"ID":"8ac911e4-1e35-4da3-8341-89d4777b80bf","Type":"ContainerStarted","Data":"6947a2045d1593a7abbdaffdedf7a8cd8329fe744063f4e70bf62595a715acdf"} Apr 24 16:46:25.029943 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:25.029901 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" event={"ID":"8ac911e4-1e35-4da3-8341-89d4777b80bf","Type":"ContainerStarted","Data":"6bf86bda8750c173e6b68e39239f6b7915c9e672ca593b5230595568cfaf5909"} Apr 24 16:46:25.030440 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:25.030061 2579 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" Apr 24 16:46:25.031348 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:25.031318 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" event={"ID":"e372ad1a-e6db-4e83-be58-e8b10f0f167f","Type":"ContainerStarted","Data":"5e0f6e80f13452cf96d756e8cd2f799d0aa251b429e189f47ee14d5554f529f5"} Apr 24 16:46:25.031504 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:25.031487 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" Apr 24 16:46:25.049943 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:25.049875 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" podStartSLOduration=9.69519418 podStartE2EDuration="14.049853898s" podCreationTimestamp="2026-04-24 16:46:11 +0000 UTC" firstStartedPulling="2026-04-24 16:46:19.883479717 +0000 UTC m=+415.816548897" lastFinishedPulling="2026-04-24 16:46:24.238139433 +0000 UTC m=+420.171208615" observedRunningTime="2026-04-24 16:46:25.048517669 +0000 UTC m=+420.981586867" watchObservedRunningTime="2026-04-24 16:46:25.049853898 +0000 UTC m=+420.982923098" Apr 24 16:46:25.080576 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:25.080514 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" podStartSLOduration=9.54967391 podStartE2EDuration="14.080497027s" podCreationTimestamp="2026-04-24 16:46:11 +0000 UTC" firstStartedPulling="2026-04-24 16:46:19.716047195 +0000 UTC m=+415.649116372" lastFinishedPulling="2026-04-24 16:46:24.246870308 +0000 UTC m=+420.179939489" observedRunningTime="2026-04-24 16:46:25.077867293 +0000 UTC m=+421.010936489" watchObservedRunningTime="2026-04-24 16:46:25.080497027 +0000 UTC m=+421.013566227" Apr 24 16:46:31.975965 
ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:31.975884 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lrlql" Apr 24 16:46:36.039901 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:36.039870 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qzcw8" Apr 24 16:46:46.037277 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:46:46.037238 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-jp5xr" Apr 24 16:47:04.601070 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:04.601027 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb"] Apr 24 16:47:04.604538 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:04.604509 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" Apr 24 16:47:04.607654 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:04.607628 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 16:47:04.607838 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:04.607825 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-4gnn8\"" Apr 24 16:47:04.608372 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:04.608351 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 16:47:04.629167 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:04.629132 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb"] Apr 24 
16:47:04.679127 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:04.679090 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39544bbc-e97a-46a8-a84f-ddd9493abf0f-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb\" (UID: \"39544bbc-e97a-46a8-a84f-ddd9493abf0f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" Apr 24 16:47:04.679127 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:04.679127 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39544bbc-e97a-46a8-a84f-ddd9493abf0f-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb\" (UID: \"39544bbc-e97a-46a8-a84f-ddd9493abf0f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" Apr 24 16:47:04.679397 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:04.679212 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrfdp\" (UniqueName: \"kubernetes.io/projected/39544bbc-e97a-46a8-a84f-ddd9493abf0f-kube-api-access-zrfdp\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb\" (UID: \"39544bbc-e97a-46a8-a84f-ddd9493abf0f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" Apr 24 16:47:04.780130 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:04.780089 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrfdp\" (UniqueName: \"kubernetes.io/projected/39544bbc-e97a-46a8-a84f-ddd9493abf0f-kube-api-access-zrfdp\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb\" (UID: \"39544bbc-e97a-46a8-a84f-ddd9493abf0f\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" Apr 24 16:47:04.780337 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:04.780181 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39544bbc-e97a-46a8-a84f-ddd9493abf0f-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb\" (UID: \"39544bbc-e97a-46a8-a84f-ddd9493abf0f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" Apr 24 16:47:04.780337 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:04.780214 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39544bbc-e97a-46a8-a84f-ddd9493abf0f-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb\" (UID: \"39544bbc-e97a-46a8-a84f-ddd9493abf0f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" Apr 24 16:47:04.780627 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:04.780607 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39544bbc-e97a-46a8-a84f-ddd9493abf0f-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb\" (UID: \"39544bbc-e97a-46a8-a84f-ddd9493abf0f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" Apr 24 16:47:04.780702 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:04.780644 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39544bbc-e97a-46a8-a84f-ddd9493abf0f-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb\" (UID: \"39544bbc-e97a-46a8-a84f-ddd9493abf0f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" Apr 24 16:47:04.793370 
ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:04.793332 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrfdp\" (UniqueName: \"kubernetes.io/projected/39544bbc-e97a-46a8-a84f-ddd9493abf0f-kube-api-access-zrfdp\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb\" (UID: \"39544bbc-e97a-46a8-a84f-ddd9493abf0f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" Apr 24 16:47:04.914921 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:04.914832 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" Apr 24 16:47:05.067412 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:05.067382 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb"] Apr 24 16:47:05.069996 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:47:05.069958 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39544bbc_e97a_46a8_a84f_ddd9493abf0f.slice/crio-0d2147e33dd01869bc33418f1ad7f4d363ee1aedb80fc6fecb6ecd795f6543c0 WatchSource:0}: Error finding container 0d2147e33dd01869bc33418f1ad7f4d363ee1aedb80fc6fecb6ecd795f6543c0: Status 404 returned error can't find the container with id 0d2147e33dd01869bc33418f1ad7f4d363ee1aedb80fc6fecb6ecd795f6543c0 Apr 24 16:47:05.174864 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:05.174833 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" event={"ID":"39544bbc-e97a-46a8-a84f-ddd9493abf0f","Type":"ContainerStarted","Data":"7cba2f4a8f557be5d25a515aea5c7845e0d9dbee7b124993349e481f869165ca"} Apr 24 16:47:05.175002 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:05.174869 2579 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" event={"ID":"39544bbc-e97a-46a8-a84f-ddd9493abf0f","Type":"ContainerStarted","Data":"0d2147e33dd01869bc33418f1ad7f4d363ee1aedb80fc6fecb6ecd795f6543c0"} Apr 24 16:47:06.180122 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:06.180020 2579 generic.go:358] "Generic (PLEG): container finished" podID="39544bbc-e97a-46a8-a84f-ddd9493abf0f" containerID="7cba2f4a8f557be5d25a515aea5c7845e0d9dbee7b124993349e481f869165ca" exitCode=0 Apr 24 16:47:06.180488 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:06.180116 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" event={"ID":"39544bbc-e97a-46a8-a84f-ddd9493abf0f","Type":"ContainerDied","Data":"7cba2f4a8f557be5d25a515aea5c7845e0d9dbee7b124993349e481f869165ca"} Apr 24 16:47:08.190049 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:08.190009 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" event={"ID":"39544bbc-e97a-46a8-a84f-ddd9493abf0f","Type":"ContainerStarted","Data":"24e7bd9ed0e6cb1cc6d28fc2931ee42cf5a779cef44bb239d1c3519a101400b3"} Apr 24 16:47:09.195967 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:09.195930 2579 generic.go:358] "Generic (PLEG): container finished" podID="39544bbc-e97a-46a8-a84f-ddd9493abf0f" containerID="24e7bd9ed0e6cb1cc6d28fc2931ee42cf5a779cef44bb239d1c3519a101400b3" exitCode=0 Apr 24 16:47:09.196364 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:09.195977 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" event={"ID":"39544bbc-e97a-46a8-a84f-ddd9493abf0f","Type":"ContainerDied","Data":"24e7bd9ed0e6cb1cc6d28fc2931ee42cf5a779cef44bb239d1c3519a101400b3"} Apr 24 16:47:10.202143 
ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:10.202101 2579 generic.go:358] "Generic (PLEG): container finished" podID="39544bbc-e97a-46a8-a84f-ddd9493abf0f" containerID="1b8f13a39f00940014506465f8f3ede1a364d6dfbf5b20a84ea990da6cf9a5b0" exitCode=0 Apr 24 16:47:10.202589 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:10.202189 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" event={"ID":"39544bbc-e97a-46a8-a84f-ddd9493abf0f","Type":"ContainerDied","Data":"1b8f13a39f00940014506465f8f3ede1a364d6dfbf5b20a84ea990da6cf9a5b0"} Apr 24 16:47:11.337798 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:11.337770 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" Apr 24 16:47:11.438567 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:11.438468 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrfdp\" (UniqueName: \"kubernetes.io/projected/39544bbc-e97a-46a8-a84f-ddd9493abf0f-kube-api-access-zrfdp\") pod \"39544bbc-e97a-46a8-a84f-ddd9493abf0f\" (UID: \"39544bbc-e97a-46a8-a84f-ddd9493abf0f\") " Apr 24 16:47:11.438771 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:11.438639 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39544bbc-e97a-46a8-a84f-ddd9493abf0f-bundle\") pod \"39544bbc-e97a-46a8-a84f-ddd9493abf0f\" (UID: \"39544bbc-e97a-46a8-a84f-ddd9493abf0f\") " Apr 24 16:47:11.438771 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:11.438663 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39544bbc-e97a-46a8-a84f-ddd9493abf0f-util\") pod \"39544bbc-e97a-46a8-a84f-ddd9493abf0f\" (UID: \"39544bbc-e97a-46a8-a84f-ddd9493abf0f\") " Apr 24 
16:47:11.439315 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:11.439287 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39544bbc-e97a-46a8-a84f-ddd9493abf0f-bundle" (OuterVolumeSpecName: "bundle") pod "39544bbc-e97a-46a8-a84f-ddd9493abf0f" (UID: "39544bbc-e97a-46a8-a84f-ddd9493abf0f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:47:11.440760 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:11.440718 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39544bbc-e97a-46a8-a84f-ddd9493abf0f-kube-api-access-zrfdp" (OuterVolumeSpecName: "kube-api-access-zrfdp") pod "39544bbc-e97a-46a8-a84f-ddd9493abf0f" (UID: "39544bbc-e97a-46a8-a84f-ddd9493abf0f"). InnerVolumeSpecName "kube-api-access-zrfdp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:47:11.445375 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:11.445334 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39544bbc-e97a-46a8-a84f-ddd9493abf0f-util" (OuterVolumeSpecName: "util") pod "39544bbc-e97a-46a8-a84f-ddd9493abf0f" (UID: "39544bbc-e97a-46a8-a84f-ddd9493abf0f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:47:11.539549 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:11.539511 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39544bbc-e97a-46a8-a84f-ddd9493abf0f-bundle\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:47:11.539549 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:11.539543 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39544bbc-e97a-46a8-a84f-ddd9493abf0f-util\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:47:11.539549 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:11.539554 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zrfdp\" (UniqueName: \"kubernetes.io/projected/39544bbc-e97a-46a8-a84f-ddd9493abf0f-kube-api-access-zrfdp\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:47:12.210701 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:12.210663 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" event={"ID":"39544bbc-e97a-46a8-a84f-ddd9493abf0f","Type":"ContainerDied","Data":"0d2147e33dd01869bc33418f1ad7f4d363ee1aedb80fc6fecb6ecd795f6543c0"} Apr 24 16:47:12.210701 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:12.210688 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9kwkb" Apr 24 16:47:12.210701 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:12.210701 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d2147e33dd01869bc33418f1ad7f4d363ee1aedb80fc6fecb6ecd795f6543c0" Apr 24 16:47:17.827874 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:17.827838 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-7wg6m"] Apr 24 16:47:17.828305 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:17.828195 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39544bbc-e97a-46a8-a84f-ddd9493abf0f" containerName="util" Apr 24 16:47:17.828305 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:17.828205 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="39544bbc-e97a-46a8-a84f-ddd9493abf0f" containerName="util" Apr 24 16:47:17.828305 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:17.828216 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39544bbc-e97a-46a8-a84f-ddd9493abf0f" containerName="pull" Apr 24 16:47:17.828305 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:17.828221 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="39544bbc-e97a-46a8-a84f-ddd9493abf0f" containerName="pull" Apr 24 16:47:17.828305 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:17.828227 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39544bbc-e97a-46a8-a84f-ddd9493abf0f" containerName="extract" Apr 24 16:47:17.828305 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:17.828233 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="39544bbc-e97a-46a8-a84f-ddd9493abf0f" containerName="extract" Apr 24 16:47:17.828305 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:17.828298 2579 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="39544bbc-e97a-46a8-a84f-ddd9493abf0f" containerName="extract" Apr 24 16:47:17.833448 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:17.833421 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-7wg6m" Apr 24 16:47:17.835688 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:17.835663 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:47:17.835872 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:17.835839 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 24 16:47:17.835952 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:17.835931 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-zgjwq\"" Apr 24 16:47:17.844844 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:17.844811 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-7wg6m"] Apr 24 16:47:17.997699 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:17.997648 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82kv9\" (UniqueName: \"kubernetes.io/projected/b95b5459-44b1-4d3f-a8e9-49f69d580ae5-kube-api-access-82kv9\") pod \"cert-manager-operator-controller-manager-54b9655956-7wg6m\" (UID: \"b95b5459-44b1-4d3f-a8e9-49f69d580ae5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-7wg6m" Apr 24 16:47:17.997931 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:17.997795 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/b95b5459-44b1-4d3f-a8e9-49f69d580ae5-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-7wg6m\" (UID: \"b95b5459-44b1-4d3f-a8e9-49f69d580ae5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-7wg6m" Apr 24 16:47:18.098603 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:18.098508 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b95b5459-44b1-4d3f-a8e9-49f69d580ae5-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-7wg6m\" (UID: \"b95b5459-44b1-4d3f-a8e9-49f69d580ae5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-7wg6m" Apr 24 16:47:18.098603 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:18.098564 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82kv9\" (UniqueName: \"kubernetes.io/projected/b95b5459-44b1-4d3f-a8e9-49f69d580ae5-kube-api-access-82kv9\") pod \"cert-manager-operator-controller-manager-54b9655956-7wg6m\" (UID: \"b95b5459-44b1-4d3f-a8e9-49f69d580ae5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-7wg6m" Apr 24 16:47:18.098925 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:18.098905 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b95b5459-44b1-4d3f-a8e9-49f69d580ae5-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-7wg6m\" (UID: \"b95b5459-44b1-4d3f-a8e9-49f69d580ae5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-7wg6m" Apr 24 16:47:18.110247 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:18.110217 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82kv9\" (UniqueName: \"kubernetes.io/projected/b95b5459-44b1-4d3f-a8e9-49f69d580ae5-kube-api-access-82kv9\") pod 
\"cert-manager-operator-controller-manager-54b9655956-7wg6m\" (UID: \"b95b5459-44b1-4d3f-a8e9-49f69d580ae5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-7wg6m" Apr 24 16:47:18.143243 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:18.143199 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-7wg6m" Apr 24 16:47:18.290868 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:18.290837 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-7wg6m"] Apr 24 16:47:18.294006 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:47:18.293974 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb95b5459_44b1_4d3f_a8e9_49f69d580ae5.slice/crio-77d6caabb8463fc995cbf44354ecc0519a9d24f62e6f0b8c93488ea110759759 WatchSource:0}: Error finding container 77d6caabb8463fc995cbf44354ecc0519a9d24f62e6f0b8c93488ea110759759: Status 404 returned error can't find the container with id 77d6caabb8463fc995cbf44354ecc0519a9d24f62e6f0b8c93488ea110759759 Apr 24 16:47:19.240156 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:19.240113 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-7wg6m" event={"ID":"b95b5459-44b1-4d3f-a8e9-49f69d580ae5","Type":"ContainerStarted","Data":"77d6caabb8463fc995cbf44354ecc0519a9d24f62e6f0b8c93488ea110759759"} Apr 24 16:47:20.246952 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:20.246836 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-7wg6m" event={"ID":"b95b5459-44b1-4d3f-a8e9-49f69d580ae5","Type":"ContainerStarted","Data":"f751df91c801b717c93077d532dcffd0580a171f9f320e727e4b65c68519bf5d"} Apr 24 16:47:20.308236 
ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:20.308157 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-7wg6m" podStartSLOduration=1.840198679 podStartE2EDuration="3.308141214s" podCreationTimestamp="2026-04-24 16:47:17 +0000 UTC" firstStartedPulling="2026-04-24 16:47:18.296688048 +0000 UTC m=+474.229757226" lastFinishedPulling="2026-04-24 16:47:19.76463057 +0000 UTC m=+475.697699761" observedRunningTime="2026-04-24 16:47:20.300698842 +0000 UTC m=+476.233768040" watchObservedRunningTime="2026-04-24 16:47:20.308141214 +0000 UTC m=+476.241210413" Apr 24 16:47:26.873802 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:26.873756 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs"] Apr 24 16:47:26.879516 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:26.879483 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" Apr 24 16:47:26.882885 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:26.882850 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 16:47:26.883698 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:26.883677 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-4gnn8\"" Apr 24 16:47:26.885998 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:26.885972 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 16:47:26.898295 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:26.898246 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs"] Apr 24 16:47:26.979518 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:26.979469 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5cql\" (UniqueName: \"kubernetes.io/projected/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-kube-api-access-v5cql\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs\" (UID: \"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" Apr 24 16:47:26.979719 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:26.979586 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs\" (UID: \"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" Apr 24 16:47:26.979719 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:26.979610 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs\" (UID: \"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" Apr 24 16:47:27.080980 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:27.080935 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs\" (UID: \"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" Apr 24 16:47:27.080980 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:27.080979 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs\" (UID: \"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" Apr 24 16:47:27.081215 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:27.081021 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5cql\" (UniqueName: \"kubernetes.io/projected/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-kube-api-access-v5cql\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs\" (UID: \"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" Apr 24 16:47:27.081347 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:27.081324 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs\" (UID: \"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" Apr 24 16:47:27.081406 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:27.081385 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs\" (UID: \"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" Apr 24 16:47:27.092690 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:27.092654 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5cql\" (UniqueName: \"kubernetes.io/projected/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-kube-api-access-v5cql\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs\" (UID: \"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" Apr 24 16:47:27.189642 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:27.189531 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" Apr 24 16:47:27.355048 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:27.355023 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs"] Apr 24 16:47:27.357567 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:47:27.357534 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cd5160e_b1e5_4b73_a966_53ba9e8dc34c.slice/crio-67a516282ed9970e1e5912de6a3ff29ed7ec5fb60d685219f96914ddafbe3743 WatchSource:0}: Error finding container 67a516282ed9970e1e5912de6a3ff29ed7ec5fb60d685219f96914ddafbe3743: Status 404 returned error can't find the container with id 67a516282ed9970e1e5912de6a3ff29ed7ec5fb60d685219f96914ddafbe3743 Apr 24 16:47:28.277340 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:28.277299 2579 generic.go:358] "Generic (PLEG): container finished" podID="2cd5160e-b1e5-4b73-a966-53ba9e8dc34c" containerID="367f83c1e51579464b3b25a1a1918dc2f0e8c356fd6d59c178c914051d681e03" exitCode=0 Apr 24 16:47:28.277776 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:28.277401 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" event={"ID":"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c","Type":"ContainerDied","Data":"367f83c1e51579464b3b25a1a1918dc2f0e8c356fd6d59c178c914051d681e03"} Apr 24 16:47:28.277776 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:28.277436 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" event={"ID":"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c","Type":"ContainerStarted","Data":"67a516282ed9970e1e5912de6a3ff29ed7ec5fb60d685219f96914ddafbe3743"} Apr 24 16:47:30.286935 ip-10-0-129-227 kubenswrapper[2579]: 
I0424 16:47:30.286892 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" event={"ID":"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c","Type":"ContainerStarted","Data":"bcbde61cf3f2aae8323fed2bdf37e3f701a39bddd78009f42e93785a4511f161"} Apr 24 16:47:31.292879 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:31.292839 2579 generic.go:358] "Generic (PLEG): container finished" podID="2cd5160e-b1e5-4b73-a966-53ba9e8dc34c" containerID="bcbde61cf3f2aae8323fed2bdf37e3f701a39bddd78009f42e93785a4511f161" exitCode=0 Apr 24 16:47:31.293272 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:31.292904 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" event={"ID":"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c","Type":"ContainerDied","Data":"bcbde61cf3f2aae8323fed2bdf37e3f701a39bddd78009f42e93785a4511f161"} Apr 24 16:47:32.298804 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:32.298767 2579 generic.go:358] "Generic (PLEG): container finished" podID="2cd5160e-b1e5-4b73-a966-53ba9e8dc34c" containerID="b4dadc4d8a72eabebba4bab38aa1837fc348b60bc0495280cd9fd2cb288ab492" exitCode=0 Apr 24 16:47:32.299262 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:32.298821 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" event={"ID":"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c","Type":"ContainerDied","Data":"b4dadc4d8a72eabebba4bab38aa1837fc348b60bc0495280cd9fd2cb288ab492"} Apr 24 16:47:33.435371 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:33.435342 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" Apr 24 16:47:33.534787 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:33.534719 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-bundle\") pod \"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c\" (UID: \"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c\") " Apr 24 16:47:33.534787 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:33.534793 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5cql\" (UniqueName: \"kubernetes.io/projected/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-kube-api-access-v5cql\") pod \"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c\" (UID: \"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c\") " Apr 24 16:47:33.535068 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:33.534812 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-util\") pod \"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c\" (UID: \"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c\") " Apr 24 16:47:33.535232 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:33.535207 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-bundle" (OuterVolumeSpecName: "bundle") pod "2cd5160e-b1e5-4b73-a966-53ba9e8dc34c" (UID: "2cd5160e-b1e5-4b73-a966-53ba9e8dc34c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:47:33.537212 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:33.537185 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-kube-api-access-v5cql" (OuterVolumeSpecName: "kube-api-access-v5cql") pod "2cd5160e-b1e5-4b73-a966-53ba9e8dc34c" (UID: "2cd5160e-b1e5-4b73-a966-53ba9e8dc34c"). InnerVolumeSpecName "kube-api-access-v5cql". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:47:33.540928 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:33.540885 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-util" (OuterVolumeSpecName: "util") pod "2cd5160e-b1e5-4b73-a966-53ba9e8dc34c" (UID: "2cd5160e-b1e5-4b73-a966-53ba9e8dc34c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:47:33.636518 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:33.636402 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-bundle\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:47:33.636518 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:33.636454 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v5cql\" (UniqueName: \"kubernetes.io/projected/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-kube-api-access-v5cql\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:47:33.636518 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:33.636473 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2cd5160e-b1e5-4b73-a966-53ba9e8dc34c-util\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:47:34.308521 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:34.308479 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" event={"ID":"2cd5160e-b1e5-4b73-a966-53ba9e8dc34c","Type":"ContainerDied","Data":"67a516282ed9970e1e5912de6a3ff29ed7ec5fb60d685219f96914ddafbe3743"} Apr 24 16:47:34.308521 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:34.308506 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fr94xs" Apr 24 16:47:34.308521 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:34.308517 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67a516282ed9970e1e5912de6a3ff29ed7ec5fb60d685219f96914ddafbe3743" Apr 24 16:47:39.626716 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.626676 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-ht659"] Apr 24 16:47:39.627193 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.627097 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2cd5160e-b1e5-4b73-a966-53ba9e8dc34c" containerName="pull" Apr 24 16:47:39.627193 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.627111 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd5160e-b1e5-4b73-a966-53ba9e8dc34c" containerName="pull" Apr 24 16:47:39.627193 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.627133 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2cd5160e-b1e5-4b73-a966-53ba9e8dc34c" containerName="util" Apr 24 16:47:39.627193 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.627138 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd5160e-b1e5-4b73-a966-53ba9e8dc34c" containerName="util" Apr 24 16:47:39.627193 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.627146 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="2cd5160e-b1e5-4b73-a966-53ba9e8dc34c" containerName="extract" Apr 24 16:47:39.627193 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.627151 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd5160e-b1e5-4b73-a966-53ba9e8dc34c" containerName="extract" Apr 24 16:47:39.627377 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.627212 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2cd5160e-b1e5-4b73-a966-53ba9e8dc34c" containerName="extract" Apr 24 16:47:39.632092 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.632071 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ht659" Apr 24 16:47:39.634610 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.634584 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-gnqnq\"" Apr 24 16:47:39.635212 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.635197 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:47:39.635212 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.635209 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 24 16:47:39.640937 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.640910 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-ht659"] Apr 24 16:47:39.689442 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.689410 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mdlx\" (UniqueName: \"kubernetes.io/projected/d95d03e0-6d52-4fe0-bc24-a0bc74394c12-kube-api-access-2mdlx\") pod \"openshift-lws-operator-bfc7f696d-ht659\" (UID: \"d95d03e0-6d52-4fe0-bc24-a0bc74394c12\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ht659" Apr 24 16:47:39.689646 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.689466 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d95d03e0-6d52-4fe0-bc24-a0bc74394c12-tmp\") pod \"openshift-lws-operator-bfc7f696d-ht659\" (UID: \"d95d03e0-6d52-4fe0-bc24-a0bc74394c12\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ht659" Apr 24 16:47:39.790275 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.790232 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mdlx\" (UniqueName: \"kubernetes.io/projected/d95d03e0-6d52-4fe0-bc24-a0bc74394c12-kube-api-access-2mdlx\") pod \"openshift-lws-operator-bfc7f696d-ht659\" (UID: \"d95d03e0-6d52-4fe0-bc24-a0bc74394c12\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ht659" Apr 24 16:47:39.790455 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.790289 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d95d03e0-6d52-4fe0-bc24-a0bc74394c12-tmp\") pod \"openshift-lws-operator-bfc7f696d-ht659\" (UID: \"d95d03e0-6d52-4fe0-bc24-a0bc74394c12\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ht659" Apr 24 16:47:39.790665 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.790647 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d95d03e0-6d52-4fe0-bc24-a0bc74394c12-tmp\") pod \"openshift-lws-operator-bfc7f696d-ht659\" (UID: \"d95d03e0-6d52-4fe0-bc24-a0bc74394c12\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ht659" Apr 24 16:47:39.802869 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.802836 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mdlx\" (UniqueName: 
\"kubernetes.io/projected/d95d03e0-6d52-4fe0-bc24-a0bc74394c12-kube-api-access-2mdlx\") pod \"openshift-lws-operator-bfc7f696d-ht659\" (UID: \"d95d03e0-6d52-4fe0-bc24-a0bc74394c12\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ht659" Apr 24 16:47:39.949665 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:39.949576 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ht659" Apr 24 16:47:40.122786 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:40.122758 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-ht659"] Apr 24 16:47:40.125787 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:47:40.125723 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd95d03e0_6d52_4fe0_bc24_a0bc74394c12.slice/crio-c67e2d03cc23c12f9e4a59b6fc36c016c32a6e5b51d5e920ec6e938dc1705c01 WatchSource:0}: Error finding container c67e2d03cc23c12f9e4a59b6fc36c016c32a6e5b51d5e920ec6e938dc1705c01: Status 404 returned error can't find the container with id c67e2d03cc23c12f9e4a59b6fc36c016c32a6e5b51d5e920ec6e938dc1705c01 Apr 24 16:47:40.333074 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:40.333037 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ht659" event={"ID":"d95d03e0-6d52-4fe0-bc24-a0bc74394c12","Type":"ContainerStarted","Data":"c67e2d03cc23c12f9e4a59b6fc36c016c32a6e5b51d5e920ec6e938dc1705c01"} Apr 24 16:47:42.344505 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:42.343969 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ht659" event={"ID":"d95d03e0-6d52-4fe0-bc24-a0bc74394c12","Type":"ContainerStarted","Data":"cdf65857b006f82089d80249eb19e7f7cc8150f12b82410119aab29d39fdd88f"} Apr 24 16:47:52.958971 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:47:52.958895 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ht659" podStartSLOduration=11.96996768 podStartE2EDuration="13.958877103s" podCreationTimestamp="2026-04-24 16:47:39 +0000 UTC" firstStartedPulling="2026-04-24 16:47:40.127399058 +0000 UTC m=+496.060468235" lastFinishedPulling="2026-04-24 16:47:42.116308481 +0000 UTC m=+498.049377658" observedRunningTime="2026-04-24 16:47:42.374423081 +0000 UTC m=+498.307492279" watchObservedRunningTime="2026-04-24 16:47:52.958877103 +0000 UTC m=+508.891946363" Apr 24 16:47:52.959775 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:52.959723 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps"] Apr 24 16:47:52.963597 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:52.963572 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps" Apr 24 16:47:52.965947 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:52.965922 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-4gnn8\"" Apr 24 16:47:52.966049 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:52.965922 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 16:47:52.966455 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:52.966439 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 16:47:52.970573 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:52.970534 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps"] Apr 24 
16:47:53.105371 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:53.105328 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51cf2c6b-8020-4121-bb67-4b85efd28a9b-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps\" (UID: \"51cf2c6b-8020-4121-bb67-4b85efd28a9b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps" Apr 24 16:47:53.105573 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:53.105386 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bcjz\" (UniqueName: \"kubernetes.io/projected/51cf2c6b-8020-4121-bb67-4b85efd28a9b-kube-api-access-7bcjz\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps\" (UID: \"51cf2c6b-8020-4121-bb67-4b85efd28a9b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps" Apr 24 16:47:53.105573 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:53.105435 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51cf2c6b-8020-4121-bb67-4b85efd28a9b-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps\" (UID: \"51cf2c6b-8020-4121-bb67-4b85efd28a9b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps" Apr 24 16:47:53.206222 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:53.206173 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51cf2c6b-8020-4121-bb67-4b85efd28a9b-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps\" (UID: \"51cf2c6b-8020-4121-bb67-4b85efd28a9b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps" Apr 24 16:47:53.206365 
ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:53.206231 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bcjz\" (UniqueName: \"kubernetes.io/projected/51cf2c6b-8020-4121-bb67-4b85efd28a9b-kube-api-access-7bcjz\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps\" (UID: \"51cf2c6b-8020-4121-bb67-4b85efd28a9b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps" Apr 24 16:47:53.206365 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:53.206258 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51cf2c6b-8020-4121-bb67-4b85efd28a9b-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps\" (UID: \"51cf2c6b-8020-4121-bb67-4b85efd28a9b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps" Apr 24 16:47:53.206602 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:53.206582 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51cf2c6b-8020-4121-bb67-4b85efd28a9b-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps\" (UID: \"51cf2c6b-8020-4121-bb67-4b85efd28a9b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps" Apr 24 16:47:53.206657 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:53.206638 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51cf2c6b-8020-4121-bb67-4b85efd28a9b-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps\" (UID: \"51cf2c6b-8020-4121-bb67-4b85efd28a9b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps" Apr 24 16:47:53.215651 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:53.215563 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7bcjz\" (UniqueName: \"kubernetes.io/projected/51cf2c6b-8020-4121-bb67-4b85efd28a9b-kube-api-access-7bcjz\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps\" (UID: \"51cf2c6b-8020-4121-bb67-4b85efd28a9b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps" Apr 24 16:47:53.274866 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:53.274826 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps" Apr 24 16:47:53.410060 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:53.410031 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps"] Apr 24 16:47:53.412357 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:47:53.412328 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51cf2c6b_8020_4121_bb67_4b85efd28a9b.slice/crio-f0f81c927457a110bb691ab6faf57f3392041bc22419650ffbdd0c4c86fc88d5 WatchSource:0}: Error finding container f0f81c927457a110bb691ab6faf57f3392041bc22419650ffbdd0c4c86fc88d5: Status 404 returned error can't find the container with id f0f81c927457a110bb691ab6faf57f3392041bc22419650ffbdd0c4c86fc88d5 Apr 24 16:47:54.388831 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:54.388794 2579 generic.go:358] "Generic (PLEG): container finished" podID="51cf2c6b-8020-4121-bb67-4b85efd28a9b" containerID="c9723b5870b137f9a83269a52d074f7adaf0eaf879e3768a0908da59f892bbd2" exitCode=0 Apr 24 16:47:54.389252 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:54.388836 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps" 
event={"ID":"51cf2c6b-8020-4121-bb67-4b85efd28a9b","Type":"ContainerDied","Data":"c9723b5870b137f9a83269a52d074f7adaf0eaf879e3768a0908da59f892bbd2"} Apr 24 16:47:54.389252 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:54.388875 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps" event={"ID":"51cf2c6b-8020-4121-bb67-4b85efd28a9b","Type":"ContainerStarted","Data":"f0f81c927457a110bb691ab6faf57f3392041bc22419650ffbdd0c4c86fc88d5"} Apr 24 16:47:55.394818 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:55.394780 2579 generic.go:358] "Generic (PLEG): container finished" podID="51cf2c6b-8020-4121-bb67-4b85efd28a9b" containerID="b18ce53d644eba2d3169d32a014131d287b2afe6962b77adad9d2614ba19d2f0" exitCode=0 Apr 24 16:47:55.395295 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:55.394826 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps" event={"ID":"51cf2c6b-8020-4121-bb67-4b85efd28a9b","Type":"ContainerDied","Data":"b18ce53d644eba2d3169d32a014131d287b2afe6962b77adad9d2614ba19d2f0"} Apr 24 16:47:56.400321 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:56.400286 2579 generic.go:358] "Generic (PLEG): container finished" podID="51cf2c6b-8020-4121-bb67-4b85efd28a9b" containerID="63b7d32c4fc6693d604102f66514f8062ac185278743a27c7ecdd89e18b83291" exitCode=0 Apr 24 16:47:56.400720 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:56.400382 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps" event={"ID":"51cf2c6b-8020-4121-bb67-4b85efd28a9b","Type":"ContainerDied","Data":"63b7d32c4fc6693d604102f66514f8062ac185278743a27c7ecdd89e18b83291"} Apr 24 16:47:57.537716 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:57.537688 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps" Apr 24 16:47:57.644968 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:57.644927 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51cf2c6b-8020-4121-bb67-4b85efd28a9b-bundle\") pod \"51cf2c6b-8020-4121-bb67-4b85efd28a9b\" (UID: \"51cf2c6b-8020-4121-bb67-4b85efd28a9b\") " Apr 24 16:47:57.645215 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:57.644997 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bcjz\" (UniqueName: \"kubernetes.io/projected/51cf2c6b-8020-4121-bb67-4b85efd28a9b-kube-api-access-7bcjz\") pod \"51cf2c6b-8020-4121-bb67-4b85efd28a9b\" (UID: \"51cf2c6b-8020-4121-bb67-4b85efd28a9b\") " Apr 24 16:47:57.645215 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:57.645013 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51cf2c6b-8020-4121-bb67-4b85efd28a9b-util\") pod \"51cf2c6b-8020-4121-bb67-4b85efd28a9b\" (UID: \"51cf2c6b-8020-4121-bb67-4b85efd28a9b\") " Apr 24 16:47:57.645949 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:57.645918 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51cf2c6b-8020-4121-bb67-4b85efd28a9b-bundle" (OuterVolumeSpecName: "bundle") pod "51cf2c6b-8020-4121-bb67-4b85efd28a9b" (UID: "51cf2c6b-8020-4121-bb67-4b85efd28a9b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 16:47:57.647515 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:57.647490 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51cf2c6b-8020-4121-bb67-4b85efd28a9b-kube-api-access-7bcjz" (OuterVolumeSpecName: "kube-api-access-7bcjz") pod "51cf2c6b-8020-4121-bb67-4b85efd28a9b" (UID: "51cf2c6b-8020-4121-bb67-4b85efd28a9b"). InnerVolumeSpecName "kube-api-access-7bcjz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:47:57.650706 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:57.650672 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51cf2c6b-8020-4121-bb67-4b85efd28a9b-util" (OuterVolumeSpecName: "util") pod "51cf2c6b-8020-4121-bb67-4b85efd28a9b" (UID: "51cf2c6b-8020-4121-bb67-4b85efd28a9b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 16:47:57.746725 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:57.746621 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51cf2c6b-8020-4121-bb67-4b85efd28a9b-bundle\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\""
Apr 24 16:47:57.746725 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:57.746670 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7bcjz\" (UniqueName: \"kubernetes.io/projected/51cf2c6b-8020-4121-bb67-4b85efd28a9b-kube-api-access-7bcjz\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\""
Apr 24 16:47:57.746725 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:57.746685 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51cf2c6b-8020-4121-bb67-4b85efd28a9b-util\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\""
Apr 24 16:47:58.410101 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:58.410066 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps"
Apr 24 16:47:58.410295 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:58.410066 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835dzxps" event={"ID":"51cf2c6b-8020-4121-bb67-4b85efd28a9b","Type":"ContainerDied","Data":"f0f81c927457a110bb691ab6faf57f3392041bc22419650ffbdd0c4c86fc88d5"}
Apr 24 16:47:58.410295 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:47:58.410181 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0f81c927457a110bb691ab6faf57f3392041bc22419650ffbdd0c4c86fc88d5"
Apr 24 16:48:07.812436 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:07.812345 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl"]
Apr 24 16:48:07.812829 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:07.812761 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51cf2c6b-8020-4121-bb67-4b85efd28a9b" containerName="pull"
Apr 24 16:48:07.812829 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:07.812773 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="51cf2c6b-8020-4121-bb67-4b85efd28a9b" containerName="pull"
Apr 24 16:48:07.812829 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:07.812792 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51cf2c6b-8020-4121-bb67-4b85efd28a9b" containerName="util"
Apr 24 16:48:07.812829 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:07.812797 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="51cf2c6b-8020-4121-bb67-4b85efd28a9b" containerName="util"
Apr 24 16:48:07.812829 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:07.812810 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51cf2c6b-8020-4121-bb67-4b85efd28a9b" containerName="extract"
Apr 24 16:48:07.812829 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:07.812816 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="51cf2c6b-8020-4121-bb67-4b85efd28a9b" containerName="extract"
Apr 24 16:48:07.813010 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:07.812874 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="51cf2c6b-8020-4121-bb67-4b85efd28a9b" containerName="extract"
Apr 24 16:48:07.816133 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:07.816109 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl"
Apr 24 16:48:07.819709 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:07.819682 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 16:48:07.819852 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:07.819807 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 16:48:07.821026 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:07.821005 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-4gnn8\""
Apr 24 16:48:07.843216 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:07.843179 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl"]
Apr 24 16:48:07.946269 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:07.946232 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5m8p\" (UniqueName: \"kubernetes.io/projected/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-kube-api-access-v5m8p\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl\" (UID: \"8bbaf67f-58a7-45b2-b041-ce3fd256eaed\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl"
Apr 24 16:48:07.946441 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:07.946288 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl\" (UID: \"8bbaf67f-58a7-45b2-b041-ce3fd256eaed\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl"
Apr 24 16:48:07.946441 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:07.946319 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl\" (UID: \"8bbaf67f-58a7-45b2-b041-ce3fd256eaed\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl"
Apr 24 16:48:08.048529 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:08.048475 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5m8p\" (UniqueName: \"kubernetes.io/projected/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-kube-api-access-v5m8p\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl\" (UID: \"8bbaf67f-58a7-45b2-b041-ce3fd256eaed\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl"
Apr 24 16:48:08.048529 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:08.048538 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl\" (UID: \"8bbaf67f-58a7-45b2-b041-ce3fd256eaed\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl"
Apr 24 16:48:08.048851 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:08.048568 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl\" (UID: \"8bbaf67f-58a7-45b2-b041-ce3fd256eaed\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl"
Apr 24 16:48:08.049040 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:08.049009 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl\" (UID: \"8bbaf67f-58a7-45b2-b041-ce3fd256eaed\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl"
Apr 24 16:48:08.049040 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:08.049026 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl\" (UID: \"8bbaf67f-58a7-45b2-b041-ce3fd256eaed\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl"
Apr 24 16:48:08.064909 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:08.064822 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5m8p\" (UniqueName: \"kubernetes.io/projected/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-kube-api-access-v5m8p\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl\" (UID: \"8bbaf67f-58a7-45b2-b041-ce3fd256eaed\") "
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl"
Apr 24 16:48:08.126667 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:08.126622 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl"
Apr 24 16:48:08.295091 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:08.294916 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl"]
Apr 24 16:48:08.297861 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:48:08.297803 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bbaf67f_58a7_45b2_b041_ce3fd256eaed.slice/crio-14e926918d051236d7ba6c52cdf47ef5d0f065ddd8fb418d8e0a4139d632e7a6 WatchSource:0}: Error finding container 14e926918d051236d7ba6c52cdf47ef5d0f065ddd8fb418d8e0a4139d632e7a6: Status 404 returned error can't find the container with id 14e926918d051236d7ba6c52cdf47ef5d0f065ddd8fb418d8e0a4139d632e7a6
Apr 24 16:48:08.449985 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:08.449951 2579 generic.go:358] "Generic (PLEG): container finished" podID="8bbaf67f-58a7-45b2-b041-ce3fd256eaed" containerID="a4081d785b7566c95247e0dc6d489cd70289efe4e0b17d118075c68fbb55fb0a" exitCode=0
Apr 24 16:48:08.450164 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:08.450041 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl" event={"ID":"8bbaf67f-58a7-45b2-b041-ce3fd256eaed","Type":"ContainerDied","Data":"a4081d785b7566c95247e0dc6d489cd70289efe4e0b17d118075c68fbb55fb0a"}
Apr 24 16:48:08.450164 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:08.450080 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl" event={"ID":"8bbaf67f-58a7-45b2-b041-ce3fd256eaed","Type":"ContainerStarted","Data":"14e926918d051236d7ba6c52cdf47ef5d0f065ddd8fb418d8e0a4139d632e7a6"}
Apr 24 16:48:09.382945 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:09.382850 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-pqx9m"]
Apr 24 16:48:09.386491 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:09.386468 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pqx9m"
Apr 24 16:48:09.390819 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:09.390784 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 24 16:48:09.390977 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:09.390834 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 24 16:48:09.391491 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:09.391465 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-gbcjw\""
Apr 24 16:48:09.409945 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:09.409917 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-pqx9m"]
Apr 24 16:48:09.456387 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:09.456346 2579 generic.go:358] "Generic (PLEG): container finished" podID="8bbaf67f-58a7-45b2-b041-ce3fd256eaed" containerID="01e67aee124f6445122b722d1e6b6150074804f4e4766b4a2818052e98112ae2" exitCode=0
Apr 24 16:48:09.456566 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:09.456427 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl" event={"ID":"8bbaf67f-58a7-45b2-b041-ce3fd256eaed","Type":"ContainerDied","Data":"01e67aee124f6445122b722d1e6b6150074804f4e4766b4a2818052e98112ae2"}
Apr 24 16:48:09.563312 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:09.563272 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/7f2fd6c8-706d-4065-843d-438847c7c817-operator-config\") pod \"servicemesh-operator3-55f49c5f94-pqx9m\" (UID: \"7f2fd6c8-706d-4065-843d-438847c7c817\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pqx9m"
Apr 24 16:48:09.563489 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:09.563326 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk5fj\" (UniqueName: \"kubernetes.io/projected/7f2fd6c8-706d-4065-843d-438847c7c817-kube-api-access-wk5fj\") pod \"servicemesh-operator3-55f49c5f94-pqx9m\" (UID: \"7f2fd6c8-706d-4065-843d-438847c7c817\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pqx9m"
Apr 24 16:48:09.664684 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:09.664588 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/7f2fd6c8-706d-4065-843d-438847c7c817-operator-config\") pod \"servicemesh-operator3-55f49c5f94-pqx9m\" (UID: \"7f2fd6c8-706d-4065-843d-438847c7c817\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pqx9m"
Apr 24 16:48:09.664684 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:09.664641 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wk5fj\" (UniqueName: \"kubernetes.io/projected/7f2fd6c8-706d-4065-843d-438847c7c817-kube-api-access-wk5fj\") pod \"servicemesh-operator3-55f49c5f94-pqx9m\" (UID: \"7f2fd6c8-706d-4065-843d-438847c7c817\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pqx9m"
Apr 24 16:48:09.667556 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:09.667522 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/7f2fd6c8-706d-4065-843d-438847c7c817-operator-config\") pod \"servicemesh-operator3-55f49c5f94-pqx9m\" (UID: \"7f2fd6c8-706d-4065-843d-438847c7c817\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pqx9m"
Apr 24 16:48:09.767396 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:09.767360 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk5fj\" (UniqueName: \"kubernetes.io/projected/7f2fd6c8-706d-4065-843d-438847c7c817-kube-api-access-wk5fj\") pod \"servicemesh-operator3-55f49c5f94-pqx9m\" (UID: \"7f2fd6c8-706d-4065-843d-438847c7c817\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pqx9m"
Apr 24 16:48:09.996878 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:09.996777 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pqx9m"
Apr 24 16:48:10.045446 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.045408 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"]
Apr 24 16:48:10.050920 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.050654 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"
Apr 24 16:48:10.053548 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.053506 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 24 16:48:10.054101 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.054081 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 24 16:48:10.054230 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.054103 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 24 16:48:10.054935 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.054835 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-6xgxr\""
Apr 24 16:48:10.066900 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.066853 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"]
Apr 24 16:48:10.162772 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.162746 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-pqx9m"]
Apr 24 16:48:10.164486 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:48:10.164463 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f2fd6c8_706d_4065_843d_438847c7c817.slice/crio-d8abd1ada61e957542e2c00cf2113fe8c581aa3ab86600c5ce058c2e9093dcc1 WatchSource:0}: Error finding container d8abd1ada61e957542e2c00cf2113fe8c581aa3ab86600c5ce058c2e9093dcc1: Status 404 returned error can't find the container with id d8abd1ada61e957542e2c00cf2113fe8c581aa3ab86600c5ce058c2e9093dcc1
Apr 24 16:48:10.167798 ip-10-0-129-227 kubenswrapper[2579]: I0424
16:48:10.167690 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spxq2\" (UniqueName: \"kubernetes.io/projected/6065d775-116b-4301-9a90-beb7dc443688-kube-api-access-spxq2\") pod \"lws-controller-manager-75b5bf9f6d-splcs\" (UID: \"6065d775-116b-4301-9a90-beb7dc443688\") " pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"
Apr 24 16:48:10.167878 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.167818 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6065d775-116b-4301-9a90-beb7dc443688-metrics-cert\") pod \"lws-controller-manager-75b5bf9f6d-splcs\" (UID: \"6065d775-116b-4301-9a90-beb7dc443688\") " pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"
Apr 24 16:48:10.167929 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.167898 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6065d775-116b-4301-9a90-beb7dc443688-manager-config\") pod \"lws-controller-manager-75b5bf9f6d-splcs\" (UID: \"6065d775-116b-4301-9a90-beb7dc443688\") " pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"
Apr 24 16:48:10.167981 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.167961 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6065d775-116b-4301-9a90-beb7dc443688-cert\") pod \"lws-controller-manager-75b5bf9f6d-splcs\" (UID: \"6065d775-116b-4301-9a90-beb7dc443688\") " pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"
Apr 24 16:48:10.269210 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.269169 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6065d775-116b-4301-9a90-beb7dc443688-cert\") pod \"lws-controller-manager-75b5bf9f6d-splcs\" (UID: \"6065d775-116b-4301-9a90-beb7dc443688\") " pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"
Apr 24 16:48:10.269210 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.269213 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spxq2\" (UniqueName: \"kubernetes.io/projected/6065d775-116b-4301-9a90-beb7dc443688-kube-api-access-spxq2\") pod \"lws-controller-manager-75b5bf9f6d-splcs\" (UID: \"6065d775-116b-4301-9a90-beb7dc443688\") " pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"
Apr 24 16:48:10.269494 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.269274 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6065d775-116b-4301-9a90-beb7dc443688-metrics-cert\") pod \"lws-controller-manager-75b5bf9f6d-splcs\" (UID: \"6065d775-116b-4301-9a90-beb7dc443688\") " pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"
Apr 24 16:48:10.269494 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.269338 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6065d775-116b-4301-9a90-beb7dc443688-manager-config\") pod \"lws-controller-manager-75b5bf9f6d-splcs\" (UID: \"6065d775-116b-4301-9a90-beb7dc443688\") " pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"
Apr 24 16:48:10.269993 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.269968 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6065d775-116b-4301-9a90-beb7dc443688-manager-config\") pod \"lws-controller-manager-75b5bf9f6d-splcs\" (UID: \"6065d775-116b-4301-9a90-beb7dc443688\") " pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"
Apr 24 16:48:10.271956 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.271936 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6065d775-116b-4301-9a90-beb7dc443688-cert\") pod \"lws-controller-manager-75b5bf9f6d-splcs\" (UID: \"6065d775-116b-4301-9a90-beb7dc443688\") " pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"
Apr 24 16:48:10.272072 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.271961 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6065d775-116b-4301-9a90-beb7dc443688-metrics-cert\") pod \"lws-controller-manager-75b5bf9f6d-splcs\" (UID: \"6065d775-116b-4301-9a90-beb7dc443688\") " pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"
Apr 24 16:48:10.286966 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.286934 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spxq2\" (UniqueName: \"kubernetes.io/projected/6065d775-116b-4301-9a90-beb7dc443688-kube-api-access-spxq2\") pod \"lws-controller-manager-75b5bf9f6d-splcs\" (UID: \"6065d775-116b-4301-9a90-beb7dc443688\") " pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"
Apr 24 16:48:10.364556 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.364511 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"
Apr 24 16:48:10.464700 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.464661 2579 generic.go:358] "Generic (PLEG): container finished" podID="8bbaf67f-58a7-45b2-b041-ce3fd256eaed" containerID="5a2eed2cea310844e3f979a3f6f76d879011b01fd5c39177e246998fb4cad5e7" exitCode=0
Apr 24 16:48:10.465154 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.464799 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl" event={"ID":"8bbaf67f-58a7-45b2-b041-ce3fd256eaed","Type":"ContainerDied","Data":"5a2eed2cea310844e3f979a3f6f76d879011b01fd5c39177e246998fb4cad5e7"}
Apr 24 16:48:10.466539 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.466504 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pqx9m" event={"ID":"7f2fd6c8-706d-4065-843d-438847c7c817","Type":"ContainerStarted","Data":"d8abd1ada61e957542e2c00cf2113fe8c581aa3ab86600c5ce058c2e9093dcc1"}
Apr 24 16:48:10.529029 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:48:10.528985 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6065d775_116b_4301_9a90_beb7dc443688.slice/crio-efa877e754cbd650153447afbb8f0a3e68b162b59dca70e1ec934d03b66ad1c6 WatchSource:0}: Error finding container efa877e754cbd650153447afbb8f0a3e68b162b59dca70e1ec934d03b66ad1c6: Status 404 returned error can't find the container with id efa877e754cbd650153447afbb8f0a3e68b162b59dca70e1ec934d03b66ad1c6
Apr 24 16:48:10.530901 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:10.530871 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"]
Apr 24 16:48:11.473900 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:11.473860 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs" event={"ID":"6065d775-116b-4301-9a90-beb7dc443688","Type":"ContainerStarted","Data":"efa877e754cbd650153447afbb8f0a3e68b162b59dca70e1ec934d03b66ad1c6"}
Apr 24 16:48:11.654251 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:11.653946 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl"
Apr 24 16:48:11.680753 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:11.679971 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5m8p\" (UniqueName: \"kubernetes.io/projected/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-kube-api-access-v5m8p\") pod \"8bbaf67f-58a7-45b2-b041-ce3fd256eaed\" (UID: \"8bbaf67f-58a7-45b2-b041-ce3fd256eaed\") "
Apr 24 16:48:11.680753 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:11.680051 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-bundle\") pod \"8bbaf67f-58a7-45b2-b041-ce3fd256eaed\" (UID: \"8bbaf67f-58a7-45b2-b041-ce3fd256eaed\") "
Apr 24 16:48:11.680753 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:11.680089 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-util\") pod \"8bbaf67f-58a7-45b2-b041-ce3fd256eaed\" (UID: \"8bbaf67f-58a7-45b2-b041-ce3fd256eaed\") "
Apr 24 16:48:11.682803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:11.681467 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-bundle" (OuterVolumeSpecName: "bundle") pod "8bbaf67f-58a7-45b2-b041-ce3fd256eaed" (UID: "8bbaf67f-58a7-45b2-b041-ce3fd256eaed"). InnerVolumeSpecName "bundle".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 16:48:11.684971 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:11.684903 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-kube-api-access-v5m8p" (OuterVolumeSpecName: "kube-api-access-v5m8p") pod "8bbaf67f-58a7-45b2-b041-ce3fd256eaed" (UID: "8bbaf67f-58a7-45b2-b041-ce3fd256eaed"). InnerVolumeSpecName "kube-api-access-v5m8p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:48:11.689259 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:11.689200 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-util" (OuterVolumeSpecName: "util") pod "8bbaf67f-58a7-45b2-b041-ce3fd256eaed" (UID: "8bbaf67f-58a7-45b2-b041-ce3fd256eaed"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 16:48:11.781556 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:11.781508 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v5m8p\" (UniqueName: \"kubernetes.io/projected/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-kube-api-access-v5m8p\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\""
Apr 24 16:48:11.781556 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:11.781554 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-bundle\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\""
Apr 24 16:48:11.781870 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:11.781570 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8bbaf67f-58a7-45b2-b041-ce3fd256eaed-util\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\""
Apr 24 16:48:12.481336 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:12.481291 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl"
Apr 24 16:48:12.481874 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:12.481290 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb62xcl" event={"ID":"8bbaf67f-58a7-45b2-b041-ce3fd256eaed","Type":"ContainerDied","Data":"14e926918d051236d7ba6c52cdf47ef5d0f065ddd8fb418d8e0a4139d632e7a6"}
Apr 24 16:48:12.481874 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:12.481430 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14e926918d051236d7ba6c52cdf47ef5d0f065ddd8fb418d8e0a4139d632e7a6"
Apr 24 16:48:13.488472 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:13.488428 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pqx9m" event={"ID":"7f2fd6c8-706d-4065-843d-438847c7c817","Type":"ContainerStarted","Data":"415f2f4804cce6c62bb11910431fc16c0c2c314a17d46e75db51562739296cb1"}
Apr 24 16:48:13.489100 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:13.488822 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pqx9m"
Apr 24 16:48:13.491340 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:13.490861 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs" event={"ID":"6065d775-116b-4301-9a90-beb7dc443688","Type":"ContainerStarted","Data":"fdf0a60c2a7d0e7f71a558459409fee843ac6c1645ae7f2eb1938318f8b8e9a7"}
Apr 24 16:48:13.491340 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:13.491042 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"
Apr 24 16:48:13.526933 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:13.526864 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pqx9m" podStartSLOduration=1.405160512 podStartE2EDuration="4.526841918s" podCreationTimestamp="2026-04-24 16:48:09 +0000 UTC" firstStartedPulling="2026-04-24 16:48:10.167203952 +0000 UTC m=+526.100273130" lastFinishedPulling="2026-04-24 16:48:13.288885349 +0000 UTC m=+529.221954536" observedRunningTime="2026-04-24 16:48:13.523104134 +0000 UTC m=+529.456173328" watchObservedRunningTime="2026-04-24 16:48:13.526841918 +0000 UTC m=+529.459911118"
Apr 24 16:48:13.550661 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:13.550542 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs" podStartSLOduration=0.791888763 podStartE2EDuration="3.550518834s" podCreationTimestamp="2026-04-24 16:48:10 +0000 UTC" firstStartedPulling="2026-04-24 16:48:10.531623184 +0000 UTC m=+526.464692364" lastFinishedPulling="2026-04-24 16:48:13.290253255 +0000 UTC m=+529.223322435" observedRunningTime="2026-04-24 16:48:13.546082937 +0000 UTC m=+529.479152138" watchObservedRunningTime="2026-04-24 16:48:13.550518834 +0000 UTC m=+529.483588037"
Apr 24 16:48:24.497785 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:24.497724 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-75b5bf9f6d-splcs"
Apr 24 16:48:24.498198 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:24.497940 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pqx9m"
Apr 24 16:48:29.990962 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:29.990928 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n"]
Apr 24 16:48:29.991366 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:29.991351 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bbaf67f-58a7-45b2-b041-ce3fd256eaed" containerName="pull"
Apr 24 16:48:29.991419 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:29.991368 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bbaf67f-58a7-45b2-b041-ce3fd256eaed" containerName="pull"
Apr 24 16:48:29.991419 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:29.991385 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bbaf67f-58a7-45b2-b041-ce3fd256eaed" containerName="extract"
Apr 24 16:48:29.991419 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:29.991391 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bbaf67f-58a7-45b2-b041-ce3fd256eaed" containerName="extract"
Apr 24 16:48:29.991419 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:29.991406 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bbaf67f-58a7-45b2-b041-ce3fd256eaed" containerName="util"
Apr 24 16:48:29.991419 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:29.991413 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bbaf67f-58a7-45b2-b041-ce3fd256eaed" containerName="util"
Apr 24 16:48:29.991618 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:29.991481 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bbaf67f-58a7-45b2-b041-ce3fd256eaed" containerName="extract"
Apr 24 16:48:29.996592 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:29.996565 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n"
Apr 24 16:48:29.999037 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:29.999010 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-4gnn8\""
Apr 24 16:48:29.999180 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:29.999012 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 16:48:29.999510 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:29.999495 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 16:48:30.003948 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.003919 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n"]
Apr 24 16:48:30.039069 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.039025 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vm8g\" (UniqueName: \"kubernetes.io/projected/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-kube-api-access-6vm8g\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n\" (UID: \"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n"
Apr 24 16:48:30.039265 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.039118 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n\" (UID: \"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48\") "
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n" Apr 24 16:48:30.039265 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.039169 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n\" (UID: \"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n" Apr 24 16:48:30.091327 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.091287 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj"] Apr 24 16:48:30.095553 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.095524 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" Apr 24 16:48:30.102536 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.102487 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj"] Apr 24 16:48:30.139641 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.139590 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n\" (UID: \"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n" Apr 24 16:48:30.139641 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.139638 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/71edc11f-fd5d-4f18-9911-eb73de0e383b-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj\" (UID: \"71edc11f-fd5d-4f18-9911-eb73de0e383b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" Apr 24 16:48:30.139925 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.139660 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/71edc11f-fd5d-4f18-9911-eb73de0e383b-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj\" (UID: \"71edc11f-fd5d-4f18-9911-eb73de0e383b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" Apr 24 16:48:30.139925 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.139833 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvx4z\" (UniqueName: \"kubernetes.io/projected/71edc11f-fd5d-4f18-9911-eb73de0e383b-kube-api-access-wvx4z\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj\" (UID: \"71edc11f-fd5d-4f18-9911-eb73de0e383b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" Apr 24 16:48:30.139925 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.139904 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vm8g\" (UniqueName: \"kubernetes.io/projected/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-kube-api-access-6vm8g\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n\" (UID: \"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n" Apr 24 16:48:30.140077 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.139954 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n\" (UID: \"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n" Apr 24 16:48:30.140077 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.140072 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n\" (UID: \"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n" Apr 24 16:48:30.140266 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.140243 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n\" (UID: \"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n" Apr 24 16:48:30.149036 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.148988 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vm8g\" (UniqueName: \"kubernetes.io/projected/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-kube-api-access-6vm8g\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n\" (UID: \"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n" Apr 24 16:48:30.191220 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.191178 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t"] Apr 24 16:48:30.195485 ip-10-0-129-227 kubenswrapper[2579]: 
I0424 16:48:30.195463 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" Apr 24 16:48:30.203903 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.203868 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t"] Apr 24 16:48:30.241175 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.241076 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmmjb\" (UniqueName: \"kubernetes.io/projected/cc3ad02b-5097-4178-b231-0501dfd0a762-kube-api-access-rmmjb\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t\" (UID: \"cc3ad02b-5097-4178-b231-0501dfd0a762\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" Apr 24 16:48:30.241175 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.241129 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvx4z\" (UniqueName: \"kubernetes.io/projected/71edc11f-fd5d-4f18-9911-eb73de0e383b-kube-api-access-wvx4z\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj\" (UID: \"71edc11f-fd5d-4f18-9911-eb73de0e383b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" Apr 24 16:48:30.241175 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.241160 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc3ad02b-5097-4178-b231-0501dfd0a762-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t\" (UID: \"cc3ad02b-5097-4178-b231-0501dfd0a762\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" Apr 24 16:48:30.241487 ip-10-0-129-227 kubenswrapper[2579]: 
I0424 16:48:30.241187 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc3ad02b-5097-4178-b231-0501dfd0a762-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t\" (UID: \"cc3ad02b-5097-4178-b231-0501dfd0a762\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" Apr 24 16:48:30.241487 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.241219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/71edc11f-fd5d-4f18-9911-eb73de0e383b-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj\" (UID: \"71edc11f-fd5d-4f18-9911-eb73de0e383b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" Apr 24 16:48:30.241487 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.241241 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/71edc11f-fd5d-4f18-9911-eb73de0e383b-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj\" (UID: \"71edc11f-fd5d-4f18-9911-eb73de0e383b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" Apr 24 16:48:30.241658 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.241609 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/71edc11f-fd5d-4f18-9911-eb73de0e383b-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj\" (UID: \"71edc11f-fd5d-4f18-9911-eb73de0e383b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" Apr 24 16:48:30.241713 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.241669 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/71edc11f-fd5d-4f18-9911-eb73de0e383b-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj\" (UID: \"71edc11f-fd5d-4f18-9911-eb73de0e383b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" Apr 24 16:48:30.249567 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.249530 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvx4z\" (UniqueName: \"kubernetes.io/projected/71edc11f-fd5d-4f18-9911-eb73de0e383b-kube-api-access-wvx4z\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj\" (UID: \"71edc11f-fd5d-4f18-9911-eb73de0e383b\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" Apr 24 16:48:30.290157 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.290112 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz"] Apr 24 16:48:30.301912 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.301875 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz"] Apr 24 16:48:30.302114 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.302005 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" Apr 24 16:48:30.308348 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.308310 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n" Apr 24 16:48:30.342546 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.342509 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc3ad02b-5097-4178-b231-0501dfd0a762-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t\" (UID: \"cc3ad02b-5097-4178-b231-0501dfd0a762\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" Apr 24 16:48:30.342770 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.342571 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc3ad02b-5097-4178-b231-0501dfd0a762-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t\" (UID: \"cc3ad02b-5097-4178-b231-0501dfd0a762\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" Apr 24 16:48:30.342770 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.342626 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3922e82e-d63a-448a-8155-05005f956cf5-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz\" (UID: \"3922e82e-d63a-448a-8155-05005f956cf5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" Apr 24 16:48:30.342770 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.342672 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3922e82e-d63a-448a-8155-05005f956cf5-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz\" (UID: \"3922e82e-d63a-448a-8155-05005f956cf5\") " 
pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" Apr 24 16:48:30.342770 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.342745 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmmjb\" (UniqueName: \"kubernetes.io/projected/cc3ad02b-5097-4178-b231-0501dfd0a762-kube-api-access-rmmjb\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t\" (UID: \"cc3ad02b-5097-4178-b231-0501dfd0a762\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" Apr 24 16:48:30.342997 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.342797 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k6pc\" (UniqueName: \"kubernetes.io/projected/3922e82e-d63a-448a-8155-05005f956cf5-kube-api-access-9k6pc\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz\" (UID: \"3922e82e-d63a-448a-8155-05005f956cf5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" Apr 24 16:48:30.342997 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.342976 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc3ad02b-5097-4178-b231-0501dfd0a762-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t\" (UID: \"cc3ad02b-5097-4178-b231-0501dfd0a762\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" Apr 24 16:48:30.343093 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.343039 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc3ad02b-5097-4178-b231-0501dfd0a762-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t\" (UID: \"cc3ad02b-5097-4178-b231-0501dfd0a762\") " 
pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" Apr 24 16:48:30.352055 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.352016 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmmjb\" (UniqueName: \"kubernetes.io/projected/cc3ad02b-5097-4178-b231-0501dfd0a762-kube-api-access-rmmjb\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t\" (UID: \"cc3ad02b-5097-4178-b231-0501dfd0a762\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" Apr 24 16:48:30.408126 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.408090 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" Apr 24 16:48:30.443647 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.443610 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3922e82e-d63a-448a-8155-05005f956cf5-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz\" (UID: \"3922e82e-d63a-448a-8155-05005f956cf5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" Apr 24 16:48:30.443877 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.443660 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3922e82e-d63a-448a-8155-05005f956cf5-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz\" (UID: \"3922e82e-d63a-448a-8155-05005f956cf5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" Apr 24 16:48:30.443877 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.443847 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9k6pc\" (UniqueName: 
\"kubernetes.io/projected/3922e82e-d63a-448a-8155-05005f956cf5-kube-api-access-9k6pc\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz\" (UID: \"3922e82e-d63a-448a-8155-05005f956cf5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" Apr 24 16:48:30.444147 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.444128 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3922e82e-d63a-448a-8155-05005f956cf5-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz\" (UID: \"3922e82e-d63a-448a-8155-05005f956cf5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" Apr 24 16:48:30.444228 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.444208 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3922e82e-d63a-448a-8155-05005f956cf5-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz\" (UID: \"3922e82e-d63a-448a-8155-05005f956cf5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" Apr 24 16:48:30.453711 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.453629 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k6pc\" (UniqueName: \"kubernetes.io/projected/3922e82e-d63a-448a-8155-05005f956cf5-kube-api-access-9k6pc\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz\" (UID: \"3922e82e-d63a-448a-8155-05005f956cf5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" Apr 24 16:48:30.458857 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.458820 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n"] Apr 24 16:48:30.460909 ip-10-0-129-227 
kubenswrapper[2579]: W0424 16:48:30.460874 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62f656e5_ebb1_49bb_8103_f8b8cf7b4d48.slice/crio-af844455ac23953fbe7398b48777d5ef484f099e6ef24d825cf7494a19d14ba7 WatchSource:0}: Error finding container af844455ac23953fbe7398b48777d5ef484f099e6ef24d825cf7494a19d14ba7: Status 404 returned error can't find the container with id af844455ac23953fbe7398b48777d5ef484f099e6ef24d825cf7494a19d14ba7 Apr 24 16:48:30.507933 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.507855 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" Apr 24 16:48:30.560337 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.560301 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj"] Apr 24 16:48:30.562428 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:48:30.562390 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71edc11f_fd5d_4f18_9911_eb73de0e383b.slice/crio-2fb149032b673e5de44fc74528f72c4f82996c7ce818331cfd3e70e26b587ea5 WatchSource:0}: Error finding container 2fb149032b673e5de44fc74528f72c4f82996c7ce818331cfd3e70e26b587ea5: Status 404 returned error can't find the container with id 2fb149032b673e5de44fc74528f72c4f82996c7ce818331cfd3e70e26b587ea5 Apr 24 16:48:30.566247 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.566087 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n" event={"ID":"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48","Type":"ContainerStarted","Data":"1fec5eab658e499ee307ea47aa86276843d6d111c7dbf144fe0ae6d01fc78bbb"} Apr 24 16:48:30.566247 ip-10-0-129-227 kubenswrapper[2579]: 
I0424 16:48:30.566151 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n" event={"ID":"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48","Type":"ContainerStarted","Data":"af844455ac23953fbe7398b48777d5ef484f099e6ef24d825cf7494a19d14ba7"} Apr 24 16:48:30.614081 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.614037 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" Apr 24 16:48:30.665309 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.665265 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t"] Apr 24 16:48:30.667523 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:48:30.667216 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc3ad02b_5097_4178_b231_0501dfd0a762.slice/crio-7de0a14bcb90f48e98cceb03f4fd3a1f0be1f65cf1a677961ca722675fff0b99 WatchSource:0}: Error finding container 7de0a14bcb90f48e98cceb03f4fd3a1f0be1f65cf1a677961ca722675fff0b99: Status 404 returned error can't find the container with id 7de0a14bcb90f48e98cceb03f4fd3a1f0be1f65cf1a677961ca722675fff0b99 Apr 24 16:48:30.766399 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:30.766362 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz"] Apr 24 16:48:30.792976 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:48:30.792929 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3922e82e_d63a_448a_8155_05005f956cf5.slice/crio-d282382702c31c9e294f86843245ff8dbcfc9a07655f9488b0ff54c8f932962c WatchSource:0}: Error finding container 
d282382702c31c9e294f86843245ff8dbcfc9a07655f9488b0ff54c8f932962c: Status 404 returned error can't find the container with id d282382702c31c9e294f86843245ff8dbcfc9a07655f9488b0ff54c8f932962c Apr 24 16:48:31.572499 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:31.572459 2579 generic.go:358] "Generic (PLEG): container finished" podID="cc3ad02b-5097-4178-b231-0501dfd0a762" containerID="20368c892fe1a9892d475513cd7db64729ff4b4c495d38c2bf53223a03603569" exitCode=0 Apr 24 16:48:31.573014 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:31.572544 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" event={"ID":"cc3ad02b-5097-4178-b231-0501dfd0a762","Type":"ContainerDied","Data":"20368c892fe1a9892d475513cd7db64729ff4b4c495d38c2bf53223a03603569"} Apr 24 16:48:31.573014 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:31.572581 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" event={"ID":"cc3ad02b-5097-4178-b231-0501dfd0a762","Type":"ContainerStarted","Data":"7de0a14bcb90f48e98cceb03f4fd3a1f0be1f65cf1a677961ca722675fff0b99"} Apr 24 16:48:31.574134 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:31.574107 2579 generic.go:358] "Generic (PLEG): container finished" podID="3922e82e-d63a-448a-8155-05005f956cf5" containerID="a2a3392c215191bfbefc84a6d98a1edae30b93d38f2d277b6513d7e470931b61" exitCode=0 Apr 24 16:48:31.574237 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:31.574209 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" event={"ID":"3922e82e-d63a-448a-8155-05005f956cf5","Type":"ContainerDied","Data":"a2a3392c215191bfbefc84a6d98a1edae30b93d38f2d277b6513d7e470931b61"} Apr 24 16:48:31.574297 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:31.574255 2579 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" event={"ID":"3922e82e-d63a-448a-8155-05005f956cf5","Type":"ContainerStarted","Data":"d282382702c31c9e294f86843245ff8dbcfc9a07655f9488b0ff54c8f932962c"} Apr 24 16:48:31.575746 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:31.575711 2579 generic.go:358] "Generic (PLEG): container finished" podID="71edc11f-fd5d-4f18-9911-eb73de0e383b" containerID="3b4282becbf6dbf8d23d66f5d42ef201cc2f5051a239460431cfe8e907bee3e6" exitCode=0 Apr 24 16:48:31.575826 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:31.575774 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" event={"ID":"71edc11f-fd5d-4f18-9911-eb73de0e383b","Type":"ContainerDied","Data":"3b4282becbf6dbf8d23d66f5d42ef201cc2f5051a239460431cfe8e907bee3e6"} Apr 24 16:48:31.575826 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:31.575802 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" event={"ID":"71edc11f-fd5d-4f18-9911-eb73de0e383b","Type":"ContainerStarted","Data":"2fb149032b673e5de44fc74528f72c4f82996c7ce818331cfd3e70e26b587ea5"} Apr 24 16:48:31.577343 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:31.577318 2579 generic.go:358] "Generic (PLEG): container finished" podID="62f656e5-ebb1-49bb-8103-f8b8cf7b4d48" containerID="1fec5eab658e499ee307ea47aa86276843d6d111c7dbf144fe0ae6d01fc78bbb" exitCode=0 Apr 24 16:48:31.577435 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:31.577347 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n" event={"ID":"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48","Type":"ContainerDied","Data":"1fec5eab658e499ee307ea47aa86276843d6d111c7dbf144fe0ae6d01fc78bbb"} Apr 24 16:48:32.585172 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:48:32.585130 2579 generic.go:358] "Generic (PLEG): container finished" podID="cc3ad02b-5097-4178-b231-0501dfd0a762" containerID="ef68fb0b37552b5bb8dc0db27e9b2de796d2ea99ee7ef053e0c92a84d97d6498" exitCode=0 Apr 24 16:48:32.585674 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:32.585266 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" event={"ID":"cc3ad02b-5097-4178-b231-0501dfd0a762","Type":"ContainerDied","Data":"ef68fb0b37552b5bb8dc0db27e9b2de796d2ea99ee7ef053e0c92a84d97d6498"} Apr 24 16:48:33.590776 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:33.590722 2579 generic.go:358] "Generic (PLEG): container finished" podID="62f656e5-ebb1-49bb-8103-f8b8cf7b4d48" containerID="ef50f9de64177ad11c1bf9e573eee5ac219c4a7fe941eb3e05576f00d5eecc5f" exitCode=0 Apr 24 16:48:33.591194 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:33.590816 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n" event={"ID":"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48","Type":"ContainerDied","Data":"ef50f9de64177ad11c1bf9e573eee5ac219c4a7fe941eb3e05576f00d5eecc5f"} Apr 24 16:48:33.593075 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:33.593037 2579 generic.go:358] "Generic (PLEG): container finished" podID="cc3ad02b-5097-4178-b231-0501dfd0a762" containerID="6a87df56a75459425228168a16bbad17741a122a703806ee80eb428d65990392" exitCode=0 Apr 24 16:48:33.593172 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:33.593099 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" event={"ID":"cc3ad02b-5097-4178-b231-0501dfd0a762","Type":"ContainerDied","Data":"6a87df56a75459425228168a16bbad17741a122a703806ee80eb428d65990392"} Apr 24 16:48:33.594973 ip-10-0-129-227 kubenswrapper[2579]: I0424 
16:48:33.594939 2579 generic.go:358] "Generic (PLEG): container finished" podID="3922e82e-d63a-448a-8155-05005f956cf5" containerID="fb477b30f17592621d4ea2d16659a0a8f4c23742e80d8240c61bfa66cb4db8ea" exitCode=0 Apr 24 16:48:33.595097 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:33.594971 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" event={"ID":"3922e82e-d63a-448a-8155-05005f956cf5","Type":"ContainerDied","Data":"fb477b30f17592621d4ea2d16659a0a8f4c23742e80d8240c61bfa66cb4db8ea"} Apr 24 16:48:33.597183 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:33.597157 2579 generic.go:358] "Generic (PLEG): container finished" podID="71edc11f-fd5d-4f18-9911-eb73de0e383b" containerID="bdea0a90e1ef3fc4c3efb2450f6a9d884b05c1db2738c5cb9b60eb5617af9e2b" exitCode=0 Apr 24 16:48:33.597274 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:33.597202 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" event={"ID":"71edc11f-fd5d-4f18-9911-eb73de0e383b","Type":"ContainerDied","Data":"bdea0a90e1ef3fc4c3efb2450f6a9d884b05c1db2738c5cb9b60eb5617af9e2b"} Apr 24 16:48:34.603664 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:34.603625 2579 generic.go:358] "Generic (PLEG): container finished" podID="62f656e5-ebb1-49bb-8103-f8b8cf7b4d48" containerID="c63b332a18e1e683443d5b21020afb0b421da3f0154a3abffbc36283e7ebb9d8" exitCode=0 Apr 24 16:48:34.604137 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:34.603719 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n" event={"ID":"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48","Type":"ContainerDied","Data":"c63b332a18e1e683443d5b21020afb0b421da3f0154a3abffbc36283e7ebb9d8"} Apr 24 16:48:34.605628 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:34.605605 2579 generic.go:358] 
"Generic (PLEG): container finished" podID="3922e82e-d63a-448a-8155-05005f956cf5" containerID="656da2d5968c899ae657cdbaf13dc069e309b93b126f5cf67517058c8da67fe5" exitCode=0 Apr 24 16:48:34.605768 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:34.605673 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" event={"ID":"3922e82e-d63a-448a-8155-05005f956cf5","Type":"ContainerDied","Data":"656da2d5968c899ae657cdbaf13dc069e309b93b126f5cf67517058c8da67fe5"} Apr 24 16:48:34.607489 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:34.607460 2579 generic.go:358] "Generic (PLEG): container finished" podID="71edc11f-fd5d-4f18-9911-eb73de0e383b" containerID="6b0d694fb3879caeb83edbbe57b7f31827a4d689d17a0e192a4721cddc0dc363" exitCode=0 Apr 24 16:48:34.607597 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:34.607504 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" event={"ID":"71edc11f-fd5d-4f18-9911-eb73de0e383b","Type":"ContainerDied","Data":"6b0d694fb3879caeb83edbbe57b7f31827a4d689d17a0e192a4721cddc0dc363"} Apr 24 16:48:34.748862 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:34.748834 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" Apr 24 16:48:34.782283 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:34.782242 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmmjb\" (UniqueName: \"kubernetes.io/projected/cc3ad02b-5097-4178-b231-0501dfd0a762-kube-api-access-rmmjb\") pod \"cc3ad02b-5097-4178-b231-0501dfd0a762\" (UID: \"cc3ad02b-5097-4178-b231-0501dfd0a762\") " Apr 24 16:48:34.782438 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:34.782317 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc3ad02b-5097-4178-b231-0501dfd0a762-bundle\") pod \"cc3ad02b-5097-4178-b231-0501dfd0a762\" (UID: \"cc3ad02b-5097-4178-b231-0501dfd0a762\") " Apr 24 16:48:34.782489 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:34.782475 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc3ad02b-5097-4178-b231-0501dfd0a762-util\") pod \"cc3ad02b-5097-4178-b231-0501dfd0a762\" (UID: \"cc3ad02b-5097-4178-b231-0501dfd0a762\") " Apr 24 16:48:34.782896 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:34.782868 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc3ad02b-5097-4178-b231-0501dfd0a762-bundle" (OuterVolumeSpecName: "bundle") pod "cc3ad02b-5097-4178-b231-0501dfd0a762" (UID: "cc3ad02b-5097-4178-b231-0501dfd0a762"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:48:34.784645 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:34.784614 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc3ad02b-5097-4178-b231-0501dfd0a762-kube-api-access-rmmjb" (OuterVolumeSpecName: "kube-api-access-rmmjb") pod "cc3ad02b-5097-4178-b231-0501dfd0a762" (UID: "cc3ad02b-5097-4178-b231-0501dfd0a762"). InnerVolumeSpecName "kube-api-access-rmmjb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:48:34.787322 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:34.787274 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc3ad02b-5097-4178-b231-0501dfd0a762-util" (OuterVolumeSpecName: "util") pod "cc3ad02b-5097-4178-b231-0501dfd0a762" (UID: "cc3ad02b-5097-4178-b231-0501dfd0a762"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:48:34.883769 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:34.883684 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rmmjb\" (UniqueName: \"kubernetes.io/projected/cc3ad02b-5097-4178-b231-0501dfd0a762-kube-api-access-rmmjb\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:48:34.883769 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:34.883764 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc3ad02b-5097-4178-b231-0501dfd0a762-bundle\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:48:34.883769 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:34.883777 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc3ad02b-5097-4178-b231-0501dfd0a762-util\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:48:35.613471 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.613434 2579 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" Apr 24 16:48:35.613471 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.613452 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bgll4t" event={"ID":"cc3ad02b-5097-4178-b231-0501dfd0a762","Type":"ContainerDied","Data":"7de0a14bcb90f48e98cceb03f4fd3a1f0be1f65cf1a677961ca722675fff0b99"} Apr 24 16:48:35.614042 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.613502 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7de0a14bcb90f48e98cceb03f4fd3a1f0be1f65cf1a677961ca722675fff0b99" Apr 24 16:48:35.761872 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.761843 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" Apr 24 16:48:35.808088 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.808061 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n" Apr 24 16:48:35.811683 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.811657 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" Apr 24 16:48:35.894660 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.894559 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/71edc11f-fd5d-4f18-9911-eb73de0e383b-bundle\") pod \"71edc11f-fd5d-4f18-9911-eb73de0e383b\" (UID: \"71edc11f-fd5d-4f18-9911-eb73de0e383b\") " Apr 24 16:48:35.894660 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.894609 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3922e82e-d63a-448a-8155-05005f956cf5-util\") pod \"3922e82e-d63a-448a-8155-05005f956cf5\" (UID: \"3922e82e-d63a-448a-8155-05005f956cf5\") " Apr 24 16:48:35.894917 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.894668 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vm8g\" (UniqueName: \"kubernetes.io/projected/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-kube-api-access-6vm8g\") pod \"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48\" (UID: \"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48\") " Apr 24 16:48:35.894917 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.894767 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3922e82e-d63a-448a-8155-05005f956cf5-bundle\") pod \"3922e82e-d63a-448a-8155-05005f956cf5\" (UID: \"3922e82e-d63a-448a-8155-05005f956cf5\") " Apr 24 16:48:35.894917 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.894795 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvx4z\" (UniqueName: \"kubernetes.io/projected/71edc11f-fd5d-4f18-9911-eb73de0e383b-kube-api-access-wvx4z\") pod \"71edc11f-fd5d-4f18-9911-eb73de0e383b\" (UID: \"71edc11f-fd5d-4f18-9911-eb73de0e383b\") " Apr 24 16:48:35.894917 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:48:35.894831 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-util\") pod \"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48\" (UID: \"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48\") " Apr 24 16:48:35.894917 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.894854 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-bundle\") pod \"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48\" (UID: \"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48\") " Apr 24 16:48:35.894917 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.894895 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/71edc11f-fd5d-4f18-9911-eb73de0e383b-util\") pod \"71edc11f-fd5d-4f18-9911-eb73de0e383b\" (UID: \"71edc11f-fd5d-4f18-9911-eb73de0e383b\") " Apr 24 16:48:35.895204 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.894925 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k6pc\" (UniqueName: \"kubernetes.io/projected/3922e82e-d63a-448a-8155-05005f956cf5-kube-api-access-9k6pc\") pod \"3922e82e-d63a-448a-8155-05005f956cf5\" (UID: \"3922e82e-d63a-448a-8155-05005f956cf5\") " Apr 24 16:48:35.895362 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.895328 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71edc11f-fd5d-4f18-9911-eb73de0e383b-bundle" (OuterVolumeSpecName: "bundle") pod "71edc11f-fd5d-4f18-9911-eb73de0e383b" (UID: "71edc11f-fd5d-4f18-9911-eb73de0e383b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:48:35.895436 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.895370 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3922e82e-d63a-448a-8155-05005f956cf5-bundle" (OuterVolumeSpecName: "bundle") pod "3922e82e-d63a-448a-8155-05005f956cf5" (UID: "3922e82e-d63a-448a-8155-05005f956cf5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:48:35.896008 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.895871 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-bundle" (OuterVolumeSpecName: "bundle") pod "62f656e5-ebb1-49bb-8103-f8b8cf7b4d48" (UID: "62f656e5-ebb1-49bb-8103-f8b8cf7b4d48"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:48:35.897341 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.897299 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-kube-api-access-6vm8g" (OuterVolumeSpecName: "kube-api-access-6vm8g") pod "62f656e5-ebb1-49bb-8103-f8b8cf7b4d48" (UID: "62f656e5-ebb1-49bb-8103-f8b8cf7b4d48"). InnerVolumeSpecName "kube-api-access-6vm8g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:48:35.897869 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.897836 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3922e82e-d63a-448a-8155-05005f956cf5-kube-api-access-9k6pc" (OuterVolumeSpecName: "kube-api-access-9k6pc") pod "3922e82e-d63a-448a-8155-05005f956cf5" (UID: "3922e82e-d63a-448a-8155-05005f956cf5"). InnerVolumeSpecName "kube-api-access-9k6pc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:48:35.898028 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.898005 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71edc11f-fd5d-4f18-9911-eb73de0e383b-kube-api-access-wvx4z" (OuterVolumeSpecName: "kube-api-access-wvx4z") pod "71edc11f-fd5d-4f18-9911-eb73de0e383b" (UID: "71edc11f-fd5d-4f18-9911-eb73de0e383b"). InnerVolumeSpecName "kube-api-access-wvx4z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:48:35.901637 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.901570 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3922e82e-d63a-448a-8155-05005f956cf5-util" (OuterVolumeSpecName: "util") pod "3922e82e-d63a-448a-8155-05005f956cf5" (UID: "3922e82e-d63a-448a-8155-05005f956cf5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:48:35.901787 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.901651 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71edc11f-fd5d-4f18-9911-eb73de0e383b-util" (OuterVolumeSpecName: "util") pod "71edc11f-fd5d-4f18-9911-eb73de0e383b" (UID: "71edc11f-fd5d-4f18-9911-eb73de0e383b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:48:35.902532 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.902497 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-util" (OuterVolumeSpecName: "util") pod "62f656e5-ebb1-49bb-8103-f8b8cf7b4d48" (UID: "62f656e5-ebb1-49bb-8103-f8b8cf7b4d48"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:48:35.996143 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.996080 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6vm8g\" (UniqueName: \"kubernetes.io/projected/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-kube-api-access-6vm8g\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:48:35.996143 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.996136 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3922e82e-d63a-448a-8155-05005f956cf5-bundle\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:48:35.996143 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.996148 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wvx4z\" (UniqueName: \"kubernetes.io/projected/71edc11f-fd5d-4f18-9911-eb73de0e383b-kube-api-access-wvx4z\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:48:35.996143 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.996158 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-util\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:48:35.996143 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.996168 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62f656e5-ebb1-49bb-8103-f8b8cf7b4d48-bundle\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:48:35.996449 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.996176 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/71edc11f-fd5d-4f18-9911-eb73de0e383b-util\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:48:35.996449 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.996184 2579 reconciler_common.go:299] 
"Volume detached for volume \"kube-api-access-9k6pc\" (UniqueName: \"kubernetes.io/projected/3922e82e-d63a-448a-8155-05005f956cf5-kube-api-access-9k6pc\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:48:35.996449 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.996193 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/71edc11f-fd5d-4f18-9911-eb73de0e383b-bundle\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:48:35.996449 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:35.996201 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3922e82e-d63a-448a-8155-05005f956cf5-util\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:48:36.619431 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:36.619391 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" Apr 24 16:48:36.621431 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:36.621390 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30ntztj" event={"ID":"71edc11f-fd5d-4f18-9911-eb73de0e383b","Type":"ContainerDied","Data":"2fb149032b673e5de44fc74528f72c4f82996c7ce818331cfd3e70e26b587ea5"} Apr 24 16:48:36.621624 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:36.621438 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fb149032b673e5de44fc74528f72c4f82996c7ce818331cfd3e70e26b587ea5" Apr 24 16:48:36.621624 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:36.621454 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n" 
event={"ID":"62f656e5-ebb1-49bb-8103-f8b8cf7b4d48","Type":"ContainerDied","Data":"af844455ac23953fbe7398b48777d5ef484f099e6ef24d825cf7494a19d14ba7"} Apr 24 16:48:36.621624 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:36.621468 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af844455ac23953fbe7398b48777d5ef484f099e6ef24d825cf7494a19d14ba7" Apr 24 16:48:36.621624 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:36.621414 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503r276n" Apr 24 16:48:36.630877 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:36.630846 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" event={"ID":"3922e82e-d63a-448a-8155-05005f956cf5","Type":"ContainerDied","Data":"d282382702c31c9e294f86843245ff8dbcfc9a07655f9488b0ff54c8f932962c"} Apr 24 16:48:36.631028 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:36.630883 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d282382702c31c9e294f86843245ff8dbcfc9a07655f9488b0ff54c8f932962c" Apr 24 16:48:36.631028 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:36.631014 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88ph4sz" Apr 24 16:48:52.573579 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.573540 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-hj6z7"] Apr 24 16:48:52.574092 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.573999 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3922e82e-d63a-448a-8155-05005f956cf5" containerName="pull" Apr 24 16:48:52.574092 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574014 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3922e82e-d63a-448a-8155-05005f956cf5" containerName="pull" Apr 24 16:48:52.574092 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574022 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc3ad02b-5097-4178-b231-0501dfd0a762" containerName="util" Apr 24 16:48:52.574092 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574028 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc3ad02b-5097-4178-b231-0501dfd0a762" containerName="util" Apr 24 16:48:52.574092 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574034 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc3ad02b-5097-4178-b231-0501dfd0a762" containerName="extract" Apr 24 16:48:52.574092 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574040 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc3ad02b-5097-4178-b231-0501dfd0a762" containerName="extract" Apr 24 16:48:52.574092 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574053 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71edc11f-fd5d-4f18-9911-eb73de0e383b" containerName="util" Apr 24 16:48:52.574092 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574058 2579 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="71edc11f-fd5d-4f18-9911-eb73de0e383b" containerName="util" Apr 24 16:48:52.574092 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574067 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3922e82e-d63a-448a-8155-05005f956cf5" containerName="util" Apr 24 16:48:52.574092 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574072 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3922e82e-d63a-448a-8155-05005f956cf5" containerName="util" Apr 24 16:48:52.574092 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574079 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62f656e5-ebb1-49bb-8103-f8b8cf7b4d48" containerName="extract" Apr 24 16:48:52.574092 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574084 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f656e5-ebb1-49bb-8103-f8b8cf7b4d48" containerName="extract" Apr 24 16:48:52.574092 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574091 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62f656e5-ebb1-49bb-8103-f8b8cf7b4d48" containerName="util" Apr 24 16:48:52.574092 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574096 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f656e5-ebb1-49bb-8103-f8b8cf7b4d48" containerName="util" Apr 24 16:48:52.574497 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574103 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71edc11f-fd5d-4f18-9911-eb73de0e383b" containerName="pull" Apr 24 16:48:52.574497 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574108 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="71edc11f-fd5d-4f18-9911-eb73de0e383b" containerName="pull" Apr 24 16:48:52.574497 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574113 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62f656e5-ebb1-49bb-8103-f8b8cf7b4d48" 
containerName="pull" Apr 24 16:48:52.574497 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574117 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f656e5-ebb1-49bb-8103-f8b8cf7b4d48" containerName="pull" Apr 24 16:48:52.574497 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574124 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3922e82e-d63a-448a-8155-05005f956cf5" containerName="extract" Apr 24 16:48:52.574497 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574129 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3922e82e-d63a-448a-8155-05005f956cf5" containerName="extract" Apr 24 16:48:52.574497 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574135 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc3ad02b-5097-4178-b231-0501dfd0a762" containerName="pull" Apr 24 16:48:52.574497 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574139 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc3ad02b-5097-4178-b231-0501dfd0a762" containerName="pull" Apr 24 16:48:52.574497 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574147 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71edc11f-fd5d-4f18-9911-eb73de0e383b" containerName="extract" Apr 24 16:48:52.574497 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574152 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="71edc11f-fd5d-4f18-9911-eb73de0e383b" containerName="extract" Apr 24 16:48:52.574497 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574215 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc3ad02b-5097-4178-b231-0501dfd0a762" containerName="extract" Apr 24 16:48:52.574497 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574223 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="3922e82e-d63a-448a-8155-05005f956cf5" containerName="extract" Apr 24 16:48:52.574497 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:48:52.574233 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="71edc11f-fd5d-4f18-9911-eb73de0e383b" containerName="extract" Apr 24 16:48:52.574497 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.574240 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="62f656e5-ebb1-49bb-8103-f8b8cf7b4d48" containerName="extract" Apr 24 16:48:52.586179 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.586148 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-hj6z7" Apr 24 16:48:52.592189 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.592115 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 24 16:48:52.592386 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.592212 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 24 16:48:52.592386 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.592115 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-frjpv\"" Apr 24 16:48:52.592877 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.592852 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 24 16:48:52.593038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.592889 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 24 16:48:52.594850 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.594825 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-hj6z7"] Apr 24 16:48:52.749573 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.749533 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/378ae908-8a8e-4bcc-b882-05884531ffe9-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-hj6z7\" (UID: \"378ae908-8a8e-4bcc-b882-05884531ffe9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-hj6z7" Apr 24 16:48:52.749809 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.749601 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqvvn\" (UniqueName: \"kubernetes.io/projected/378ae908-8a8e-4bcc-b882-05884531ffe9-kube-api-access-fqvvn\") pod \"kuadrant-console-plugin-6c886788f8-hj6z7\" (UID: \"378ae908-8a8e-4bcc-b882-05884531ffe9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-hj6z7" Apr 24 16:48:52.749809 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.749664 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/378ae908-8a8e-4bcc-b882-05884531ffe9-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-hj6z7\" (UID: \"378ae908-8a8e-4bcc-b882-05884531ffe9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-hj6z7" Apr 24 16:48:52.851045 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.850950 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/378ae908-8a8e-4bcc-b882-05884531ffe9-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-hj6z7\" (UID: \"378ae908-8a8e-4bcc-b882-05884531ffe9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-hj6z7" Apr 24 16:48:52.851045 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.851019 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqvvn\" (UniqueName: \"kubernetes.io/projected/378ae908-8a8e-4bcc-b882-05884531ffe9-kube-api-access-fqvvn\") pod 
\"kuadrant-console-plugin-6c886788f8-hj6z7\" (UID: \"378ae908-8a8e-4bcc-b882-05884531ffe9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-hj6z7" Apr 24 16:48:52.851045 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.851039 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/378ae908-8a8e-4bcc-b882-05884531ffe9-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-hj6z7\" (UID: \"378ae908-8a8e-4bcc-b882-05884531ffe9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-hj6z7" Apr 24 16:48:52.851577 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.851552 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/378ae908-8a8e-4bcc-b882-05884531ffe9-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-hj6z7\" (UID: \"378ae908-8a8e-4bcc-b882-05884531ffe9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-hj6z7" Apr 24 16:48:52.853593 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.853569 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/378ae908-8a8e-4bcc-b882-05884531ffe9-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-hj6z7\" (UID: \"378ae908-8a8e-4bcc-b882-05884531ffe9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-hj6z7" Apr 24 16:48:52.869164 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.869132 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqvvn\" (UniqueName: \"kubernetes.io/projected/378ae908-8a8e-4bcc-b882-05884531ffe9-kube-api-access-fqvvn\") pod \"kuadrant-console-plugin-6c886788f8-hj6z7\" (UID: \"378ae908-8a8e-4bcc-b882-05884531ffe9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-hj6z7" Apr 24 16:48:52.898232 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:52.898188 2579 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-hj6z7" Apr 24 16:48:53.046228 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:53.046198 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-hj6z7"] Apr 24 16:48:53.048325 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:48:53.048289 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod378ae908_8a8e_4bcc_b882_05884531ffe9.slice/crio-1bef7aad3435135e7e9866bae77db36ab61ea0d54067706ee2d13ec68c72392e WatchSource:0}: Error finding container 1bef7aad3435135e7e9866bae77db36ab61ea0d54067706ee2d13ec68c72392e: Status 404 returned error can't find the container with id 1bef7aad3435135e7e9866bae77db36ab61ea0d54067706ee2d13ec68c72392e Apr 24 16:48:53.707366 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:53.707328 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-hj6z7" event={"ID":"378ae908-8a8e-4bcc-b882-05884531ffe9","Type":"ContainerStarted","Data":"1bef7aad3435135e7e9866bae77db36ab61ea0d54067706ee2d13ec68c72392e"} Apr 24 16:48:58.729818 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:58.729775 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-hj6z7" event={"ID":"378ae908-8a8e-4bcc-b882-05884531ffe9","Type":"ContainerStarted","Data":"adc86c327f6313e41f4d5ede347642f51704a0a7c8097086dfa6552fd6cd0594"} Apr 24 16:48:58.746951 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:48:58.746893 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-hj6z7" podStartSLOduration=2.08181812 podStartE2EDuration="6.746875349s" podCreationTimestamp="2026-04-24 16:48:52 +0000 UTC" firstStartedPulling="2026-04-24 16:48:53.04987504 +0000 UTC 
m=+568.982944220" lastFinishedPulling="2026-04-24 16:48:57.714932272 +0000 UTC m=+573.648001449" observedRunningTime="2026-04-24 16:48:58.744701897 +0000 UTC m=+574.677771097" watchObservedRunningTime="2026-04-24 16:48:58.746875349 +0000 UTC m=+574.679944548" Apr 24 16:49:24.566808 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:24.566779 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log" Apr 24 16:49:24.567467 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:24.567435 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log" Apr 24 16:49:35.314533 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.314453 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-q4wx6"] Apr 24 16:49:35.317887 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.317871 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-q4wx6" Apr 24 16:49:35.319967 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.319947 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 24 16:49:35.326134 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.326110 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-q4wx6"] Apr 24 16:49:35.358202 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.358168 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-q4wx6"] Apr 24 16:49:35.411304 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.411275 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlpf7\" (UniqueName: \"kubernetes.io/projected/9dcab55d-db4d-417d-85ce-f5473dbd2ac5-kube-api-access-wlpf7\") pod \"limitador-limitador-67566c68b4-q4wx6\" (UID: \"9dcab55d-db4d-417d-85ce-f5473dbd2ac5\") " pod="kuadrant-system/limitador-limitador-67566c68b4-q4wx6" Apr 24 16:49:35.411435 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.411316 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/9dcab55d-db4d-417d-85ce-f5473dbd2ac5-config-file\") pod \"limitador-limitador-67566c68b4-q4wx6\" (UID: \"9dcab55d-db4d-417d-85ce-f5473dbd2ac5\") " pod="kuadrant-system/limitador-limitador-67566c68b4-q4wx6" Apr 24 16:49:35.511723 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.511684 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlpf7\" (UniqueName: \"kubernetes.io/projected/9dcab55d-db4d-417d-85ce-f5473dbd2ac5-kube-api-access-wlpf7\") pod \"limitador-limitador-67566c68b4-q4wx6\" (UID: \"9dcab55d-db4d-417d-85ce-f5473dbd2ac5\") " 
pod="kuadrant-system/limitador-limitador-67566c68b4-q4wx6" Apr 24 16:49:35.511929 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.511782 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/9dcab55d-db4d-417d-85ce-f5473dbd2ac5-config-file\") pod \"limitador-limitador-67566c68b4-q4wx6\" (UID: \"9dcab55d-db4d-417d-85ce-f5473dbd2ac5\") " pod="kuadrant-system/limitador-limitador-67566c68b4-q4wx6" Apr 24 16:49:35.512422 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.512400 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/9dcab55d-db4d-417d-85ce-f5473dbd2ac5-config-file\") pod \"limitador-limitador-67566c68b4-q4wx6\" (UID: \"9dcab55d-db4d-417d-85ce-f5473dbd2ac5\") " pod="kuadrant-system/limitador-limitador-67566c68b4-q4wx6" Apr 24 16:49:35.519787 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.519766 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlpf7\" (UniqueName: \"kubernetes.io/projected/9dcab55d-db4d-417d-85ce-f5473dbd2ac5-kube-api-access-wlpf7\") pod \"limitador-limitador-67566c68b4-q4wx6\" (UID: \"9dcab55d-db4d-417d-85ce-f5473dbd2ac5\") " pod="kuadrant-system/limitador-limitador-67566c68b4-q4wx6" Apr 24 16:49:35.629439 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.629347 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-q4wx6" Apr 24 16:49:35.756548 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.756513 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-q4wx6"] Apr 24 16:49:35.758788 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:49:35.758759 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dcab55d_db4d_417d_85ce_f5473dbd2ac5.slice/crio-6977485eb726f1f962b7a86404a6cc880ca1b16b6b163ffc2783dd673bf7c76b WatchSource:0}: Error finding container 6977485eb726f1f962b7a86404a6cc880ca1b16b6b163ffc2783dd673bf7c76b: Status 404 returned error can't find the container with id 6977485eb726f1f962b7a86404a6cc880ca1b16b6b163ffc2783dd673bf7c76b Apr 24 16:49:35.875800 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.875763 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-q4wx6" event={"ID":"9dcab55d-db4d-417d-85ce-f5473dbd2ac5","Type":"ContainerStarted","Data":"6977485eb726f1f962b7a86404a6cc880ca1b16b6b163ffc2783dd673bf7c76b"} Apr 24 16:49:35.948665 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.948592 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tdvbf"] Apr 24 16:49:35.953155 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.953132 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-tdvbf" Apr 24 16:49:35.955368 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.955348 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-s5znh\"" Apr 24 16:49:35.958675 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:35.958651 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tdvbf"] Apr 24 16:49:36.016177 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:36.016145 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt722\" (UniqueName: \"kubernetes.io/projected/126fbda9-23a6-4122-8597-ffb6ec947dbc-kube-api-access-rt722\") pod \"authorino-79cbc94b89-tdvbf\" (UID: \"126fbda9-23a6-4122-8597-ffb6ec947dbc\") " pod="kuadrant-system/authorino-79cbc94b89-tdvbf" Apr 24 16:49:36.117088 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:36.117051 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt722\" (UniqueName: \"kubernetes.io/projected/126fbda9-23a6-4122-8597-ffb6ec947dbc-kube-api-access-rt722\") pod \"authorino-79cbc94b89-tdvbf\" (UID: \"126fbda9-23a6-4122-8597-ffb6ec947dbc\") " pod="kuadrant-system/authorino-79cbc94b89-tdvbf" Apr 24 16:49:36.124590 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:36.124564 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt722\" (UniqueName: \"kubernetes.io/projected/126fbda9-23a6-4122-8597-ffb6ec947dbc-kube-api-access-rt722\") pod \"authorino-79cbc94b89-tdvbf\" (UID: \"126fbda9-23a6-4122-8597-ffb6ec947dbc\") " pod="kuadrant-system/authorino-79cbc94b89-tdvbf" Apr 24 16:49:36.263522 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:36.263490 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-tdvbf" Apr 24 16:49:36.385788 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:36.385765 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tdvbf"] Apr 24 16:49:36.387365 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:49:36.387341 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod126fbda9_23a6_4122_8597_ffb6ec947dbc.slice/crio-ff31c94c78712de8a87d7a42f1169e103c2423064449ece2f995aca1e8347324 WatchSource:0}: Error finding container ff31c94c78712de8a87d7a42f1169e103c2423064449ece2f995aca1e8347324: Status 404 returned error can't find the container with id ff31c94c78712de8a87d7a42f1169e103c2423064449ece2f995aca1e8347324 Apr 24 16:49:36.881202 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:36.881161 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-tdvbf" event={"ID":"126fbda9-23a6-4122-8597-ffb6ec947dbc","Type":"ContainerStarted","Data":"ff31c94c78712de8a87d7a42f1169e103c2423064449ece2f995aca1e8347324"} Apr 24 16:49:37.887479 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:37.887431 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-q4wx6" event={"ID":"9dcab55d-db4d-417d-85ce-f5473dbd2ac5","Type":"ContainerStarted","Data":"02a50530e03310af17f634f72fb78b5b467b8e26e0df4757d893ea058e7a9b9e"} Apr 24 16:49:37.887923 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:37.887493 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-q4wx6" Apr 24 16:49:37.905843 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:37.905789 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-q4wx6" podStartSLOduration=1.509601172 
podStartE2EDuration="2.905769386s" podCreationTimestamp="2026-04-24 16:49:35 +0000 UTC" firstStartedPulling="2026-04-24 16:49:35.761026917 +0000 UTC m=+611.694096095" lastFinishedPulling="2026-04-24 16:49:37.157195127 +0000 UTC m=+613.090264309" observedRunningTime="2026-04-24 16:49:37.904829833 +0000 UTC m=+613.837899033" watchObservedRunningTime="2026-04-24 16:49:37.905769386 +0000 UTC m=+613.838838587" Apr 24 16:49:39.897398 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:39.897363 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-tdvbf" event={"ID":"126fbda9-23a6-4122-8597-ffb6ec947dbc","Type":"ContainerStarted","Data":"208dacf856e4c03d053e5ac01f620cceaff4fcd8007a4be371e580f6b3bbc4ec"} Apr 24 16:49:39.912769 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:39.912708 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-tdvbf" podStartSLOduration=2.386829385 podStartE2EDuration="4.912694218s" podCreationTimestamp="2026-04-24 16:49:35 +0000 UTC" firstStartedPulling="2026-04-24 16:49:36.388665812 +0000 UTC m=+612.321734988" lastFinishedPulling="2026-04-24 16:49:38.914530642 +0000 UTC m=+614.847599821" observedRunningTime="2026-04-24 16:49:39.911387867 +0000 UTC m=+615.844457071" watchObservedRunningTime="2026-04-24 16:49:39.912694218 +0000 UTC m=+615.845763416" Apr 24 16:49:48.893538 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:48.893508 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-q4wx6" Apr 24 16:49:59.870627 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:59.870589 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tdvbf"] Apr 24 16:49:59.871907 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:49:59.871872 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-tdvbf" 
podUID="126fbda9-23a6-4122-8597-ffb6ec947dbc" containerName="authorino" containerID="cri-o://208dacf856e4c03d053e5ac01f620cceaff4fcd8007a4be371e580f6b3bbc4ec" gracePeriod=30 Apr 24 16:50:00.111985 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:50:00.111963 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-tdvbf" Apr 24 16:50:00.229338 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:50:00.229254 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt722\" (UniqueName: \"kubernetes.io/projected/126fbda9-23a6-4122-8597-ffb6ec947dbc-kube-api-access-rt722\") pod \"126fbda9-23a6-4122-8597-ffb6ec947dbc\" (UID: \"126fbda9-23a6-4122-8597-ffb6ec947dbc\") " Apr 24 16:50:00.231522 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:50:00.231498 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/126fbda9-23a6-4122-8597-ffb6ec947dbc-kube-api-access-rt722" (OuterVolumeSpecName: "kube-api-access-rt722") pod "126fbda9-23a6-4122-8597-ffb6ec947dbc" (UID: "126fbda9-23a6-4122-8597-ffb6ec947dbc"). InnerVolumeSpecName "kube-api-access-rt722". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:50:00.330136 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:50:00.330096 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rt722\" (UniqueName: \"kubernetes.io/projected/126fbda9-23a6-4122-8597-ffb6ec947dbc-kube-api-access-rt722\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:50:00.983679 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:50:00.983583 2579 generic.go:358] "Generic (PLEG): container finished" podID="126fbda9-23a6-4122-8597-ffb6ec947dbc" containerID="208dacf856e4c03d053e5ac01f620cceaff4fcd8007a4be371e580f6b3bbc4ec" exitCode=0 Apr 24 16:50:00.983679 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:50:00.983640 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-tdvbf" Apr 24 16:50:00.984196 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:50:00.983673 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-tdvbf" event={"ID":"126fbda9-23a6-4122-8597-ffb6ec947dbc","Type":"ContainerDied","Data":"208dacf856e4c03d053e5ac01f620cceaff4fcd8007a4be371e580f6b3bbc4ec"} Apr 24 16:50:00.984196 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:50:00.983710 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-tdvbf" event={"ID":"126fbda9-23a6-4122-8597-ffb6ec947dbc","Type":"ContainerDied","Data":"ff31c94c78712de8a87d7a42f1169e103c2423064449ece2f995aca1e8347324"} Apr 24 16:50:00.984196 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:50:00.983725 2579 scope.go:117] "RemoveContainer" containerID="208dacf856e4c03d053e5ac01f620cceaff4fcd8007a4be371e580f6b3bbc4ec" Apr 24 16:50:00.992483 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:50:00.992463 2579 scope.go:117] "RemoveContainer" containerID="208dacf856e4c03d053e5ac01f620cceaff4fcd8007a4be371e580f6b3bbc4ec" Apr 24 16:50:00.992757 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:50:00.992710 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"208dacf856e4c03d053e5ac01f620cceaff4fcd8007a4be371e580f6b3bbc4ec\": container with ID starting with 208dacf856e4c03d053e5ac01f620cceaff4fcd8007a4be371e580f6b3bbc4ec not found: ID does not exist" containerID="208dacf856e4c03d053e5ac01f620cceaff4fcd8007a4be371e580f6b3bbc4ec" Apr 24 16:50:00.992825 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:50:00.992758 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"208dacf856e4c03d053e5ac01f620cceaff4fcd8007a4be371e580f6b3bbc4ec"} err="failed to get container status \"208dacf856e4c03d053e5ac01f620cceaff4fcd8007a4be371e580f6b3bbc4ec\": rpc error: code = 
NotFound desc = could not find container \"208dacf856e4c03d053e5ac01f620cceaff4fcd8007a4be371e580f6b3bbc4ec\": container with ID starting with 208dacf856e4c03d053e5ac01f620cceaff4fcd8007a4be371e580f6b3bbc4ec not found: ID does not exist" Apr 24 16:50:01.002055 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:50:01.002025 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tdvbf"] Apr 24 16:50:01.005902 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:50:01.005880 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-tdvbf"] Apr 24 16:50:02.620267 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:50:02.620234 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="126fbda9-23a6-4122-8597-ffb6ec947dbc" path="/var/lib/kubelet/pods/126fbda9-23a6-4122-8597-ffb6ec947dbc/volumes" Apr 24 16:51:46.909442 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:46.909403 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-r9wbd"] Apr 24 16:51:46.909936 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:46.909779 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="126fbda9-23a6-4122-8597-ffb6ec947dbc" containerName="authorino" Apr 24 16:51:46.909936 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:46.909793 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="126fbda9-23a6-4122-8597-ffb6ec947dbc" containerName="authorino" Apr 24 16:51:46.909936 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:46.909877 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="126fbda9-23a6-4122-8597-ffb6ec947dbc" containerName="authorino" Apr 24 16:51:46.912712 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:46.912696 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-r9wbd" Apr 24 16:51:46.915087 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:46.915063 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 16:51:46.915194 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:46.915112 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 16:51:46.915590 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:46.915574 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 16:51:46.915670 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:46.915594 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-7lswg\"" Apr 24 16:51:46.920033 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:46.920012 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-r9wbd"] Apr 24 16:51:46.958125 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:46.958091 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q487h\" (UniqueName: \"kubernetes.io/projected/d69a0a16-2739-489b-8fe1-10faa61cb21e-kube-api-access-q487h\") pod \"s3-init-r9wbd\" (UID: \"d69a0a16-2739-489b-8fe1-10faa61cb21e\") " pod="kserve/s3-init-r9wbd" Apr 24 16:51:47.058850 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:47.058812 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q487h\" (UniqueName: \"kubernetes.io/projected/d69a0a16-2739-489b-8fe1-10faa61cb21e-kube-api-access-q487h\") pod \"s3-init-r9wbd\" (UID: \"d69a0a16-2739-489b-8fe1-10faa61cb21e\") " pod="kserve/s3-init-r9wbd" Apr 24 16:51:47.068246 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:47.068218 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q487h\" 
(UniqueName: \"kubernetes.io/projected/d69a0a16-2739-489b-8fe1-10faa61cb21e-kube-api-access-q487h\") pod \"s3-init-r9wbd\" (UID: \"d69a0a16-2739-489b-8fe1-10faa61cb21e\") " pod="kserve/s3-init-r9wbd" Apr 24 16:51:47.223572 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:47.223486 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-r9wbd" Apr 24 16:51:47.363546 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:47.363514 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-r9wbd"] Apr 24 16:51:47.364247 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:51:47.364221 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd69a0a16_2739_489b_8fe1_10faa61cb21e.slice/crio-926b1677b1d7c5a484e4b955ae2124490a7a820dfdd3f68ecd482ae8b386a30a WatchSource:0}: Error finding container 926b1677b1d7c5a484e4b955ae2124490a7a820dfdd3f68ecd482ae8b386a30a: Status 404 returned error can't find the container with id 926b1677b1d7c5a484e4b955ae2124490a7a820dfdd3f68ecd482ae8b386a30a Apr 24 16:51:47.365876 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:47.365859 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:51:47.397003 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:47.396967 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-r9wbd" event={"ID":"d69a0a16-2739-489b-8fe1-10faa61cb21e","Type":"ContainerStarted","Data":"926b1677b1d7c5a484e4b955ae2124490a7a820dfdd3f68ecd482ae8b386a30a"} Apr 24 16:51:52.422657 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:52.422616 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-r9wbd" event={"ID":"d69a0a16-2739-489b-8fe1-10faa61cb21e","Type":"ContainerStarted","Data":"8ea6dc8d16d14ed2903aa0333b7bc21a00dbabaca26baed75c9961f0c5cf761f"} Apr 24 16:51:52.438133 ip-10-0-129-227 kubenswrapper[2579]: I0424 
16:51:52.438073 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-r9wbd" podStartSLOduration=1.992626194 podStartE2EDuration="6.438054899s" podCreationTimestamp="2026-04-24 16:51:46 +0000 UTC" firstStartedPulling="2026-04-24 16:51:47.365996356 +0000 UTC m=+743.299065536" lastFinishedPulling="2026-04-24 16:51:51.811425048 +0000 UTC m=+747.744494241" observedRunningTime="2026-04-24 16:51:52.437718607 +0000 UTC m=+748.370787808" watchObservedRunningTime="2026-04-24 16:51:52.438054899 +0000 UTC m=+748.371124099" Apr 24 16:51:55.436158 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:55.436123 2579 generic.go:358] "Generic (PLEG): container finished" podID="d69a0a16-2739-489b-8fe1-10faa61cb21e" containerID="8ea6dc8d16d14ed2903aa0333b7bc21a00dbabaca26baed75c9961f0c5cf761f" exitCode=0 Apr 24 16:51:55.436553 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:55.436199 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-r9wbd" event={"ID":"d69a0a16-2739-489b-8fe1-10faa61cb21e","Type":"ContainerDied","Data":"8ea6dc8d16d14ed2903aa0333b7bc21a00dbabaca26baed75c9961f0c5cf761f"} Apr 24 16:51:56.571797 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:56.571774 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-r9wbd" Apr 24 16:51:56.648001 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:56.647973 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q487h\" (UniqueName: \"kubernetes.io/projected/d69a0a16-2739-489b-8fe1-10faa61cb21e-kube-api-access-q487h\") pod \"d69a0a16-2739-489b-8fe1-10faa61cb21e\" (UID: \"d69a0a16-2739-489b-8fe1-10faa61cb21e\") " Apr 24 16:51:56.650310 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:56.650281 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d69a0a16-2739-489b-8fe1-10faa61cb21e-kube-api-access-q487h" (OuterVolumeSpecName: "kube-api-access-q487h") pod "d69a0a16-2739-489b-8fe1-10faa61cb21e" (UID: "d69a0a16-2739-489b-8fe1-10faa61cb21e"). InnerVolumeSpecName "kube-api-access-q487h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:51:56.748699 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:56.748667 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q487h\" (UniqueName: \"kubernetes.io/projected/d69a0a16-2739-489b-8fe1-10faa61cb21e-kube-api-access-q487h\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:51:57.444064 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:57.444028 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-r9wbd" Apr 24 16:51:57.444246 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:57.444071 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-r9wbd" event={"ID":"d69a0a16-2739-489b-8fe1-10faa61cb21e","Type":"ContainerDied","Data":"926b1677b1d7c5a484e4b955ae2124490a7a820dfdd3f68ecd482ae8b386a30a"} Apr 24 16:51:57.444246 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:51:57.444106 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="926b1677b1d7c5a484e4b955ae2124490a7a820dfdd3f68ecd482ae8b386a30a" Apr 24 16:52:12.527358 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.527323 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4"] Apr 24 16:52:12.527793 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.527779 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d69a0a16-2739-489b-8fe1-10faa61cb21e" containerName="s3-init" Apr 24 16:52:12.527847 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.527795 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69a0a16-2739-489b-8fe1-10faa61cb21e" containerName="s3-init" Apr 24 16:52:12.527885 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.527866 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d69a0a16-2739-489b-8fe1-10faa61cb21e" containerName="s3-init" Apr 24 16:52:12.531138 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.531104 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.535717 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.535679 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q89jh\"" Apr 24 16:52:12.535887 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.535793 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 16:52:12.535887 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.535824 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 16:52:12.535999 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.535920 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 24 16:52:12.545684 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.545661 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4"] Apr 24 16:52:12.693624 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.693586 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-model-cache\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.693819 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.693650 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db8jq\" (UniqueName: \"kubernetes.io/projected/ef7d2866-8782-4449-8e2e-498d9925750f-kube-api-access-db8jq\") pod 
\"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.693819 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.693700 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-home\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.693819 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.693717 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.693819 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.693765 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-tmp-dir\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.693819 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.693784 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7d2866-8782-4449-8e2e-498d9925750f-tls-certs\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: 
\"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.693819 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.693815 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-dshm\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.795029 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.794940 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-db8jq\" (UniqueName: \"kubernetes.io/projected/ef7d2866-8782-4449-8e2e-498d9925750f-kube-api-access-db8jq\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.795029 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.795004 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-home\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.795248 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.795035 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" 
Apr 24 16:52:12.795248 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.795067 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-tmp-dir\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.795248 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.795089 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7d2866-8782-4449-8e2e-498d9925750f-tls-certs\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.795248 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.795126 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-dshm\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.795248 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.795167 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-model-cache\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.795508 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.795483 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-home\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.795570 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.795505 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-tmp-dir\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.795570 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.795561 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.795685 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.795665 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-model-cache\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.797602 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.797570 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-dshm\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.797853 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.797836 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7d2866-8782-4449-8e2e-498d9925750f-tls-certs\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.808639 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.808616 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-db8jq\" (UniqueName: \"kubernetes.io/projected/ef7d2866-8782-4449-8e2e-498d9925750f-kube-api-access-db8jq\") pod \"scheduler-inline-config-test-kserve-6cd4c7745b-768p4\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.844328 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.844299 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:52:12.987623 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:12.987592 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4"] Apr 24 16:52:12.989415 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:52:12.989389 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef7d2866_8782_4449_8e2e_498d9925750f.slice/crio-2b5852f099ec5aa2b6991a616b49c6bed08e85654458fdc3c4af08f7ded7857e WatchSource:0}: Error finding container 2b5852f099ec5aa2b6991a616b49c6bed08e85654458fdc3c4af08f7ded7857e: Status 404 returned error can't find the container with id 2b5852f099ec5aa2b6991a616b49c6bed08e85654458fdc3c4af08f7ded7857e Apr 24 16:52:13.509239 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:13.509200 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" event={"ID":"ef7d2866-8782-4449-8e2e-498d9925750f","Type":"ContainerStarted","Data":"2b5852f099ec5aa2b6991a616b49c6bed08e85654458fdc3c4af08f7ded7857e"} Apr 24 16:52:17.528031 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:17.527992 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" event={"ID":"ef7d2866-8782-4449-8e2e-498d9925750f","Type":"ContainerStarted","Data":"87dc2d0bfee2fc5d27fbf49376b1dac6df2403ebae0a2c7483bb01f47fdad8cc"} Apr 24 16:52:48.430698 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.430607 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4"] Apr 24 16:52:48.451614 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.451582 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4"] Apr 24 16:52:48.451820 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.451768 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.454105 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.454081 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"gw-sec2774c263d49959f50d9eebc552e13bf9-kserve-self-signed-certs\"" Apr 24 16:52:48.522863 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.522826 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.522863 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.522869 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5xp2\" (UniqueName: \"kubernetes.io/projected/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-kube-api-access-g5xp2\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.523084 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.522891 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: 
\"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.523084 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.522941 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.523084 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.522988 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.523084 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.523014 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.523084 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.523060 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-dshm\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: 
\"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.624272 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.624242 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-dshm\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.624436 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.624299 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.624436 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.624411 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5xp2\" (UniqueName: \"kubernetes.io/projected/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-kube-api-access-g5xp2\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.624532 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.624448 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " 
pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.624532 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.624470 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.624532 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.624503 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.624685 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.624535 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.624685 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.624646 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " 
pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.624871 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.624850 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.624938 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.624916 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.624998 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.624925 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.626769 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.626725 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-dshm\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.627183 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:52:48.627165 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.632597 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.632571 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5xp2\" (UniqueName: \"kubernetes.io/projected/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-kube-api-access-g5xp2\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.762365 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.762323 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:52:48.897154 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:48.897130 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4"] Apr 24 16:52:48.898870 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:52:48.898846 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ed6fe0_938b_4001_9bfe_b4f0549193e2.slice/crio-40b098f3d1ee78492925aaefd81c3679d97b8b632a0a5b344a694e802d8bd0ad WatchSource:0}: Error finding container 40b098f3d1ee78492925aaefd81c3679d97b8b632a0a5b344a694e802d8bd0ad: Status 404 returned error can't find the container with id 40b098f3d1ee78492925aaefd81c3679d97b8b632a0a5b344a694e802d8bd0ad Apr 24 16:52:49.656468 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:49.656430 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" event={"ID":"e0ed6fe0-938b-4001-9bfe-b4f0549193e2","Type":"ContainerStarted","Data":"d9524ba4bb5ccbb90f94b099b768b7ae834b979acc5fd82e2db4cb8bcdbccda5"} Apr 24 16:52:49.656468 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:49.656474 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" event={"ID":"e0ed6fe0-938b-4001-9bfe-b4f0549193e2","Type":"ContainerStarted","Data":"40b098f3d1ee78492925aaefd81c3679d97b8b632a0a5b344a694e802d8bd0ad"} Apr 24 16:52:51.816042 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:51.816011 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4"] Apr 24 16:52:51.816404 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:52:51.816276 2579 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" podUID="e0ed6fe0-938b-4001-9bfe-b4f0549193e2" containerName="storage-initializer" containerID="cri-o://d9524ba4bb5ccbb90f94b099b768b7ae834b979acc5fd82e2db4cb8bcdbccda5" gracePeriod=30 Apr 24 16:53:21.996771 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:21.996721 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4_e0ed6fe0-938b-4001-9bfe-b4f0549193e2/storage-initializer/0.log" Apr 24 16:53:21.997109 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:21.996815 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:53:22.036339 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.036308 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-tls-certs\") pod \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " Apr 24 16:53:22.036559 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.036357 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-home\") pod \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " Apr 24 16:53:22.036559 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.036376 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-model-cache\") pod \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " Apr 24 16:53:22.036559 ip-10-0-129-227 kubenswrapper[2579]: 
I0424 16:53:22.036427 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-kserve-provision-location\") pod \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " Apr 24 16:53:22.036559 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.036447 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-dshm\") pod \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " Apr 24 16:53:22.036559 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.036492 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-tmp-dir\") pod \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " Apr 24 16:53:22.036559 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.036539 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5xp2\" (UniqueName: \"kubernetes.io/projected/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-kube-api-access-g5xp2\") pod \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\" (UID: \"e0ed6fe0-938b-4001-9bfe-b4f0549193e2\") " Apr 24 16:53:22.036918 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.036599 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-home" (OuterVolumeSpecName: "home") pod "e0ed6fe0-938b-4001-9bfe-b4f0549193e2" (UID: "e0ed6fe0-938b-4001-9bfe-b4f0549193e2"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:53:22.036918 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.036706 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-model-cache" (OuterVolumeSpecName: "model-cache") pod "e0ed6fe0-938b-4001-9bfe-b4f0549193e2" (UID: "e0ed6fe0-938b-4001-9bfe-b4f0549193e2"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:53:22.036918 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.036808 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "e0ed6fe0-938b-4001-9bfe-b4f0549193e2" (UID: "e0ed6fe0-938b-4001-9bfe-b4f0549193e2"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:53:22.036918 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.036896 2579 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-tmp-dir\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:53:22.036918 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.036917 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-home\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:53:22.037124 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.036932 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-model-cache\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:53:22.038903 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.038875 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-dshm" (OuterVolumeSpecName: "dshm") pod "e0ed6fe0-938b-4001-9bfe-b4f0549193e2" (UID: "e0ed6fe0-938b-4001-9bfe-b4f0549193e2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:53:22.039264 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.039231 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e0ed6fe0-938b-4001-9bfe-b4f0549193e2" (UID: "e0ed6fe0-938b-4001-9bfe-b4f0549193e2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:53:22.039357 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.039280 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-kube-api-access-g5xp2" (OuterVolumeSpecName: "kube-api-access-g5xp2") pod "e0ed6fe0-938b-4001-9bfe-b4f0549193e2" (UID: "e0ed6fe0-938b-4001-9bfe-b4f0549193e2"). InnerVolumeSpecName "kube-api-access-g5xp2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:53:22.089525 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.089444 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e0ed6fe0-938b-4001-9bfe-b4f0549193e2" (UID: "e0ed6fe0-938b-4001-9bfe-b4f0549193e2"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:53:22.138142 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.138106 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-tls-certs\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:53:22.138142 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.138138 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-kserve-provision-location\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:53:22.138301 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.138151 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-dshm\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:53:22.138301 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.138161 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g5xp2\" (UniqueName: \"kubernetes.io/projected/e0ed6fe0-938b-4001-9bfe-b4f0549193e2-kube-api-access-g5xp2\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:53:22.782389 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.782359 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4_e0ed6fe0-938b-4001-9bfe-b4f0549193e2/storage-initializer/0.log" Apr 24 16:53:22.782530 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.782401 2579 generic.go:358] "Generic (PLEG): container finished" podID="e0ed6fe0-938b-4001-9bfe-b4f0549193e2" containerID="d9524ba4bb5ccbb90f94b099b768b7ae834b979acc5fd82e2db4cb8bcdbccda5" exitCode=137 Apr 24 16:53:22.782530 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.782476 2579 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" Apr 24 16:53:22.782530 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.782495 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" event={"ID":"e0ed6fe0-938b-4001-9bfe-b4f0549193e2","Type":"ContainerDied","Data":"d9524ba4bb5ccbb90f94b099b768b7ae834b979acc5fd82e2db4cb8bcdbccda5"} Apr 24 16:53:22.782667 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.782535 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4" event={"ID":"e0ed6fe0-938b-4001-9bfe-b4f0549193e2","Type":"ContainerDied","Data":"40b098f3d1ee78492925aaefd81c3679d97b8b632a0a5b344a694e802d8bd0ad"} Apr 24 16:53:22.782667 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.782551 2579 scope.go:117] "RemoveContainer" containerID="d9524ba4bb5ccbb90f94b099b768b7ae834b979acc5fd82e2db4cb8bcdbccda5" Apr 24 16:53:22.824093 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.824063 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4"] Apr 24 16:53:22.828117 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.828076 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-695cf49fd49lnx4"] Apr 24 16:53:22.845234 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.845073 2579 scope.go:117] "RemoveContainer" containerID="d9524ba4bb5ccbb90f94b099b768b7ae834b979acc5fd82e2db4cb8bcdbccda5" Apr 24 16:53:22.845451 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:53:22.845430 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d9524ba4bb5ccbb90f94b099b768b7ae834b979acc5fd82e2db4cb8bcdbccda5\": container with ID starting with d9524ba4bb5ccbb90f94b099b768b7ae834b979acc5fd82e2db4cb8bcdbccda5 not found: ID does not exist" containerID="d9524ba4bb5ccbb90f94b099b768b7ae834b979acc5fd82e2db4cb8bcdbccda5" Apr 24 16:53:22.845496 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:22.845463 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9524ba4bb5ccbb90f94b099b768b7ae834b979acc5fd82e2db4cb8bcdbccda5"} err="failed to get container status \"d9524ba4bb5ccbb90f94b099b768b7ae834b979acc5fd82e2db4cb8bcdbccda5\": rpc error: code = NotFound desc = could not find container \"d9524ba4bb5ccbb90f94b099b768b7ae834b979acc5fd82e2db4cb8bcdbccda5\": container with ID starting with d9524ba4bb5ccbb90f94b099b768b7ae834b979acc5fd82e2db4cb8bcdbccda5 not found: ID does not exist" Apr 24 16:53:24.619937 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:53:24.619904 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ed6fe0-938b-4001-9bfe-b4f0549193e2" path="/var/lib/kubelet/pods/e0ed6fe0-938b-4001-9bfe-b4f0549193e2/volumes" Apr 24 16:54:24.601089 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:54:24.601054 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log" Apr 24 16:54:24.602109 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:54:24.602089 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log" Apr 24 16:56:03.414865 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:03.414831 2579 generic.go:358] "Generic (PLEG): container finished" podID="ef7d2866-8782-4449-8e2e-498d9925750f" containerID="87dc2d0bfee2fc5d27fbf49376b1dac6df2403ebae0a2c7483bb01f47fdad8cc" exitCode=0 Apr 24 16:56:03.415269 
ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:03.414907 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" event={"ID":"ef7d2866-8782-4449-8e2e-498d9925750f","Type":"ContainerDied","Data":"87dc2d0bfee2fc5d27fbf49376b1dac6df2403ebae0a2c7483bb01f47fdad8cc"} Apr 24 16:56:05.425909 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:05.425874 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" event={"ID":"ef7d2866-8782-4449-8e2e-498d9925750f","Type":"ContainerStarted","Data":"8c110a32764ee6f5045efece15bee73f9f6835b98f93bce0daca7b8bfe069693"} Apr 24 16:56:05.445171 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:05.445107 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" podStartSLOduration=1.713218224 podStartE2EDuration="3m53.445089661s" podCreationTimestamp="2026-04-24 16:52:12 +0000 UTC" firstStartedPulling="2026-04-24 16:52:12.991324957 +0000 UTC m=+768.924394134" lastFinishedPulling="2026-04-24 16:56:04.723196391 +0000 UTC m=+1000.656265571" observedRunningTime="2026-04-24 16:56:05.443418797 +0000 UTC m=+1001.376488000" watchObservedRunningTime="2026-04-24 16:56:05.445089661 +0000 UTC m=+1001.378158862" Apr 24 16:56:12.845038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:12.844999 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:56:12.845038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:12.845042 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:56:12.857845 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:12.857819 2579 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:56:13.466913 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:13.466882 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:56:14.679712 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:14.679678 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4"] Apr 24 16:56:15.463765 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.463702 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" podUID="ef7d2866-8782-4449-8e2e-498d9925750f" containerName="main" containerID="cri-o://8c110a32764ee6f5045efece15bee73f9f6835b98f93bce0daca7b8bfe069693" gracePeriod=30 Apr 24 16:56:15.722092 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.722025 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:56:15.803551 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.803524 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-tmp-dir\") pod \"ef7d2866-8782-4449-8e2e-498d9925750f\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " Apr 24 16:56:15.803803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.803567 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db8jq\" (UniqueName: \"kubernetes.io/projected/ef7d2866-8782-4449-8e2e-498d9925750f-kube-api-access-db8jq\") pod \"ef7d2866-8782-4449-8e2e-498d9925750f\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " Apr 24 16:56:15.803803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.803596 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-dshm\") pod \"ef7d2866-8782-4449-8e2e-498d9925750f\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " Apr 24 16:56:15.803803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.803620 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-model-cache\") pod \"ef7d2866-8782-4449-8e2e-498d9925750f\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " Apr 24 16:56:15.803803 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.803765 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-kserve-provision-location\") pod \"ef7d2866-8782-4449-8e2e-498d9925750f\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " Apr 24 16:56:15.804049 
ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.803846 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-home\") pod \"ef7d2866-8782-4449-8e2e-498d9925750f\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " Apr 24 16:56:15.804049 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.803900 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "ef7d2866-8782-4449-8e2e-498d9925750f" (UID: "ef7d2866-8782-4449-8e2e-498d9925750f"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:15.804049 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.803912 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-model-cache" (OuterVolumeSpecName: "model-cache") pod "ef7d2866-8782-4449-8e2e-498d9925750f" (UID: "ef7d2866-8782-4449-8e2e-498d9925750f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:15.804049 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.803921 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7d2866-8782-4449-8e2e-498d9925750f-tls-certs\") pod \"ef7d2866-8782-4449-8e2e-498d9925750f\" (UID: \"ef7d2866-8782-4449-8e2e-498d9925750f\") " Apr 24 16:56:15.804254 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.804103 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-home" (OuterVolumeSpecName: "home") pod "ef7d2866-8782-4449-8e2e-498d9925750f" (UID: "ef7d2866-8782-4449-8e2e-498d9925750f"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:15.804254 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.804224 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-home\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:56:15.804254 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.804240 2579 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-tmp-dir\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:56:15.804254 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.804255 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-model-cache\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:56:15.806000 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.805974 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-dshm" (OuterVolumeSpecName: "dshm") pod "ef7d2866-8782-4449-8e2e-498d9925750f" (UID: "ef7d2866-8782-4449-8e2e-498d9925750f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:15.806124 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.806055 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7d2866-8782-4449-8e2e-498d9925750f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ef7d2866-8782-4449-8e2e-498d9925750f" (UID: "ef7d2866-8782-4449-8e2e-498d9925750f"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:56:15.806303 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.806278 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef7d2866-8782-4449-8e2e-498d9925750f-kube-api-access-db8jq" (OuterVolumeSpecName: "kube-api-access-db8jq") pod "ef7d2866-8782-4449-8e2e-498d9925750f" (UID: "ef7d2866-8782-4449-8e2e-498d9925750f"). InnerVolumeSpecName "kube-api-access-db8jq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:56:15.860140 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.860098 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ef7d2866-8782-4449-8e2e-498d9925750f" (UID: "ef7d2866-8782-4449-8e2e-498d9925750f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:15.905603 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.905572 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7d2866-8782-4449-8e2e-498d9925750f-tls-certs\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:56:15.905603 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.905600 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-db8jq\" (UniqueName: \"kubernetes.io/projected/ef7d2866-8782-4449-8e2e-498d9925750f-kube-api-access-db8jq\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:56:15.905603 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:15.905610 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-dshm\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:56:15.905851 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:56:15.905619 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef7d2866-8782-4449-8e2e-498d9925750f-kserve-provision-location\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:56:16.468903 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:16.468865 2579 generic.go:358] "Generic (PLEG): container finished" podID="ef7d2866-8782-4449-8e2e-498d9925750f" containerID="8c110a32764ee6f5045efece15bee73f9f6835b98f93bce0daca7b8bfe069693" exitCode=0 Apr 24 16:56:16.469194 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:16.468995 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" Apr 24 16:56:16.469194 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:16.469013 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" event={"ID":"ef7d2866-8782-4449-8e2e-498d9925750f","Type":"ContainerDied","Data":"8c110a32764ee6f5045efece15bee73f9f6835b98f93bce0daca7b8bfe069693"} Apr 24 16:56:16.469194 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:16.469060 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4" event={"ID":"ef7d2866-8782-4449-8e2e-498d9925750f","Type":"ContainerDied","Data":"2b5852f099ec5aa2b6991a616b49c6bed08e85654458fdc3c4af08f7ded7857e"} Apr 24 16:56:16.469194 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:16.469082 2579 scope.go:117] "RemoveContainer" containerID="8c110a32764ee6f5045efece15bee73f9f6835b98f93bce0daca7b8bfe069693" Apr 24 16:56:16.479849 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:16.479829 2579 scope.go:117] "RemoveContainer" containerID="87dc2d0bfee2fc5d27fbf49376b1dac6df2403ebae0a2c7483bb01f47fdad8cc" Apr 24 16:56:16.492475 ip-10-0-129-227 kubenswrapper[2579]: 
I0424 16:56:16.492443 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4"] Apr 24 16:56:16.495496 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:16.495476 2579 scope.go:117] "RemoveContainer" containerID="8c110a32764ee6f5045efece15bee73f9f6835b98f93bce0daca7b8bfe069693" Apr 24 16:56:16.495650 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:16.495629 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6cd4c7745b-768p4"] Apr 24 16:56:16.495829 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:56:16.495806 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c110a32764ee6f5045efece15bee73f9f6835b98f93bce0daca7b8bfe069693\": container with ID starting with 8c110a32764ee6f5045efece15bee73f9f6835b98f93bce0daca7b8bfe069693 not found: ID does not exist" containerID="8c110a32764ee6f5045efece15bee73f9f6835b98f93bce0daca7b8bfe069693" Apr 24 16:56:16.495889 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:16.495832 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c110a32764ee6f5045efece15bee73f9f6835b98f93bce0daca7b8bfe069693"} err="failed to get container status \"8c110a32764ee6f5045efece15bee73f9f6835b98f93bce0daca7b8bfe069693\": rpc error: code = NotFound desc = could not find container \"8c110a32764ee6f5045efece15bee73f9f6835b98f93bce0daca7b8bfe069693\": container with ID starting with 8c110a32764ee6f5045efece15bee73f9f6835b98f93bce0daca7b8bfe069693 not found: ID does not exist" Apr 24 16:56:16.495889 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:16.495851 2579 scope.go:117] "RemoveContainer" containerID="87dc2d0bfee2fc5d27fbf49376b1dac6df2403ebae0a2c7483bb01f47fdad8cc" Apr 24 16:56:16.496102 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:56:16.496080 2579 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"87dc2d0bfee2fc5d27fbf49376b1dac6df2403ebae0a2c7483bb01f47fdad8cc\": container with ID starting with 87dc2d0bfee2fc5d27fbf49376b1dac6df2403ebae0a2c7483bb01f47fdad8cc not found: ID does not exist" containerID="87dc2d0bfee2fc5d27fbf49376b1dac6df2403ebae0a2c7483bb01f47fdad8cc" Apr 24 16:56:16.496193 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:16.496108 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87dc2d0bfee2fc5d27fbf49376b1dac6df2403ebae0a2c7483bb01f47fdad8cc"} err="failed to get container status \"87dc2d0bfee2fc5d27fbf49376b1dac6df2403ebae0a2c7483bb01f47fdad8cc\": rpc error: code = NotFound desc = could not find container \"87dc2d0bfee2fc5d27fbf49376b1dac6df2403ebae0a2c7483bb01f47fdad8cc\": container with ID starting with 87dc2d0bfee2fc5d27fbf49376b1dac6df2403ebae0a2c7483bb01f47fdad8cc not found: ID does not exist" Apr 24 16:56:16.620486 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:16.620450 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef7d2866-8782-4449-8e2e-498d9925750f" path="/var/lib/kubelet/pods/ef7d2866-8782-4449-8e2e-498d9925750f/volumes" Apr 24 16:56:24.510218 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.510186 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql"] Apr 24 16:56:24.510686 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.510607 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0ed6fe0-938b-4001-9bfe-b4f0549193e2" containerName="storage-initializer" Apr 24 16:56:24.510686 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.510620 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ed6fe0-938b-4001-9bfe-b4f0549193e2" containerName="storage-initializer" Apr 24 16:56:24.510686 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.510634 2579 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef7d2866-8782-4449-8e2e-498d9925750f" containerName="main" Apr 24 16:56:24.510686 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.510639 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7d2866-8782-4449-8e2e-498d9925750f" containerName="main" Apr 24 16:56:24.510686 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.510647 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef7d2866-8782-4449-8e2e-498d9925750f" containerName="storage-initializer" Apr 24 16:56:24.510686 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.510652 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7d2866-8782-4449-8e2e-498d9925750f" containerName="storage-initializer" Apr 24 16:56:24.510944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.510714 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0ed6fe0-938b-4001-9bfe-b4f0549193e2" containerName="storage-initializer" Apr 24 16:56:24.510944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.510723 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef7d2866-8782-4449-8e2e-498d9925750f" containerName="main" Apr 24 16:56:24.515884 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.515862 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.519545 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.519521 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 16:56:24.519685 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.519666 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q89jh\"" Apr 24 16:56:24.520142 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.520123 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 16:56:24.522500 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.522481 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 24 16:56:24.547280 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.547256 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql"] Apr 24 16:56:24.582132 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.582106 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-home\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.582285 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.582144 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrnpx\" (UniqueName: \"kubernetes.io/projected/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-kube-api-access-lrnpx\") pod 
\"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.582285 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.582165 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.582378 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.582294 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-model-cache\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.582378 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.582349 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-dshm\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.582457 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.582397 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.582457 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.582414 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.683123 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.683095 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrnpx\" (UniqueName: \"kubernetes.io/projected/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-kube-api-access-lrnpx\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.683258 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.683128 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.683258 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.683161 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-model-cache\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.683258 
ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.683212 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-dshm\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.683390 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.683286 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.683390 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.683311 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.683390 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.683359 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-home\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.683709 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.683682 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-model-cache\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.683831 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.683725 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.683831 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.683689 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.683831 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.683815 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-home\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.685606 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.685587 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-dshm\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.685914 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.685895 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.693696 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.693674 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrnpx\" (UniqueName: \"kubernetes.io/projected/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-kube-api-access-lrnpx\") pod \"scheduler-configmap-ref-test-kserve-695ff4dfb-46xql\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.826761 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.826657 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:24.973895 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:24.973875 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql"] Apr 24 16:56:24.975843 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:56:24.975800 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02ebe4aa_f2f7_4ffe_8dad_3cd54ccb526d.slice/crio-1b64146a21c721e412a7c1d7fb8bf848accfbfca91839c997746e1cc30bece26 WatchSource:0}: Error finding container 1b64146a21c721e412a7c1d7fb8bf848accfbfca91839c997746e1cc30bece26: Status 404 returned error can't find the container with id 1b64146a21c721e412a7c1d7fb8bf848accfbfca91839c997746e1cc30bece26 Apr 24 16:56:25.507116 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:25.507078 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" event={"ID":"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d","Type":"ContainerStarted","Data":"425e91b54ae8ea71f8f7125b5252bb7db564212a9d101cafbf46cea4d47ca1cc"} Apr 24 16:56:25.507116 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:25.507120 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" event={"ID":"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d","Type":"ContainerStarted","Data":"1b64146a21c721e412a7c1d7fb8bf848accfbfca91839c997746e1cc30bece26"} Apr 24 16:56:29.525270 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:29.525238 2579 generic.go:358] "Generic (PLEG): container finished" podID="02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d" containerID="425e91b54ae8ea71f8f7125b5252bb7db564212a9d101cafbf46cea4d47ca1cc" exitCode=0 Apr 24 16:56:29.525673 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:29.525323 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" event={"ID":"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d","Type":"ContainerDied","Data":"425e91b54ae8ea71f8f7125b5252bb7db564212a9d101cafbf46cea4d47ca1cc"} Apr 24 16:56:30.531823 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:30.531787 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" event={"ID":"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d","Type":"ContainerStarted","Data":"1f31a608cfaed22e3621ed819f7d4d7c9734dd58e64ad52d7ba4b689609e6675"} Apr 24 16:56:30.554750 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:30.554694 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" podStartSLOduration=6.554679874 podStartE2EDuration="6.554679874s" podCreationTimestamp="2026-04-24 16:56:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:56:30.551466891 +0000 UTC m=+1026.484536102" watchObservedRunningTime="2026-04-24 16:56:30.554679874 +0000 UTC m=+1026.487749073" Apr 24 16:56:34.827185 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:34.827153 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:34.827185 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:34.827190 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:34.839632 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:34.839609 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:35.566176 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:56:35.566147 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:58.246418 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.246382 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql"] Apr 24 16:56:58.247003 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.246766 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" podUID="02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d" containerName="main" containerID="cri-o://1f31a608cfaed22e3621ed819f7d4d7c9734dd58e64ad52d7ba4b689609e6675" gracePeriod=30 Apr 24 16:56:58.500144 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.500122 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:58.585796 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.585760 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-model-cache\") pod \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " Apr 24 16:56:58.585971 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.585814 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrnpx\" (UniqueName: \"kubernetes.io/projected/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-kube-api-access-lrnpx\") pod \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " Apr 24 16:56:58.585971 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.585838 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-kserve-provision-location\") pod \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " Apr 24 16:56:58.585971 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.585896 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-dshm\") pod \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " Apr 24 16:56:58.585971 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.585933 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-home\") pod \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " Apr 24 16:56:58.586194 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.585977 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-tls-certs\") pod \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " Apr 24 16:56:58.586194 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.586011 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-tmp-dir\") pod \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\" (UID: \"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d\") " Apr 24 16:56:58.586194 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.586087 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-model-cache" (OuterVolumeSpecName: "model-cache") pod "02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d" (UID: 
"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:58.586350 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.586267 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-model-cache\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:56:58.586350 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.586293 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d" (UID: "02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:58.586482 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.586448 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-home" (OuterVolumeSpecName: "home") pod "02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d" (UID: "02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:58.588286 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.588256 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-dshm" (OuterVolumeSpecName: "dshm") pod "02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d" (UID: "02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:58.588776 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.588755 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d" (UID: "02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:56:58.588853 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.588808 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-kube-api-access-lrnpx" (OuterVolumeSpecName: "kube-api-access-lrnpx") pod "02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d" (UID: "02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d"). InnerVolumeSpecName "kube-api-access-lrnpx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:56:58.640947 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.640906 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d" (UID: "02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:58.648191 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.648161 2579 generic.go:358] "Generic (PLEG): container finished" podID="02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d" containerID="1f31a608cfaed22e3621ed819f7d4d7c9734dd58e64ad52d7ba4b689609e6675" exitCode=0 Apr 24 16:56:58.648339 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.648232 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" Apr 24 16:56:58.648339 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.648235 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" event={"ID":"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d","Type":"ContainerDied","Data":"1f31a608cfaed22e3621ed819f7d4d7c9734dd58e64ad52d7ba4b689609e6675"} Apr 24 16:56:58.648339 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.648279 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql" event={"ID":"02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d","Type":"ContainerDied","Data":"1b64146a21c721e412a7c1d7fb8bf848accfbfca91839c997746e1cc30bece26"} Apr 24 16:56:58.648339 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.648296 2579 scope.go:117] "RemoveContainer" containerID="1f31a608cfaed22e3621ed819f7d4d7c9734dd58e64ad52d7ba4b689609e6675" Apr 24 16:56:58.657414 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.657397 2579 scope.go:117] "RemoveContainer" containerID="425e91b54ae8ea71f8f7125b5252bb7db564212a9d101cafbf46cea4d47ca1cc" Apr 24 16:56:58.673082 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.673059 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql"] Apr 24 16:56:58.673358 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.673341 2579 scope.go:117] "RemoveContainer" containerID="1f31a608cfaed22e3621ed819f7d4d7c9734dd58e64ad52d7ba4b689609e6675" Apr 24 16:56:58.673638 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:56:58.673618 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f31a608cfaed22e3621ed819f7d4d7c9734dd58e64ad52d7ba4b689609e6675\": container with ID starting with 
1f31a608cfaed22e3621ed819f7d4d7c9734dd58e64ad52d7ba4b689609e6675 not found: ID does not exist" containerID="1f31a608cfaed22e3621ed819f7d4d7c9734dd58e64ad52d7ba4b689609e6675" Apr 24 16:56:58.673689 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.673646 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f31a608cfaed22e3621ed819f7d4d7c9734dd58e64ad52d7ba4b689609e6675"} err="failed to get container status \"1f31a608cfaed22e3621ed819f7d4d7c9734dd58e64ad52d7ba4b689609e6675\": rpc error: code = NotFound desc = could not find container \"1f31a608cfaed22e3621ed819f7d4d7c9734dd58e64ad52d7ba4b689609e6675\": container with ID starting with 1f31a608cfaed22e3621ed819f7d4d7c9734dd58e64ad52d7ba4b689609e6675 not found: ID does not exist" Apr 24 16:56:58.673689 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.673664 2579 scope.go:117] "RemoveContainer" containerID="425e91b54ae8ea71f8f7125b5252bb7db564212a9d101cafbf46cea4d47ca1cc" Apr 24 16:56:58.673910 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:56:58.673894 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"425e91b54ae8ea71f8f7125b5252bb7db564212a9d101cafbf46cea4d47ca1cc\": container with ID starting with 425e91b54ae8ea71f8f7125b5252bb7db564212a9d101cafbf46cea4d47ca1cc not found: ID does not exist" containerID="425e91b54ae8ea71f8f7125b5252bb7db564212a9d101cafbf46cea4d47ca1cc" Apr 24 16:56:58.673950 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.673915 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425e91b54ae8ea71f8f7125b5252bb7db564212a9d101cafbf46cea4d47ca1cc"} err="failed to get container status \"425e91b54ae8ea71f8f7125b5252bb7db564212a9d101cafbf46cea4d47ca1cc\": rpc error: code = NotFound desc = could not find container \"425e91b54ae8ea71f8f7125b5252bb7db564212a9d101cafbf46cea4d47ca1cc\": container with ID starting with 
425e91b54ae8ea71f8f7125b5252bb7db564212a9d101cafbf46cea4d47ca1cc not found: ID does not exist" Apr 24 16:56:58.675711 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.675691 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-695ff4dfb-46xql"] Apr 24 16:56:58.687588 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.687566 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lrnpx\" (UniqueName: \"kubernetes.io/projected/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-kube-api-access-lrnpx\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:56:58.687664 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.687594 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-kserve-provision-location\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:56:58.687664 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.687611 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-dshm\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:56:58.687664 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.687626 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-home\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:56:58.687664 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.687639 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-tls-certs\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:56:58.687664 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:56:58.687653 2579 reconciler_common.go:299] "Volume detached for volume 
\"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d-tmp-dir\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:57:00.621004 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:00.620968 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d" path="/var/lib/kubelet/pods/02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d/volumes" Apr 24 16:57:09.928412 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.928323 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k"] Apr 24 16:57:09.928900 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.928709 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d" containerName="storage-initializer" Apr 24 16:57:09.928900 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.928722 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d" containerName="storage-initializer" Apr 24 16:57:09.928900 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.928749 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d" containerName="main" Apr 24 16:57:09.928900 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.928756 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d" containerName="main" Apr 24 16:57:09.928900 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.928828 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="02ebe4aa-f2f7-4ffe-8dad-3cd54ccb526d" containerName="main" Apr 24 16:57:09.931912 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.931898 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:09.934228 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.934205 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 16:57:09.934343 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.934300 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q89jh\"" Apr 24 16:57:09.934944 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.934928 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 24 16:57:09.940523 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.940504 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 16:57:09.942359 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.942321 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k"] Apr 24 16:57:09.990462 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.990432 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkh5q\" (UniqueName: \"kubernetes.io/projected/86dd6637-985d-427f-a90b-51304537fd81-kube-api-access-xkh5q\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:09.990615 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.990474 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86dd6637-985d-427f-a90b-51304537fd81-tls-certs\") pod 
\"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:09.990615 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.990505 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-model-cache\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:09.990615 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.990571 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:09.990615 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.990604 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:09.990810 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.990633 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-dshm\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: 
\"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:09.990810 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:09.990687 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-home\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:10.092252 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.092220 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkh5q\" (UniqueName: \"kubernetes.io/projected/86dd6637-985d-427f-a90b-51304537fd81-kube-api-access-xkh5q\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:10.092415 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.092259 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86dd6637-985d-427f-a90b-51304537fd81-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:10.092415 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.092387 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-model-cache\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:10.092495 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:57:10.092456 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:10.092544 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.092496 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:10.092544 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.092535 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-dshm\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:10.092648 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.092578 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-home\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:10.092837 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.092808 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-model-cache\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:10.092957 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.092856 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:10.092957 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.092894 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:10.093038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.092967 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-home\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:10.094916 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.094869 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86dd6637-985d-427f-a90b-51304537fd81-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:10.095031 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.094970 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-dshm\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:10.100484 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.100456 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkh5q\" (UniqueName: \"kubernetes.io/projected/86dd6637-985d-427f-a90b-51304537fd81-kube-api-access-xkh5q\") pod \"scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:10.175792 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.175761 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc"] Apr 24 16:57:10.179875 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.179821 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.182458 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.182437 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-hvncj\"" Apr 24 16:57:10.199363 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.199333 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc"] Apr 24 16:57:10.242905 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.242878 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:10.294477 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.294435 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.294600 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.294503 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.294600 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.294558 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.294764 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.294664 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmhsg\" (UniqueName: \"kubernetes.io/projected/56429b96-efa3-4498-81ed-0f4fc96911d0-kube-api-access-gmhsg\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.294764 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.294715 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56429b96-efa3-4498-81ed-0f4fc96911d0-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.294872 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.294767 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.379249 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.379220 2579 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k"] Apr 24 16:57:10.380599 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:57:10.380572 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86dd6637_985d_427f_a90b_51304537fd81.slice/crio-ae68caa118f18412f11649be9e538c35455b333cfe45027392a3f0ec92fa82ab WatchSource:0}: Error finding container ae68caa118f18412f11649be9e538c35455b333cfe45027392a3f0ec92fa82ab: Status 404 returned error can't find the container with id ae68caa118f18412f11649be9e538c35455b333cfe45027392a3f0ec92fa82ab Apr 24 16:57:10.382538 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.382517 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:57:10.396054 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.396034 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.396133 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.396070 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.396231 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.396197 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.396281 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.396267 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmhsg\" (UniqueName: \"kubernetes.io/projected/56429b96-efa3-4498-81ed-0f4fc96911d0-kube-api-access-gmhsg\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.396327 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.396307 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56429b96-efa3-4498-81ed-0f4fc96911d0-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.396436 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.396338 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.396491 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.396435 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.396491 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.396455 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.396592 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.396498 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.396645 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.396597 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.398905 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.398885 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/56429b96-efa3-4498-81ed-0f4fc96911d0-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.404559 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.404538 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmhsg\" (UniqueName: \"kubernetes.io/projected/56429b96-efa3-4498-81ed-0f4fc96911d0-kube-api-access-gmhsg\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.490081 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.490047 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:10.629453 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.629423 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc"] Apr 24 16:57:10.631596 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:57:10.631571 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56429b96_efa3_4498_81ed_0f4fc96911d0.slice/crio-c2b2d8cfd38db6446a37e3d4a89e0c1a411f923a0f4524956fea9dfe1293a6e2 WatchSource:0}: Error finding container c2b2d8cfd38db6446a37e3d4a89e0c1a411f923a0f4524956fea9dfe1293a6e2: Status 404 returned error can't find the container with id c2b2d8cfd38db6446a37e3d4a89e0c1a411f923a0f4524956fea9dfe1293a6e2 Apr 24 16:57:10.698864 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.698829 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" event={"ID":"56429b96-efa3-4498-81ed-0f4fc96911d0","Type":"ContainerStarted","Data":"472a6d3e46a926de701640169c33860b10ef5c25ec825d3d0240d4d58e10150d"} Apr 24 16:57:10.699015 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.698876 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" event={"ID":"56429b96-efa3-4498-81ed-0f4fc96911d0","Type":"ContainerStarted","Data":"c2b2d8cfd38db6446a37e3d4a89e0c1a411f923a0f4524956fea9dfe1293a6e2"} Apr 24 16:57:10.700433 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.700401 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" event={"ID":"86dd6637-985d-427f-a90b-51304537fd81","Type":"ContainerStarted","Data":"9873648ddb1badb9f3d415552674402944375ab2ce1fac4f978a657d7f86ada5"} Apr 24 16:57:10.700538 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:10.700439 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" event={"ID":"86dd6637-985d-427f-a90b-51304537fd81","Type":"ContainerStarted","Data":"ae68caa118f18412f11649be9e538c35455b333cfe45027392a3f0ec92fa82ab"} Apr 24 16:57:11.706377 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:11.706284 2579 generic.go:358] "Generic (PLEG): container finished" podID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerID="472a6d3e46a926de701640169c33860b10ef5c25ec825d3d0240d4d58e10150d" exitCode=0 Apr 24 16:57:11.706853 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:11.706369 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" event={"ID":"56429b96-efa3-4498-81ed-0f4fc96911d0","Type":"ContainerDied","Data":"472a6d3e46a926de701640169c33860b10ef5c25ec825d3d0240d4d58e10150d"} 
Apr 24 16:57:13.717871 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:13.717832 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" event={"ID":"56429b96-efa3-4498-81ed-0f4fc96911d0","Type":"ContainerStarted","Data":"aad8f69599aa3dc450331a3e5a313cb8238184b55fd87cda9e9d32a2f6af321a"} Apr 24 16:57:42.852977 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:42.852942 2579 generic.go:358] "Generic (PLEG): container finished" podID="86dd6637-985d-427f-a90b-51304537fd81" containerID="9873648ddb1badb9f3d415552674402944375ab2ce1fac4f978a657d7f86ada5" exitCode=0 Apr 24 16:57:42.853372 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:42.852984 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" event={"ID":"86dd6637-985d-427f-a90b-51304537fd81","Type":"ContainerDied","Data":"9873648ddb1badb9f3d415552674402944375ab2ce1fac4f978a657d7f86ada5"} Apr 24 16:57:43.858609 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:43.858564 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" event={"ID":"86dd6637-985d-427f-a90b-51304537fd81","Type":"ContainerStarted","Data":"3c43592543592c816b7767cf2df6e47f3f51e1840b1e970ddaae8374b1e687c5"} Apr 24 16:57:43.860573 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:43.860546 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" event={"ID":"56429b96-efa3-4498-81ed-0f4fc96911d0","Type":"ContainerStarted","Data":"a40a3fd115d3ccfd5b689e5335ee7e4e79eb9a2029d882943c38c3ba63256f78"} Apr 24 16:57:43.860789 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:43.860771 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:43.863543 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:43.863500 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 16:57:43.882743 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:43.882673 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" podStartSLOduration=34.882658741 podStartE2EDuration="34.882658741s" podCreationTimestamp="2026-04-24 16:57:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:57:43.881277816 +0000 UTC m=+1099.814347040" watchObservedRunningTime="2026-04-24 16:57:43.882658741 +0000 UTC m=+1099.815727939" Apr 24 16:57:43.902361 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:43.902308 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" podStartSLOduration=2.607720103 podStartE2EDuration="33.902292323s" podCreationTimestamp="2026-04-24 16:57:10 +0000 UTC" firstStartedPulling="2026-04-24 16:57:11.707820873 +0000 UTC m=+1067.640890049" lastFinishedPulling="2026-04-24 16:57:43.002393092 +0000 UTC m=+1098.935462269" observedRunningTime="2026-04-24 16:57:43.899866682 +0000 UTC m=+1099.832935880" watchObservedRunningTime="2026-04-24 16:57:43.902292323 +0000 UTC m=+1099.835361522" Apr 24 16:57:44.865458 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:44.865424 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 16:57:50.243139 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:50.243105 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:50.243665 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:50.243203 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:50.255793 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:50.255768 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:50.491068 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:50.491033 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:50.491258 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:50.491080 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:50.492817 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:50.492788 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:50.492954 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:50.492812 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="main" probeResult="failure" 
output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 16:57:50.887058 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:50.887027 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:57:50.887243 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:50.887187 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 16:57:50.897605 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:50.897586 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:57:51.890901 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:57:51.890861 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 16:58:01.891121 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:01.891078 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 16:58:11.891950 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:11.891909 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="main" probeResult="failure" 
output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 16:58:21.891763 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:21.891708 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 16:58:26.396294 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.396255 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc"] Apr 24 16:58:26.396696 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.396617 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="main" containerID="cri-o://aad8f69599aa3dc450331a3e5a313cb8238184b55fd87cda9e9d32a2f6af321a" gracePeriod=30 Apr 24 16:58:26.396792 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.396685 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="tokenizer" containerID="cri-o://a40a3fd115d3ccfd5b689e5335ee7e4e79eb9a2029d882943c38c3ba63256f78" gracePeriod=30 Apr 24 16:58:26.398338 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.398303 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 16:58:26.400200 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.400175 2579 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k"] Apr 24 16:58:26.400747 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.400686 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" podUID="86dd6637-985d-427f-a90b-51304537fd81" containerName="main" containerID="cri-o://3c43592543592c816b7767cf2df6e47f3f51e1840b1e970ddaae8374b1e687c5" gracePeriod=30 Apr 24 16:58:26.637656 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.637630 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:58:26.785009 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.784975 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-model-cache\") pod \"86dd6637-985d-427f-a90b-51304537fd81\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " Apr 24 16:58:26.785009 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.785015 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-home\") pod \"86dd6637-985d-427f-a90b-51304537fd81\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " Apr 24 16:58:26.785252 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.785082 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86dd6637-985d-427f-a90b-51304537fd81-tls-certs\") pod \"86dd6637-985d-427f-a90b-51304537fd81\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " Apr 24 16:58:26.785252 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.785147 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xkh5q\" (UniqueName: \"kubernetes.io/projected/86dd6637-985d-427f-a90b-51304537fd81-kube-api-access-xkh5q\") pod \"86dd6637-985d-427f-a90b-51304537fd81\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " Apr 24 16:58:26.785252 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.785178 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-tmp-dir\") pod \"86dd6637-985d-427f-a90b-51304537fd81\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " Apr 24 16:58:26.785252 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.785203 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-kserve-provision-location\") pod \"86dd6637-985d-427f-a90b-51304537fd81\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " Apr 24 16:58:26.785252 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.785225 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-dshm\") pod \"86dd6637-985d-427f-a90b-51304537fd81\" (UID: \"86dd6637-985d-427f-a90b-51304537fd81\") " Apr 24 16:58:26.785506 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.785258 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-model-cache" (OuterVolumeSpecName: "model-cache") pod "86dd6637-985d-427f-a90b-51304537fd81" (UID: "86dd6637-985d-427f-a90b-51304537fd81"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:58:26.785506 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.785326 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-home" (OuterVolumeSpecName: "home") pod "86dd6637-985d-427f-a90b-51304537fd81" (UID: "86dd6637-985d-427f-a90b-51304537fd81"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:58:26.785506 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.785398 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "86dd6637-985d-427f-a90b-51304537fd81" (UID: "86dd6637-985d-427f-a90b-51304537fd81"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:58:26.785667 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.785574 2579 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-tmp-dir\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:58:26.785667 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.785597 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-model-cache\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:58:26.785667 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.785610 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-home\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:58:26.787473 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.787446 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/86dd6637-985d-427f-a90b-51304537fd81-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "86dd6637-985d-427f-a90b-51304537fd81" (UID: "86dd6637-985d-427f-a90b-51304537fd81"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:58:26.787644 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.787625 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86dd6637-985d-427f-a90b-51304537fd81-kube-api-access-xkh5q" (OuterVolumeSpecName: "kube-api-access-xkh5q") pod "86dd6637-985d-427f-a90b-51304537fd81" (UID: "86dd6637-985d-427f-a90b-51304537fd81"). InnerVolumeSpecName "kube-api-access-xkh5q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:58:26.787813 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.787791 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-dshm" (OuterVolumeSpecName: "dshm") pod "86dd6637-985d-427f-a90b-51304537fd81" (UID: "86dd6637-985d-427f-a90b-51304537fd81"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:58:26.841152 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.841105 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "86dd6637-985d-427f-a90b-51304537fd81" (UID: "86dd6637-985d-427f-a90b-51304537fd81"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:58:26.886643 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.886608 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86dd6637-985d-427f-a90b-51304537fd81-tls-certs\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:58:26.886643 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.886637 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xkh5q\" (UniqueName: \"kubernetes.io/projected/86dd6637-985d-427f-a90b-51304537fd81-kube-api-access-xkh5q\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:58:26.886643 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.886647 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-kserve-provision-location\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:58:26.886946 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:26.886656 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86dd6637-985d-427f-a90b-51304537fd81-dshm\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:58:27.037412 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.037326 2579 generic.go:358] "Generic (PLEG): container finished" podID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerID="aad8f69599aa3dc450331a3e5a313cb8238184b55fd87cda9e9d32a2f6af321a" exitCode=0 Apr 24 16:58:27.037573 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.037406 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" event={"ID":"56429b96-efa3-4498-81ed-0f4fc96911d0","Type":"ContainerDied","Data":"aad8f69599aa3dc450331a3e5a313cb8238184b55fd87cda9e9d32a2f6af321a"} Apr 24 16:58:27.038885 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:58:27.038863 2579 generic.go:358] "Generic (PLEG): container finished" podID="86dd6637-985d-427f-a90b-51304537fd81" containerID="3c43592543592c816b7767cf2df6e47f3f51e1840b1e970ddaae8374b1e687c5" exitCode=0 Apr 24 16:58:27.039009 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.038941 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" Apr 24 16:58:27.039009 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.038942 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" event={"ID":"86dd6637-985d-427f-a90b-51304537fd81","Type":"ContainerDied","Data":"3c43592543592c816b7767cf2df6e47f3f51e1840b1e970ddaae8374b1e687c5"} Apr 24 16:58:27.039009 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.038978 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k" event={"ID":"86dd6637-985d-427f-a90b-51304537fd81","Type":"ContainerDied","Data":"ae68caa118f18412f11649be9e538c35455b333cfe45027392a3f0ec92fa82ab"} Apr 24 16:58:27.039009 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.038993 2579 scope.go:117] "RemoveContainer" containerID="3c43592543592c816b7767cf2df6e47f3f51e1840b1e970ddaae8374b1e687c5" Apr 24 16:58:27.049261 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.049243 2579 scope.go:117] "RemoveContainer" containerID="9873648ddb1badb9f3d415552674402944375ab2ce1fac4f978a657d7f86ada5" Apr 24 16:58:27.064029 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.064000 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k"] Apr 24 16:58:27.066257 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.066230 2579 scope.go:117] "RemoveContainer" 
containerID="3c43592543592c816b7767cf2df6e47f3f51e1840b1e970ddaae8374b1e687c5" Apr 24 16:58:27.066761 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:58:27.066702 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c43592543592c816b7767cf2df6e47f3f51e1840b1e970ddaae8374b1e687c5\": container with ID starting with 3c43592543592c816b7767cf2df6e47f3f51e1840b1e970ddaae8374b1e687c5 not found: ID does not exist" containerID="3c43592543592c816b7767cf2df6e47f3f51e1840b1e970ddaae8374b1e687c5" Apr 24 16:58:27.066865 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.066767 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c43592543592c816b7767cf2df6e47f3f51e1840b1e970ddaae8374b1e687c5"} err="failed to get container status \"3c43592543592c816b7767cf2df6e47f3f51e1840b1e970ddaae8374b1e687c5\": rpc error: code = NotFound desc = could not find container \"3c43592543592c816b7767cf2df6e47f3f51e1840b1e970ddaae8374b1e687c5\": container with ID starting with 3c43592543592c816b7767cf2df6e47f3f51e1840b1e970ddaae8374b1e687c5 not found: ID does not exist" Apr 24 16:58:27.066865 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.066788 2579 scope.go:117] "RemoveContainer" containerID="9873648ddb1badb9f3d415552674402944375ab2ce1fac4f978a657d7f86ada5" Apr 24 16:58:27.067364 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:58:27.067337 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9873648ddb1badb9f3d415552674402944375ab2ce1fac4f978a657d7f86ada5\": container with ID starting with 9873648ddb1badb9f3d415552674402944375ab2ce1fac4f978a657d7f86ada5 not found: ID does not exist" containerID="9873648ddb1badb9f3d415552674402944375ab2ce1fac4f978a657d7f86ada5" Apr 24 16:58:27.067457 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.067375 2579 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"9873648ddb1badb9f3d415552674402944375ab2ce1fac4f978a657d7f86ada5"} err="failed to get container status \"9873648ddb1badb9f3d415552674402944375ab2ce1fac4f978a657d7f86ada5\": rpc error: code = NotFound desc = could not find container \"9873648ddb1badb9f3d415552674402944375ab2ce1fac4f978a657d7f86ada5\": container with ID starting with 9873648ddb1badb9f3d415552674402944375ab2ce1fac4f978a657d7f86ada5 not found: ID does not exist" Apr 24 16:58:27.069209 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.069185 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-76f55cf86d-dsq8k"] Apr 24 16:58:27.665625 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.665604 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:58:27.796624 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.796595 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56429b96-efa3-4498-81ed-0f4fc96911d0-tls-certs\") pod \"56429b96-efa3-4498-81ed-0f4fc96911d0\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " Apr 24 16:58:27.796828 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.796650 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-kserve-provision-location\") pod \"56429b96-efa3-4498-81ed-0f4fc96911d0\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " Apr 24 16:58:27.796828 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.796698 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-tmp\") pod 
\"56429b96-efa3-4498-81ed-0f4fc96911d0\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " Apr 24 16:58:27.796828 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.796722 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-uds\") pod \"56429b96-efa3-4498-81ed-0f4fc96911d0\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " Apr 24 16:58:27.797028 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.796871 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmhsg\" (UniqueName: \"kubernetes.io/projected/56429b96-efa3-4498-81ed-0f4fc96911d0-kube-api-access-gmhsg\") pod \"56429b96-efa3-4498-81ed-0f4fc96911d0\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " Apr 24 16:58:27.797028 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.796901 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-cache\") pod \"56429b96-efa3-4498-81ed-0f4fc96911d0\" (UID: \"56429b96-efa3-4498-81ed-0f4fc96911d0\") " Apr 24 16:58:27.797173 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.797036 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "56429b96-efa3-4498-81ed-0f4fc96911d0" (UID: "56429b96-efa3-4498-81ed-0f4fc96911d0"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:58:27.797173 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.797058 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "56429b96-efa3-4498-81ed-0f4fc96911d0" (UID: "56429b96-efa3-4498-81ed-0f4fc96911d0"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:58:27.797261 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.797197 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-tmp\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:58:27.797261 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.797215 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-uds\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:58:27.797261 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.797243 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "56429b96-efa3-4498-81ed-0f4fc96911d0" (UID: "56429b96-efa3-4498-81ed-0f4fc96911d0"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:58:27.797634 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.797612 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "56429b96-efa3-4498-81ed-0f4fc96911d0" (UID: "56429b96-efa3-4498-81ed-0f4fc96911d0"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:58:27.798955 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.798934 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56429b96-efa3-4498-81ed-0f4fc96911d0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "56429b96-efa3-4498-81ed-0f4fc96911d0" (UID: "56429b96-efa3-4498-81ed-0f4fc96911d0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:58:27.799251 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.799231 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56429b96-efa3-4498-81ed-0f4fc96911d0-kube-api-access-gmhsg" (OuterVolumeSpecName: "kube-api-access-gmhsg") pod "56429b96-efa3-4498-81ed-0f4fc96911d0" (UID: "56429b96-efa3-4498-81ed-0f4fc96911d0"). InnerVolumeSpecName "kube-api-access-gmhsg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:58:27.898511 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.898477 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gmhsg\" (UniqueName: \"kubernetes.io/projected/56429b96-efa3-4498-81ed-0f4fc96911d0-kube-api-access-gmhsg\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:58:27.898511 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.898507 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-tokenizer-cache\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:58:27.898511 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.898518 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56429b96-efa3-4498-81ed-0f4fc96911d0-tls-certs\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:58:27.898769 
ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:27.898528 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56429b96-efa3-4498-81ed-0f4fc96911d0-kserve-provision-location\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 16:58:28.044386 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:28.044351 2579 generic.go:358] "Generic (PLEG): container finished" podID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerID="a40a3fd115d3ccfd5b689e5335ee7e4e79eb9a2029d882943c38c3ba63256f78" exitCode=0 Apr 24 16:58:28.044577 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:28.044426 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" Apr 24 16:58:28.044577 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:28.044437 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" event={"ID":"56429b96-efa3-4498-81ed-0f4fc96911d0","Type":"ContainerDied","Data":"a40a3fd115d3ccfd5b689e5335ee7e4e79eb9a2029d882943c38c3ba63256f78"} Apr 24 16:58:28.044577 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:28.044481 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc" event={"ID":"56429b96-efa3-4498-81ed-0f4fc96911d0","Type":"ContainerDied","Data":"c2b2d8cfd38db6446a37e3d4a89e0c1a411f923a0f4524956fea9dfe1293a6e2"} Apr 24 16:58:28.044577 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:28.044502 2579 scope.go:117] "RemoveContainer" containerID="a40a3fd115d3ccfd5b689e5335ee7e4e79eb9a2029d882943c38c3ba63256f78" Apr 24 16:58:28.054206 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:28.054191 2579 scope.go:117] "RemoveContainer" containerID="aad8f69599aa3dc450331a3e5a313cb8238184b55fd87cda9e9d32a2f6af321a" Apr 24 
16:58:28.062224 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:28.062205 2579 scope.go:117] "RemoveContainer" containerID="472a6d3e46a926de701640169c33860b10ef5c25ec825d3d0240d4d58e10150d" Apr 24 16:58:28.069267 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:28.069244 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc"] Apr 24 16:58:28.071205 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:28.071185 2579 scope.go:117] "RemoveContainer" containerID="a40a3fd115d3ccfd5b689e5335ee7e4e79eb9a2029d882943c38c3ba63256f78" Apr 24 16:58:28.071519 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:58:28.071491 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a40a3fd115d3ccfd5b689e5335ee7e4e79eb9a2029d882943c38c3ba63256f78\": container with ID starting with a40a3fd115d3ccfd5b689e5335ee7e4e79eb9a2029d882943c38c3ba63256f78 not found: ID does not exist" containerID="a40a3fd115d3ccfd5b689e5335ee7e4e79eb9a2029d882943c38c3ba63256f78" Apr 24 16:58:28.071588 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:28.071530 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40a3fd115d3ccfd5b689e5335ee7e4e79eb9a2029d882943c38c3ba63256f78"} err="failed to get container status \"a40a3fd115d3ccfd5b689e5335ee7e4e79eb9a2029d882943c38c3ba63256f78\": rpc error: code = NotFound desc = could not find container \"a40a3fd115d3ccfd5b689e5335ee7e4e79eb9a2029d882943c38c3ba63256f78\": container with ID starting with a40a3fd115d3ccfd5b689e5335ee7e4e79eb9a2029d882943c38c3ba63256f78 not found: ID does not exist" Apr 24 16:58:28.071588 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:28.071556 2579 scope.go:117] "RemoveContainer" containerID="aad8f69599aa3dc450331a3e5a313cb8238184b55fd87cda9e9d32a2f6af321a" Apr 24 16:58:28.071861 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:58:28.071842 2579 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aad8f69599aa3dc450331a3e5a313cb8238184b55fd87cda9e9d32a2f6af321a\": container with ID starting with aad8f69599aa3dc450331a3e5a313cb8238184b55fd87cda9e9d32a2f6af321a not found: ID does not exist" containerID="aad8f69599aa3dc450331a3e5a313cb8238184b55fd87cda9e9d32a2f6af321a" Apr 24 16:58:28.071932 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:28.071870 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad8f69599aa3dc450331a3e5a313cb8238184b55fd87cda9e9d32a2f6af321a"} err="failed to get container status \"aad8f69599aa3dc450331a3e5a313cb8238184b55fd87cda9e9d32a2f6af321a\": rpc error: code = NotFound desc = could not find container \"aad8f69599aa3dc450331a3e5a313cb8238184b55fd87cda9e9d32a2f6af321a\": container with ID starting with aad8f69599aa3dc450331a3e5a313cb8238184b55fd87cda9e9d32a2f6af321a not found: ID does not exist" Apr 24 16:58:28.071932 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:28.071886 2579 scope.go:117] "RemoveContainer" containerID="472a6d3e46a926de701640169c33860b10ef5c25ec825d3d0240d4d58e10150d" Apr 24 16:58:28.072170 ip-10-0-129-227 kubenswrapper[2579]: E0424 16:58:28.072147 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"472a6d3e46a926de701640169c33860b10ef5c25ec825d3d0240d4d58e10150d\": container with ID starting with 472a6d3e46a926de701640169c33860b10ef5c25ec825d3d0240d4d58e10150d not found: ID does not exist" containerID="472a6d3e46a926de701640169c33860b10ef5c25ec825d3d0240d4d58e10150d" Apr 24 16:58:28.072232 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:28.072182 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"472a6d3e46a926de701640169c33860b10ef5c25ec825d3d0240d4d58e10150d"} err="failed to get container status 
\"472a6d3e46a926de701640169c33860b10ef5c25ec825d3d0240d4d58e10150d\": rpc error: code = NotFound desc = could not find container \"472a6d3e46a926de701640169c33860b10ef5c25ec825d3d0240d4d58e10150d\": container with ID starting with 472a6d3e46a926de701640169c33860b10ef5c25ec825d3d0240d4d58e10150d not found: ID does not exist" Apr 24 16:58:28.073060 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:28.073043 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c866758fpwc"] Apr 24 16:58:28.619984 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:28.619951 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" path="/var/lib/kubelet/pods/56429b96-efa3-4498-81ed-0f4fc96911d0/volumes" Apr 24 16:58:28.620414 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:28.620400 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86dd6637-985d-427f-a90b-51304537fd81" path="/var/lib/kubelet/pods/86dd6637-985d-427f-a90b-51304537fd81/volumes" Apr 24 16:58:31.834824 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.834791 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp"] Apr 24 16:58:31.835192 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.835153 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="tokenizer" Apr 24 16:58:31.835192 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.835164 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="tokenizer" Apr 24 16:58:31.835192 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.835172 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="storage-initializer" Apr 24 
16:58:31.835192 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.835178 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="storage-initializer" Apr 24 16:58:31.835192 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.835188 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86dd6637-985d-427f-a90b-51304537fd81" containerName="main" Apr 24 16:58:31.835192 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.835193 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="86dd6637-985d-427f-a90b-51304537fd81" containerName="main" Apr 24 16:58:31.835393 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.835206 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86dd6637-985d-427f-a90b-51304537fd81" containerName="storage-initializer" Apr 24 16:58:31.835393 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.835211 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="86dd6637-985d-427f-a90b-51304537fd81" containerName="storage-initializer" Apr 24 16:58:31.835393 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.835220 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="main" Apr 24 16:58:31.835393 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.835225 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="main" Apr 24 16:58:31.835393 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.835287 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="main" Apr 24 16:58:31.835393 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.835294 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="56429b96-efa3-4498-81ed-0f4fc96911d0" containerName="tokenizer" Apr 24 16:58:31.835393 ip-10-0-129-227 
kubenswrapper[2579]: I0424 16:58:31.835303 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="86dd6637-985d-427f-a90b-51304537fd81" containerName="main" Apr 24 16:58:31.840134 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.840114 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:31.843079 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.843056 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 16:58:31.843812 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.843794 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 16:58:31.843926 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.843797 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 24 16:58:31.843926 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.843895 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q89jh\"" Apr 24 16:58:31.849007 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.848983 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp"] Apr 24 16:58:31.935233 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.935165 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 
16:58:31.935233 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.935195 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwjwp\" (UniqueName: \"kubernetes.io/projected/0bd785ff-756f-4061-bd83-cfbed3d6ed71-kube-api-access-bwjwp\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:31.935233 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.935228 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-tmp-dir\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:31.935475 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.935360 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-home\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:31.935475 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.935394 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-dshm\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:31.935475 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.935423 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-model-cache\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:31.935475 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:31.935459 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0bd785ff-756f-4061-bd83-cfbed3d6ed71-tls-certs\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:32.036535 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.036493 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0bd785ff-756f-4061-bd83-cfbed3d6ed71-tls-certs\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:32.036705 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.036553 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:32.036705 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.036585 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwjwp\" (UniqueName: 
\"kubernetes.io/projected/0bd785ff-756f-4061-bd83-cfbed3d6ed71-kube-api-access-bwjwp\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:32.036705 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.036628 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-tmp-dir\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:32.036705 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.036698 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-home\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:32.037001 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.036722 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-dshm\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:32.037001 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.036776 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-model-cache\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:32.037119 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.037046 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:32.037119 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.037067 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-home\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:32.037208 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.037131 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-model-cache\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:32.037208 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.037157 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-tmp-dir\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:32.039052 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.039027 2579 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-dshm\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:32.039355 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.039336 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0bd785ff-756f-4061-bd83-cfbed3d6ed71-tls-certs\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:32.045980 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.045952 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwjwp\" (UniqueName: \"kubernetes.io/projected/0bd785ff-756f-4061-bd83-cfbed3d6ed71-kube-api-access-bwjwp\") pod \"precise-prefix-cache-test-kserve-646bd5b44b-z8kzp\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:32.152248 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.152160 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" Apr 24 16:58:32.241979 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.241950 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"] Apr 24 16:58:32.248276 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.248252 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.253952 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.250993 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-x29jm\""
Apr 24 16:58:32.257986 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.257944 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"]
Apr 24 16:58:32.290286 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.290261 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp"]
Apr 24 16:58:32.291477 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:58:32.291451 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bd785ff_756f_4061_bd83_cfbed3d6ed71.slice/crio-145b2a598373fd74692ecf598c9c48c7a898d1402173702b9755ea56d56505e8 WatchSource:0}: Error finding container 145b2a598373fd74692ecf598c9c48c7a898d1402173702b9755ea56d56505e8: Status 404 returned error can't find the container with id 145b2a598373fd74692ecf598c9c48c7a898d1402173702b9755ea56d56505e8
Apr 24 16:58:32.339160 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.339135 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6135b2a3-054c-4b08-8740-d23956b9d48e-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.339267 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.339183 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.339267 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.339211 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.339267 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.339265 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrgbc\" (UniqueName: \"kubernetes.io/projected/6135b2a3-054c-4b08-8740-d23956b9d48e-kube-api-access-rrgbc\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.339393 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.339285 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.339393 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.339338 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.440896 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.440808 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrgbc\" (UniqueName: \"kubernetes.io/projected/6135b2a3-054c-4b08-8740-d23956b9d48e-kube-api-access-rrgbc\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.440896 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.440853 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.440896 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.440890 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.441147 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.440943 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6135b2a3-054c-4b08-8740-d23956b9d48e-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.441147 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.440999 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.441259 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.441156 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.441311 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.441287 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.441364 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.441314 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.441421 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.441372 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.441458 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.441429 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.443580 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.443564 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6135b2a3-054c-4b08-8740-d23956b9d48e-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.449479 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.449454 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrgbc\" (UniqueName: \"kubernetes.io/projected/6135b2a3-054c-4b08-8740-d23956b9d48e-kube-api-access-rrgbc\") pod \"precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.562911 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.562875 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:32.704761 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:32.704721 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"]
Apr 24 16:58:32.706383 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:58:32.706356 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6135b2a3_054c_4b08_8740_d23956b9d48e.slice/crio-0b55b6cecc3aecafabceda05e263c910638d4d5ffeb53f53c59f34d15b7bac79 WatchSource:0}: Error finding container 0b55b6cecc3aecafabceda05e263c910638d4d5ffeb53f53c59f34d15b7bac79: Status 404 returned error can't find the container with id 0b55b6cecc3aecafabceda05e263c910638d4d5ffeb53f53c59f34d15b7bac79
Apr 24 16:58:33.069038 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:33.068997 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb" event={"ID":"6135b2a3-054c-4b08-8740-d23956b9d48e","Type":"ContainerStarted","Data":"ffc8addeb9d5204b49bc3b6f90fb128fc2d851124ffbff939a417db553adbb43"}
Apr 24 16:58:33.069482 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:33.069044 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb" event={"ID":"6135b2a3-054c-4b08-8740-d23956b9d48e","Type":"ContainerStarted","Data":"0b55b6cecc3aecafabceda05e263c910638d4d5ffeb53f53c59f34d15b7bac79"}
Apr 24 16:58:33.070587 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:33.070560 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" event={"ID":"0bd785ff-756f-4061-bd83-cfbed3d6ed71","Type":"ContainerStarted","Data":"dfa10174125f8b5695f2c23043e3b914766161d7163d5caec16924c2b1fd7e6a"}
Apr 24 16:58:33.070697 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:33.070593 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" event={"ID":"0bd785ff-756f-4061-bd83-cfbed3d6ed71","Type":"ContainerStarted","Data":"145b2a598373fd74692ecf598c9c48c7a898d1402173702b9755ea56d56505e8"}
Apr 24 16:58:34.076521 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:34.076476 2579 generic.go:358] "Generic (PLEG): container finished" podID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerID="ffc8addeb9d5204b49bc3b6f90fb128fc2d851124ffbff939a417db553adbb43" exitCode=0
Apr 24 16:58:34.076940 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:34.076531 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb" event={"ID":"6135b2a3-054c-4b08-8740-d23956b9d48e","Type":"ContainerDied","Data":"ffc8addeb9d5204b49bc3b6f90fb128fc2d851124ffbff939a417db553adbb43"}
Apr 24 16:58:35.083067 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:35.083028 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb" event={"ID":"6135b2a3-054c-4b08-8740-d23956b9d48e","Type":"ContainerStarted","Data":"d3a504020ab5015808ff038e4b5416b6ef162051651a38e812c2073a0a02a154"}
Apr 24 16:58:35.083067 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:35.083068 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb" event={"ID":"6135b2a3-054c-4b08-8740-d23956b9d48e","Type":"ContainerStarted","Data":"6c1eca0a01cead1d903241c8697df0eaf75a441ac73e3de639cde9be57a85ee3"}
Apr 24 16:58:35.083541 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:35.083247 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:35.105123 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:35.105073 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb" podStartSLOduration=3.105058371 podStartE2EDuration="3.105058371s" podCreationTimestamp="2026-04-24 16:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:58:35.102715351 +0000 UTC m=+1151.035784549" watchObservedRunningTime="2026-04-24 16:58:35.105058371 +0000 UTC m=+1151.038127570"
Apr 24 16:58:37.095340 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:37.095307 2579 generic.go:358] "Generic (PLEG): container finished" podID="0bd785ff-756f-4061-bd83-cfbed3d6ed71" containerID="dfa10174125f8b5695f2c23043e3b914766161d7163d5caec16924c2b1fd7e6a" exitCode=0
Apr 24 16:58:37.095763 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:37.095382 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" event={"ID":"0bd785ff-756f-4061-bd83-cfbed3d6ed71","Type":"ContainerDied","Data":"dfa10174125f8b5695f2c23043e3b914766161d7163d5caec16924c2b1fd7e6a"}
Apr 24 16:58:38.102025 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:38.101988 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" event={"ID":"0bd785ff-756f-4061-bd83-cfbed3d6ed71","Type":"ContainerStarted","Data":"22fc592d6993e2caa43ed751be13d554bd568e1b084d55605c377b44cd3f30ba"}
Apr 24 16:58:38.140718 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:38.140669 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" podStartSLOduration=7.140656126 podStartE2EDuration="7.140656126s" podCreationTimestamp="2026-04-24 16:58:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:58:38.138774383 +0000 UTC m=+1154.071843583" watchObservedRunningTime="2026-04-24 16:58:38.140656126 +0000 UTC m=+1154.073725324"
Apr 24 16:58:42.153090 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:42.153051 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp"
Apr 24 16:58:42.153585 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:42.153133 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp"
Apr 24 16:58:42.165912 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:42.165882 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp"
Apr 24 16:58:42.563551 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:42.563503 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:42.563551 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:42.563540 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:42.565015 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:58:42.564993 2579 logging.go:55] [core] [Channel #56 SubChannel #57]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.52:9003", ServerName: "10.134.0.52:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.52:9003: connect: connection refused"
Apr 24 16:58:42.566347 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:42.566319 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:43.127013 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:43.126986 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:43.137348 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:43.137320 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp"
Apr 24 16:58:43.564111 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:43.564062 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb" podUID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.52:9003\" within 1s: context deadline exceeded"
Apr 24 16:58:45.134764 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:45.134713 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb_6135b2a3-054c-4b08-8740-d23956b9d48e/main/0.log"
Apr 24 16:58:45.135166 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:45.135065 2579 generic.go:358] "Generic (PLEG): container finished" podID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerID="6c1eca0a01cead1d903241c8697df0eaf75a441ac73e3de639cde9be57a85ee3" exitCode=1
Apr 24 16:58:45.135166 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:45.135133 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb" event={"ID":"6135b2a3-054c-4b08-8740-d23956b9d48e","Type":"ContainerDied","Data":"6c1eca0a01cead1d903241c8697df0eaf75a441ac73e3de639cde9be57a85ee3"}
Apr 24 16:58:45.135700 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:45.135684 2579 scope.go:117] "RemoveContainer" containerID="6c1eca0a01cead1d903241c8697df0eaf75a441ac73e3de639cde9be57a85ee3"
Apr 24 16:58:46.144276 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:46.144251 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb_6135b2a3-054c-4b08-8740-d23956b9d48e/main/0.log"
Apr 24 16:58:46.144692 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:46.144616 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb" event={"ID":"6135b2a3-054c-4b08-8740-d23956b9d48e","Type":"ContainerStarted","Data":"a35a02f222d3815b9ca0b9075b79fd8c77b2944be1eb067920cd6965f51547ee"}
Apr 24 16:58:46.144937 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:46.144918 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:58:52.564395 ip-10-0-129-227 kubenswrapper[2579]: W0424 16:58:52.564364 2579 logging.go:55] [core] [Channel #58 SubChannel #59]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.52:9003", ServerName: "10.134.0.52:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.52:9003: connect: connection refused"
Apr 24 16:58:53.564466 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:58:53.564417 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb" podUID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.52:9003\" within 1s: context deadline exceeded"
Apr 24 16:59:17.151358 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:59:17.151329 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
Apr 24 16:59:24.617268 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:59:24.617231 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb_6135b2a3-054c-4b08-8740-d23956b9d48e/main/0.log"
Apr 24 16:59:24.620760 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:59:24.620719 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb_6135b2a3-054c-4b08-8740-d23956b9d48e/main/0.log"
Apr 24 16:59:24.636660 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:59:24.636636 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log"
Apr 24 16:59:24.639504 ip-10-0-129-227 kubenswrapper[2579]: I0424 16:59:24.639479 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log"
Apr 24 17:04:24.654823 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:04:24.654795 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb_6135b2a3-054c-4b08-8740-d23956b9d48e/main/0.log"
Apr 24 17:04:24.660466 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:04:24.660443 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb_6135b2a3-054c-4b08-8740-d23956b9d48e/main/0.log"
Apr 24 17:04:24.672210 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:04:24.672187 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log"
Apr 24 17:04:24.676703 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:04:24.676673 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log"
Apr 24 17:09:24.687220 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:09:24.687190 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb_6135b2a3-054c-4b08-8740-d23956b9d48e/main/0.log"
Apr 24 17:09:24.695610 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:09:24.695587 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb_6135b2a3-054c-4b08-8740-d23956b9d48e/main/0.log"
Apr 24 17:09:24.709180 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:09:24.709153 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log"
Apr 24 17:09:24.717505 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:09:24.717486 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log"
Apr 24 17:13:07.931691 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:07.931655 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp"]
Apr 24 17:13:07.932303 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:07.931966 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" podUID="0bd785ff-756f-4061-bd83-cfbed3d6ed71" containerName="main" containerID="cri-o://22fc592d6993e2caa43ed751be13d554bd568e1b084d55605c377b44cd3f30ba" gracePeriod=30
Apr 24 17:13:07.940169 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:07.940141 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"]
Apr 24 17:13:08.185444 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.185385 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp"
Apr 24 17:13:08.271668 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.271636 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-tmp-dir\") pod \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") "
Apr 24 17:13:08.271875 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.271678 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-home\") pod \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") "
Apr 24 17:13:08.271875 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.271705 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwjwp\" (UniqueName: \"kubernetes.io/projected/0bd785ff-756f-4061-bd83-cfbed3d6ed71-kube-api-access-bwjwp\") pod \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") "
Apr 24 17:13:08.271875 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.271768 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0bd785ff-756f-4061-bd83-cfbed3d6ed71-tls-certs\") pod \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") "
Apr 24 17:13:08.271875 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.271804 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-kserve-provision-location\") pod \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") "
Apr 24 17:13:08.271875 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.271842 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-dshm\") pod \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") "
Apr 24 17:13:08.272153 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.271955 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-home" (OuterVolumeSpecName: "home") pod "0bd785ff-756f-4061-bd83-cfbed3d6ed71" (UID: "0bd785ff-756f-4061-bd83-cfbed3d6ed71"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 17:13:08.272153 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.271965 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "0bd785ff-756f-4061-bd83-cfbed3d6ed71" (UID: "0bd785ff-756f-4061-bd83-cfbed3d6ed71"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 17:13:08.272153 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.272149 2579 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-tmp-dir\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\""
Apr 24 17:13:08.272299 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.272166 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-home\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\""
Apr 24 17:13:08.274171 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.274145 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-dshm" (OuterVolumeSpecName: "dshm") pod "0bd785ff-756f-4061-bd83-cfbed3d6ed71" (UID: "0bd785ff-756f-4061-bd83-cfbed3d6ed71"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 17:13:08.274290 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.274185 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bd785ff-756f-4061-bd83-cfbed3d6ed71-kube-api-access-bwjwp" (OuterVolumeSpecName: "kube-api-access-bwjwp") pod "0bd785ff-756f-4061-bd83-cfbed3d6ed71" (UID: "0bd785ff-756f-4061-bd83-cfbed3d6ed71"). InnerVolumeSpecName "kube-api-access-bwjwp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 17:13:08.274290 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.274193 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bd785ff-756f-4061-bd83-cfbed3d6ed71-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0bd785ff-756f-4061-bd83-cfbed3d6ed71" (UID: "0bd785ff-756f-4061-bd83-cfbed3d6ed71"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 17:13:08.327992 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.327943 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0bd785ff-756f-4061-bd83-cfbed3d6ed71" (UID: "0bd785ff-756f-4061-bd83-cfbed3d6ed71"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 17:13:08.372463 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.372421 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-model-cache\") pod \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\" (UID: \"0bd785ff-756f-4061-bd83-cfbed3d6ed71\") "
Apr 24 17:13:08.372684 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.372664 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-kserve-provision-location\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\""
Apr 24 17:13:08.372684 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.372672 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-model-cache" (OuterVolumeSpecName: "model-cache") pod "0bd785ff-756f-4061-bd83-cfbed3d6ed71" (UID: "0bd785ff-756f-4061-bd83-cfbed3d6ed71"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 17:13:08.372833 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.372690 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-dshm\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\""
Apr 24 17:13:08.372833 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.372706 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bwjwp\" (UniqueName: \"kubernetes.io/projected/0bd785ff-756f-4061-bd83-cfbed3d6ed71-kube-api-access-bwjwp\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\""
Apr 24 17:13:08.372833 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.372721 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0bd785ff-756f-4061-bd83-cfbed3d6ed71-tls-certs\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\""
Apr 24 17:13:08.473183 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.473100 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0bd785ff-756f-4061-bd83-cfbed3d6ed71-model-cache\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\""
Apr 24 17:13:08.594156 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.594122 2579 generic.go:358] "Generic (PLEG): container finished" podID="0bd785ff-756f-4061-bd83-cfbed3d6ed71" containerID="22fc592d6993e2caa43ed751be13d554bd568e1b084d55605c377b44cd3f30ba" exitCode=0
Apr 24 17:13:08.594347 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.594245 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp"
Apr 24 17:13:08.594347 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.594272 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" event={"ID":"0bd785ff-756f-4061-bd83-cfbed3d6ed71","Type":"ContainerDied","Data":"22fc592d6993e2caa43ed751be13d554bd568e1b084d55605c377b44cd3f30ba"}
Apr 24 17:13:08.594347 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.594320 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp" event={"ID":"0bd785ff-756f-4061-bd83-cfbed3d6ed71","Type":"ContainerDied","Data":"145b2a598373fd74692ecf598c9c48c7a898d1402173702b9755ea56d56505e8"}
Apr 24 17:13:08.594347 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.594339 2579 scope.go:117] "RemoveContainer" containerID="22fc592d6993e2caa43ed751be13d554bd568e1b084d55605c377b44cd3f30ba"
Apr 24 17:13:08.594710 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.594651 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb" podUID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerName="tokenizer" containerID="cri-o://d3a504020ab5015808ff038e4b5416b6ef162051651a38e812c2073a0a02a154" gracePeriod=30
Apr 24 17:13:08.594872 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.594685 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb" podUID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerName="main" containerID="cri-o://a35a02f222d3815b9ca0b9075b79fd8c77b2944be1eb067920cd6965f51547ee" gracePeriod=30
Apr 24 17:13:08.610368 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.610343 2579 scope.go:117] "RemoveContainer" containerID="dfa10174125f8b5695f2c23043e3b914766161d7163d5caec16924c2b1fd7e6a"
Apr 24 17:13:08.622769 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.622725 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp"]
Apr 24 17:13:08.627001 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.626980 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bd5b44b-z8kzp"]
Apr 24 17:13:08.760932 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.760910 2579 scope.go:117] "RemoveContainer" containerID="22fc592d6993e2caa43ed751be13d554bd568e1b084d55605c377b44cd3f30ba"
Apr 24 17:13:08.761284 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:13:08.761253 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22fc592d6993e2caa43ed751be13d554bd568e1b084d55605c377b44cd3f30ba\": container with ID starting with 22fc592d6993e2caa43ed751be13d554bd568e1b084d55605c377b44cd3f30ba not found: ID does not exist" containerID="22fc592d6993e2caa43ed751be13d554bd568e1b084d55605c377b44cd3f30ba"
Apr 24 17:13:08.761456 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.761287 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22fc592d6993e2caa43ed751be13d554bd568e1b084d55605c377b44cd3f30ba"} err="failed to get container status \"22fc592d6993e2caa43ed751be13d554bd568e1b084d55605c377b44cd3f30ba\": rpc error: code = NotFound desc = could not find container \"22fc592d6993e2caa43ed751be13d554bd568e1b084d55605c377b44cd3f30ba\": container with ID starting with 22fc592d6993e2caa43ed751be13d554bd568e1b084d55605c377b44cd3f30ba not found: ID does not exist"
Apr 24 17:13:08.761456 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.761312 2579 scope.go:117] "RemoveContainer" containerID="dfa10174125f8b5695f2c23043e3b914766161d7163d5caec16924c2b1fd7e6a"
Apr 24 17:13:08.761621 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:13:08.761599 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfa10174125f8b5695f2c23043e3b914766161d7163d5caec16924c2b1fd7e6a\": container with ID starting with dfa10174125f8b5695f2c23043e3b914766161d7163d5caec16924c2b1fd7e6a not found: ID does not exist" containerID="dfa10174125f8b5695f2c23043e3b914766161d7163d5caec16924c2b1fd7e6a"
Apr 24 17:13:08.761667 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:08.761628 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfa10174125f8b5695f2c23043e3b914766161d7163d5caec16924c2b1fd7e6a"} err="failed to get container status \"dfa10174125f8b5695f2c23043e3b914766161d7163d5caec16924c2b1fd7e6a\": rpc error: code = NotFound desc = could not find container \"dfa10174125f8b5695f2c23043e3b914766161d7163d5caec16924c2b1fd7e6a\": container with ID starting with dfa10174125f8b5695f2c23043e3b914766161d7163d5caec16924c2b1fd7e6a not found: ID does not exist"
Apr 24 17:13:09.600655 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:09.600629 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb_6135b2a3-054c-4b08-8740-d23956b9d48e/main/0.log"
Apr 24 17:13:09.601102 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:09.601004 2579 generic.go:358] "Generic (PLEG): container finished" podID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerID="a35a02f222d3815b9ca0b9075b79fd8c77b2944be1eb067920cd6965f51547ee" exitCode=0
Apr 24 17:13:09.601102 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:09.601068 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"
event={"ID":"6135b2a3-054c-4b08-8740-d23956b9d48e","Type":"ContainerDied","Data":"a35a02f222d3815b9ca0b9075b79fd8c77b2944be1eb067920cd6965f51547ee"} Apr 24 17:13:09.601224 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:09.601115 2579 scope.go:117] "RemoveContainer" containerID="6c1eca0a01cead1d903241c8697df0eaf75a441ac73e3de639cde9be57a85ee3" Apr 24 17:13:10.046923 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.046903 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb" Apr 24 17:13:10.085579 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.085552 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-uds\") pod \"6135b2a3-054c-4b08-8740-d23956b9d48e\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " Apr 24 17:13:10.085724 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.085626 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-kserve-provision-location\") pod \"6135b2a3-054c-4b08-8740-d23956b9d48e\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " Apr 24 17:13:10.085724 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.085670 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-cache\") pod \"6135b2a3-054c-4b08-8740-d23956b9d48e\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " Apr 24 17:13:10.085724 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.085715 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrgbc\" (UniqueName: 
\"kubernetes.io/projected/6135b2a3-054c-4b08-8740-d23956b9d48e-kube-api-access-rrgbc\") pod \"6135b2a3-054c-4b08-8740-d23956b9d48e\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " Apr 24 17:13:10.085880 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.085762 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-tmp\") pod \"6135b2a3-054c-4b08-8740-d23956b9d48e\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " Apr 24 17:13:10.085880 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.085801 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6135b2a3-054c-4b08-8740-d23956b9d48e-tls-certs\") pod \"6135b2a3-054c-4b08-8740-d23956b9d48e\" (UID: \"6135b2a3-054c-4b08-8740-d23956b9d48e\") " Apr 24 17:13:10.085880 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.085813 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "6135b2a3-054c-4b08-8740-d23956b9d48e" (UID: "6135b2a3-054c-4b08-8740-d23956b9d48e"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:13:10.086078 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.086045 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "6135b2a3-054c-4b08-8740-d23956b9d48e" (UID: "6135b2a3-054c-4b08-8740-d23956b9d48e"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:13:10.086078 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.086051 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-uds\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:13:10.086225 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.086102 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "6135b2a3-054c-4b08-8740-d23956b9d48e" (UID: "6135b2a3-054c-4b08-8740-d23956b9d48e"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:13:10.086549 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.086471 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6135b2a3-054c-4b08-8740-d23956b9d48e" (UID: "6135b2a3-054c-4b08-8740-d23956b9d48e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:13:10.088033 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.088007 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6135b2a3-054c-4b08-8740-d23956b9d48e-kube-api-access-rrgbc" (OuterVolumeSpecName: "kube-api-access-rrgbc") pod "6135b2a3-054c-4b08-8740-d23956b9d48e" (UID: "6135b2a3-054c-4b08-8740-d23956b9d48e"). InnerVolumeSpecName "kube-api-access-rrgbc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:13:10.088202 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.088186 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6135b2a3-054c-4b08-8740-d23956b9d48e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6135b2a3-054c-4b08-8740-d23956b9d48e" (UID: "6135b2a3-054c-4b08-8740-d23956b9d48e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:13:10.186470 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.186396 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rrgbc\" (UniqueName: \"kubernetes.io/projected/6135b2a3-054c-4b08-8740-d23956b9d48e-kube-api-access-rrgbc\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:13:10.186470 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.186423 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-tmp\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:13:10.186470 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.186434 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6135b2a3-054c-4b08-8740-d23956b9d48e-tls-certs\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:13:10.186470 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.186442 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-kserve-provision-location\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:13:10.186470 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.186451 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/6135b2a3-054c-4b08-8740-d23956b9d48e-tokenizer-cache\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:13:10.607594 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.607559 2579 generic.go:358] "Generic (PLEG): container finished" podID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerID="d3a504020ab5015808ff038e4b5416b6ef162051651a38e812c2073a0a02a154" exitCode=0 Apr 24 17:13:10.608056 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.607637 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb" Apr 24 17:13:10.608056 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.607642 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb" event={"ID":"6135b2a3-054c-4b08-8740-d23956b9d48e","Type":"ContainerDied","Data":"d3a504020ab5015808ff038e4b5416b6ef162051651a38e812c2073a0a02a154"} Apr 24 17:13:10.608056 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.607684 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb" event={"ID":"6135b2a3-054c-4b08-8740-d23956b9d48e","Type":"ContainerDied","Data":"0b55b6cecc3aecafabceda05e263c910638d4d5ffeb53f53c59f34d15b7bac79"} Apr 24 17:13:10.608056 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.607701 2579 scope.go:117] "RemoveContainer" containerID="a35a02f222d3815b9ca0b9075b79fd8c77b2944be1eb067920cd6965f51547ee" Apr 24 17:13:10.617543 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.617529 2579 scope.go:117] "RemoveContainer" containerID="d3a504020ab5015808ff038e4b5416b6ef162051651a38e812c2073a0a02a154" Apr 24 17:13:10.620669 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.620648 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0bd785ff-756f-4061-bd83-cfbed3d6ed71" path="/var/lib/kubelet/pods/0bd785ff-756f-4061-bd83-cfbed3d6ed71/volumes" Apr 24 17:13:10.626607 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.626591 2579 scope.go:117] "RemoveContainer" containerID="ffc8addeb9d5204b49bc3b6f90fb128fc2d851124ffbff939a417db553adbb43" Apr 24 17:13:10.631792 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.631769 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"] Apr 24 17:13:10.635245 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.635208 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-66dc89cdnmrwb"] Apr 24 17:13:10.635606 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.635594 2579 scope.go:117] "RemoveContainer" containerID="a35a02f222d3815b9ca0b9075b79fd8c77b2944be1eb067920cd6965f51547ee" Apr 24 17:13:10.635989 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:13:10.635962 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a35a02f222d3815b9ca0b9075b79fd8c77b2944be1eb067920cd6965f51547ee\": container with ID starting with a35a02f222d3815b9ca0b9075b79fd8c77b2944be1eb067920cd6965f51547ee not found: ID does not exist" containerID="a35a02f222d3815b9ca0b9075b79fd8c77b2944be1eb067920cd6965f51547ee" Apr 24 17:13:10.636044 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.635996 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a35a02f222d3815b9ca0b9075b79fd8c77b2944be1eb067920cd6965f51547ee"} err="failed to get container status \"a35a02f222d3815b9ca0b9075b79fd8c77b2944be1eb067920cd6965f51547ee\": rpc error: code = NotFound desc = could not find container \"a35a02f222d3815b9ca0b9075b79fd8c77b2944be1eb067920cd6965f51547ee\": container with ID starting with 
a35a02f222d3815b9ca0b9075b79fd8c77b2944be1eb067920cd6965f51547ee not found: ID does not exist" Apr 24 17:13:10.636044 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.636013 2579 scope.go:117] "RemoveContainer" containerID="d3a504020ab5015808ff038e4b5416b6ef162051651a38e812c2073a0a02a154" Apr 24 17:13:10.636261 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:13:10.636243 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3a504020ab5015808ff038e4b5416b6ef162051651a38e812c2073a0a02a154\": container with ID starting with d3a504020ab5015808ff038e4b5416b6ef162051651a38e812c2073a0a02a154 not found: ID does not exist" containerID="d3a504020ab5015808ff038e4b5416b6ef162051651a38e812c2073a0a02a154" Apr 24 17:13:10.636304 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.636269 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3a504020ab5015808ff038e4b5416b6ef162051651a38e812c2073a0a02a154"} err="failed to get container status \"d3a504020ab5015808ff038e4b5416b6ef162051651a38e812c2073a0a02a154\": rpc error: code = NotFound desc = could not find container \"d3a504020ab5015808ff038e4b5416b6ef162051651a38e812c2073a0a02a154\": container with ID starting with d3a504020ab5015808ff038e4b5416b6ef162051651a38e812c2073a0a02a154 not found: ID does not exist" Apr 24 17:13:10.636304 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.636284 2579 scope.go:117] "RemoveContainer" containerID="ffc8addeb9d5204b49bc3b6f90fb128fc2d851124ffbff939a417db553adbb43" Apr 24 17:13:10.636504 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:13:10.636487 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc8addeb9d5204b49bc3b6f90fb128fc2d851124ffbff939a417db553adbb43\": container with ID starting with ffc8addeb9d5204b49bc3b6f90fb128fc2d851124ffbff939a417db553adbb43 not found: ID does not exist" 
containerID="ffc8addeb9d5204b49bc3b6f90fb128fc2d851124ffbff939a417db553adbb43" Apr 24 17:13:10.636548 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:10.636507 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc8addeb9d5204b49bc3b6f90fb128fc2d851124ffbff939a417db553adbb43"} err="failed to get container status \"ffc8addeb9d5204b49bc3b6f90fb128fc2d851124ffbff939a417db553adbb43\": rpc error: code = NotFound desc = could not find container \"ffc8addeb9d5204b49bc3b6f90fb128fc2d851124ffbff939a417db553adbb43\": container with ID starting with ffc8addeb9d5204b49bc3b6f90fb128fc2d851124ffbff939a417db553adbb43 not found: ID does not exist" Apr 24 17:13:12.620382 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:13:12.620346 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6135b2a3-054c-4b08-8740-d23956b9d48e" path="/var/lib/kubelet/pods/6135b2a3-054c-4b08-8740-d23956b9d48e/volumes" Apr 24 17:14:24.741579 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:24.741553 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log" Apr 24 17:14:24.750863 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:24.750843 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log" Apr 24 17:14:31.376495 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.376454 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q"] Apr 24 17:14:31.377071 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.377044 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerName="main" Apr 24 17:14:31.377071 ip-10-0-129-227 
kubenswrapper[2579]: I0424 17:14:31.377070 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerName="main" Apr 24 17:14:31.377265 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.377083 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerName="tokenizer" Apr 24 17:14:31.377265 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.377092 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerName="tokenizer" Apr 24 17:14:31.377265 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.377104 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bd785ff-756f-4061-bd83-cfbed3d6ed71" containerName="main" Apr 24 17:14:31.377265 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.377113 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bd785ff-756f-4061-bd83-cfbed3d6ed71" containerName="main" Apr 24 17:14:31.377265 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.377132 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerName="storage-initializer" Apr 24 17:14:31.377265 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.377140 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerName="storage-initializer" Apr 24 17:14:31.377265 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.377148 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerName="main" Apr 24 17:14:31.377265 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.377156 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerName="main" Apr 24 17:14:31.377265 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.377166 2579 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bd785ff-756f-4061-bd83-cfbed3d6ed71" containerName="storage-initializer" Apr 24 17:14:31.377265 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.377173 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bd785ff-756f-4061-bd83-cfbed3d6ed71" containerName="storage-initializer" Apr 24 17:14:31.377853 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.377284 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerName="tokenizer" Apr 24 17:14:31.377853 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.377298 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0bd785ff-756f-4061-bd83-cfbed3d6ed71" containerName="main" Apr 24 17:14:31.377853 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.377309 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerName="main" Apr 24 17:14:31.377853 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.377317 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6135b2a3-054c-4b08-8740-d23956b9d48e" containerName="main" Apr 24 17:14:31.381110 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.381091 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.384468 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.384439 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 17:14:31.384598 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.384468 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 17:14:31.384598 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.384450 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q89jh\"" Apr 24 17:14:31.384598 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.384469 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 24 17:14:31.385679 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.385660 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-dlmrw\"" Apr 24 17:14:31.392293 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.392272 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q"] Apr 24 17:14:31.405585 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.405563 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 17:14:31.409163 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.409143 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.411303 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.411279 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-97qfp\"" Apr 24 17:14:31.419291 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.419272 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 17:14:31.483094 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.483065 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.483287 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.483101 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.483287 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.483120 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.483287 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.483179 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jcr7\" (UniqueName: \"kubernetes.io/projected/e572e759-a847-4305-ada9-88093fdfc983-kube-api-access-2jcr7\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.483287 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.483247 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.483287 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.483273 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.483535 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.483313 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.483535 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.483387 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfgn8\" (UniqueName: \"kubernetes.io/projected/887a61f4-2698-47eb-bbe4-0e0c5e992a37-kube-api-access-hfgn8\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.483535 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.483425 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.483535 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.483445 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.483535 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.483498 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: 
\"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.483535 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.483518 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.483535 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.483536 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.584602 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.584544 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.584602 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.584605 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: 
\"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.584916 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.584638 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jcr7\" (UniqueName: \"kubernetes.io/projected/e572e759-a847-4305-ada9-88093fdfc983-kube-api-access-2jcr7\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.584916 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.584689 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.584916 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.584712 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.585080 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.584906 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.585080 
ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.584988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfgn8\" (UniqueName: \"kubernetes.io/projected/887a61f4-2698-47eb-bbe4-0e0c5e992a37-kube-api-access-hfgn8\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.585080 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.585025 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.585080 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.585026 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.585242 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.585093 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.585242 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.585128 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.585334 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.585240 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.585334 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.585251 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.585334 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.585309 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.585493 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.585368 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.585493 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.585401 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.585493 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.585435 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.585493 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.585486 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.585696 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.585618 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-uds\") 
pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.585696 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.585674 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.585831 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.585787 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.587232 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.587211 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.587835 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.587817 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: 
\"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.587962 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.587943 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.593022 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.592992 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfgn8\" (UniqueName: \"kubernetes.io/projected/887a61f4-2698-47eb-bbe4-0e0c5e992a37-kube-api-access-hfgn8\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.593799 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.593779 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jcr7\" (UniqueName: \"kubernetes.io/projected/e572e759-a847-4305-ada9-88093fdfc983-kube-api-access-2jcr7\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.692088 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.691986 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:31.721809 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.721772 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:14:31.850332 ip-10-0-129-227 kubenswrapper[2579]: W0424 17:14:31.850295 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod887a61f4_2698_47eb_bbe4_0e0c5e992a37.slice/crio-6744b39e96585c92fce1e7c7e85c866c9ab65bce073aaffbaa4d7d5d83212dac WatchSource:0}: Error finding container 6744b39e96585c92fce1e7c7e85c866c9ab65bce073aaffbaa4d7d5d83212dac: Status 404 returned error can't find the container with id 6744b39e96585c92fce1e7c7e85c866c9ab65bce073aaffbaa4d7d5d83212dac Apr 24 17:14:31.850585 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.850562 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q"] Apr 24 17:14:31.852332 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.852316 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 17:14:31.869317 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.869293 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 17:14:31.870306 ip-10-0-129-227 kubenswrapper[2579]: W0424 17:14:31.870283 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode572e759_a847_4305_ada9_88093fdfc983.slice/crio-c971ef5a8a5ad31c92dfbebb03375731b48602dbd63f2eeee1a01264e0c68ecc WatchSource:0}: Error finding container c971ef5a8a5ad31c92dfbebb03375731b48602dbd63f2eeee1a01264e0c68ecc: Status 404 returned error can't find the container with id c971ef5a8a5ad31c92dfbebb03375731b48602dbd63f2eeee1a01264e0c68ecc Apr 24 17:14:31.921013 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.920977 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" event={"ID":"887a61f4-2698-47eb-bbe4-0e0c5e992a37","Type":"ContainerStarted","Data":"6744b39e96585c92fce1e7c7e85c866c9ab65bce073aaffbaa4d7d5d83212dac"} Apr 24 17:14:31.922158 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:31.922122 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"e572e759-a847-4305-ada9-88093fdfc983","Type":"ContainerStarted","Data":"c971ef5a8a5ad31c92dfbebb03375731b48602dbd63f2eeee1a01264e0c68ecc"} Apr 24 17:14:32.928180 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:32.928104 2579 generic.go:358] "Generic (PLEG): container finished" podID="887a61f4-2698-47eb-bbe4-0e0c5e992a37" containerID="28cd01ede7c30b05c5e15068271e1564569be0315f4c9abeead94aa98681b234" exitCode=0 Apr 24 17:14:32.928628 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:32.928209 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" event={"ID":"887a61f4-2698-47eb-bbe4-0e0c5e992a37","Type":"ContainerDied","Data":"28cd01ede7c30b05c5e15068271e1564569be0315f4c9abeead94aa98681b234"} Apr 24 17:14:32.930056 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:32.930020 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"e572e759-a847-4305-ada9-88093fdfc983","Type":"ContainerStarted","Data":"827c71718cb317b89b33bbc0e63e974afc4ef4df99581d49b4950799e2d57f40"} Apr 24 17:14:33.935784 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:33.935710 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" 
event={"ID":"887a61f4-2698-47eb-bbe4-0e0c5e992a37","Type":"ContainerStarted","Data":"52d45f3ade1d49323992a4e0d383696f39dd18d591fcc2b39e8d7c723597b6e4"} Apr 24 17:14:33.936176 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:33.935792 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" event={"ID":"887a61f4-2698-47eb-bbe4-0e0c5e992a37","Type":"ContainerStarted","Data":"28c23beed7b7b02787a7d00f8c2435d748f62cfd5ed7e925b7001f9b725aef52"} Apr 24 17:14:33.936176 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:33.935837 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:33.960617 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:33.960570 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" podStartSLOduration=2.960554815 podStartE2EDuration="2.960554815s" podCreationTimestamp="2026-04-24 17:14:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:14:33.95859522 +0000 UTC m=+2109.891664447" watchObservedRunningTime="2026-04-24 17:14:33.960554815 +0000 UTC m=+2109.893624013" Apr 24 17:14:41.692149 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:41.692109 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:41.692688 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:41.692280 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:41.695237 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:41.695212 2579 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:14:41.968971 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:14:41.968889 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:15:03.977647 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:15:03.977572 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:15:32.164661 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:15:32.164625 2579 generic.go:358] "Generic (PLEG): container finished" podID="e572e759-a847-4305-ada9-88093fdfc983" containerID="827c71718cb317b89b33bbc0e63e974afc4ef4df99581d49b4950799e2d57f40" exitCode=0 Apr 24 17:15:32.164661 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:15:32.164658 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"e572e759-a847-4305-ada9-88093fdfc983","Type":"ContainerDied","Data":"827c71718cb317b89b33bbc0e63e974afc4ef4df99581d49b4950799e2d57f40"} Apr 24 17:16:19.272223 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:19.272186 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q"] Apr 24 17:16:19.272663 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:19.272511 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" podUID="887a61f4-2698-47eb-bbe4-0e0c5e992a37" containerName="main" containerID="cri-o://28c23beed7b7b02787a7d00f8c2435d748f62cfd5ed7e925b7001f9b725aef52" gracePeriod=30 Apr 24 17:16:19.272663 ip-10-0-129-227 kubenswrapper[2579]: I0424 
17:16:19.272555 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" podUID="887a61f4-2698-47eb-bbe4-0e0c5e992a37" containerName="tokenizer" containerID="cri-o://52d45f3ade1d49323992a4e0d383696f39dd18d591fcc2b39e8d7c723597b6e4" gracePeriod=30 Apr 24 17:16:19.376577 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:19.376471 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"e572e759-a847-4305-ada9-88093fdfc983","Type":"ContainerStarted","Data":"7a561e5b5daea667a72e7ac1a58a2d87b53ad97f995eb8ce26e9cd7bdd082a68"} Apr 24 17:16:19.376805 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:19.376780 2579 kubelet_pods.go:1019] "Unable to retrieve pull secret, the image pull may not succeed." pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" secret="" err="secret \"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-97qfp\" not found" Apr 24 17:16:19.398143 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:19.398080 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=61.486858684 podStartE2EDuration="1m48.398061226s" podCreationTimestamp="2026-04-24 17:14:31 +0000 UTC" firstStartedPulling="2026-04-24 17:15:32.165943376 +0000 UTC m=+2168.099012554" lastFinishedPulling="2026-04-24 17:16:19.077145906 +0000 UTC m=+2215.010215096" observedRunningTime="2026-04-24 17:16:19.3955922 +0000 UTC m=+2215.328661398" watchObservedRunningTime="2026-04-24 17:16:19.398061226 +0000 UTC m=+2215.331130425" Apr 24 17:16:19.475788 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:16:19.475717 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret 
"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 17:16:19.475996 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:16:19.475864 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs podName:e572e759-a847-4305-ada9-88093fdfc983 nodeName:}" failed. No retries permitted until 2026-04-24 17:16:19.975838699 +0000 UTC m=+2215.908907900 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "e572e759-a847-4305-ada9-88093fdfc983") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 17:16:19.980311 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:16:19.980273 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 17:16:19.980539 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:16:19.980362 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs podName:e572e759-a847-4305-ada9-88093fdfc983 nodeName:}" failed. No retries permitted until 2026-04-24 17:16:20.98034186 +0000 UTC m=+2216.913411049 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "e572e759-a847-4305-ada9-88093fdfc983") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 17:16:20.382791 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.382754 2579 generic.go:358] "Generic (PLEG): container finished" podID="887a61f4-2698-47eb-bbe4-0e0c5e992a37" containerID="28c23beed7b7b02787a7d00f8c2435d748f62cfd5ed7e925b7001f9b725aef52" exitCode=0 Apr 24 17:16:20.383270 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.382850 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" event={"ID":"887a61f4-2698-47eb-bbe4-0e0c5e992a37","Type":"ContainerDied","Data":"28c23beed7b7b02787a7d00f8c2435d748f62cfd5ed7e925b7001f9b725aef52"} Apr 24 17:16:20.383469 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.383452 2579 kubelet_pods.go:1019] "Unable to retrieve pull secret, the image pull may not succeed." pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" secret="" err="secret \"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-97qfp\" not found" Apr 24 17:16:20.612778 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.612708 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 17:16:20.637218 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.637140 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:16:20.687035 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.687002 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-cache\") pod \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " Apr 24 17:16:20.687220 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.687084 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfgn8\" (UniqueName: \"kubernetes.io/projected/887a61f4-2698-47eb-bbe4-0e0c5e992a37-kube-api-access-hfgn8\") pod \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " Apr 24 17:16:20.687220 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.687135 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tls-certs\") pod \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " Apr 24 17:16:20.687220 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.687191 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-uds\") pod \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " Apr 24 17:16:20.687371 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.687241 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-kserve-provision-location\") pod \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") 
" Apr 24 17:16:20.687371 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.687268 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-tmp\") pod \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\" (UID: \"887a61f4-2698-47eb-bbe4-0e0c5e992a37\") " Apr 24 17:16:20.687371 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.687325 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "887a61f4-2698-47eb-bbe4-0e0c5e992a37" (UID: "887a61f4-2698-47eb-bbe4-0e0c5e992a37"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:16:20.687564 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.687538 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "887a61f4-2698-47eb-bbe4-0e0c5e992a37" (UID: "887a61f4-2698-47eb-bbe4-0e0c5e992a37"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:16:20.687669 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.687573 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-cache\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:16:20.687763 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.687715 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "887a61f4-2698-47eb-bbe4-0e0c5e992a37" (UID: "887a61f4-2698-47eb-bbe4-0e0c5e992a37"). 
InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:16:20.688259 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.688232 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "887a61f4-2698-47eb-bbe4-0e0c5e992a37" (UID: "887a61f4-2698-47eb-bbe4-0e0c5e992a37"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:16:20.689563 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.689538 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/887a61f4-2698-47eb-bbe4-0e0c5e992a37-kube-api-access-hfgn8" (OuterVolumeSpecName: "kube-api-access-hfgn8") pod "887a61f4-2698-47eb-bbe4-0e0c5e992a37" (UID: "887a61f4-2698-47eb-bbe4-0e0c5e992a37"). InnerVolumeSpecName "kube-api-access-hfgn8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:16:20.690431 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.690411 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "887a61f4-2698-47eb-bbe4-0e0c5e992a37" (UID: "887a61f4-2698-47eb-bbe4-0e0c5e992a37"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:16:20.788661 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.788629 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hfgn8\" (UniqueName: \"kubernetes.io/projected/887a61f4-2698-47eb-bbe4-0e0c5e992a37-kube-api-access-hfgn8\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:16:20.788661 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.788655 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tls-certs\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:16:20.788661 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.788665 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-uds\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:16:20.789006 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.788674 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-kserve-provision-location\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:16:20.789006 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:20.788684 2579 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/887a61f4-2698-47eb-bbe4-0e0c5e992a37-tokenizer-tmp\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:16:20.990659 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:16:20.990560 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 17:16:20.990659 ip-10-0-129-227 kubenswrapper[2579]: E0424 
17:16:20.990634 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs podName:e572e759-a847-4305-ada9-88093fdfc983 nodeName:}" failed. No retries permitted until 2026-04-24 17:16:22.990618203 +0000 UTC m=+2218.923687380 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "e572e759-a847-4305-ada9-88093fdfc983") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 17:16:21.389649 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:21.389596 2579 generic.go:358] "Generic (PLEG): container finished" podID="887a61f4-2698-47eb-bbe4-0e0c5e992a37" containerID="52d45f3ade1d49323992a4e0d383696f39dd18d591fcc2b39e8d7c723597b6e4" exitCode=0 Apr 24 17:16:21.390187 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:21.389665 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" event={"ID":"887a61f4-2698-47eb-bbe4-0e0c5e992a37","Type":"ContainerDied","Data":"52d45f3ade1d49323992a4e0d383696f39dd18d591fcc2b39e8d7c723597b6e4"} Apr 24 17:16:21.390187 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:21.389709 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" Apr 24 17:16:21.390187 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:21.389766 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q" event={"ID":"887a61f4-2698-47eb-bbe4-0e0c5e992a37","Type":"ContainerDied","Data":"6744b39e96585c92fce1e7c7e85c866c9ab65bce073aaffbaa4d7d5d83212dac"} Apr 24 17:16:21.390187 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:21.389798 2579 scope.go:117] "RemoveContainer" containerID="52d45f3ade1d49323992a4e0d383696f39dd18d591fcc2b39e8d7c723597b6e4" Apr 24 17:16:21.390486 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:21.390443 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="e572e759-a847-4305-ada9-88093fdfc983" containerName="main" containerID="cri-o://7a561e5b5daea667a72e7ac1a58a2d87b53ad97f995eb8ce26e9cd7bdd082a68" gracePeriod=30 Apr 24 17:16:21.417187 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:21.416902 2579 scope.go:117] "RemoveContainer" containerID="28c23beed7b7b02787a7d00f8c2435d748f62cfd5ed7e925b7001f9b725aef52" Apr 24 17:16:21.418356 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:21.418326 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q"] Apr 24 17:16:21.424026 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:21.424002 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schewgx7q"] Apr 24 17:16:21.426868 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:21.426840 2579 scope.go:117] "RemoveContainer" containerID="28cd01ede7c30b05c5e15068271e1564569be0315f4c9abeead94aa98681b234" Apr 24 17:16:21.435381 ip-10-0-129-227 
kubenswrapper[2579]: I0424 17:16:21.435361 2579 scope.go:117] "RemoveContainer" containerID="52d45f3ade1d49323992a4e0d383696f39dd18d591fcc2b39e8d7c723597b6e4" Apr 24 17:16:21.435652 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:16:21.435633 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52d45f3ade1d49323992a4e0d383696f39dd18d591fcc2b39e8d7c723597b6e4\": container with ID starting with 52d45f3ade1d49323992a4e0d383696f39dd18d591fcc2b39e8d7c723597b6e4 not found: ID does not exist" containerID="52d45f3ade1d49323992a4e0d383696f39dd18d591fcc2b39e8d7c723597b6e4" Apr 24 17:16:21.435724 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:21.435668 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52d45f3ade1d49323992a4e0d383696f39dd18d591fcc2b39e8d7c723597b6e4"} err="failed to get container status \"52d45f3ade1d49323992a4e0d383696f39dd18d591fcc2b39e8d7c723597b6e4\": rpc error: code = NotFound desc = could not find container \"52d45f3ade1d49323992a4e0d383696f39dd18d591fcc2b39e8d7c723597b6e4\": container with ID starting with 52d45f3ade1d49323992a4e0d383696f39dd18d591fcc2b39e8d7c723597b6e4 not found: ID does not exist" Apr 24 17:16:21.435724 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:21.435699 2579 scope.go:117] "RemoveContainer" containerID="28c23beed7b7b02787a7d00f8c2435d748f62cfd5ed7e925b7001f9b725aef52" Apr 24 17:16:21.436152 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:16:21.436135 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28c23beed7b7b02787a7d00f8c2435d748f62cfd5ed7e925b7001f9b725aef52\": container with ID starting with 28c23beed7b7b02787a7d00f8c2435d748f62cfd5ed7e925b7001f9b725aef52 not found: ID does not exist" containerID="28c23beed7b7b02787a7d00f8c2435d748f62cfd5ed7e925b7001f9b725aef52" Apr 24 17:16:21.436219 ip-10-0-129-227 kubenswrapper[2579]: 
I0424 17:16:21.436159 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28c23beed7b7b02787a7d00f8c2435d748f62cfd5ed7e925b7001f9b725aef52"} err="failed to get container status \"28c23beed7b7b02787a7d00f8c2435d748f62cfd5ed7e925b7001f9b725aef52\": rpc error: code = NotFound desc = could not find container \"28c23beed7b7b02787a7d00f8c2435d748f62cfd5ed7e925b7001f9b725aef52\": container with ID starting with 28c23beed7b7b02787a7d00f8c2435d748f62cfd5ed7e925b7001f9b725aef52 not found: ID does not exist" Apr 24 17:16:21.436219 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:21.436175 2579 scope.go:117] "RemoveContainer" containerID="28cd01ede7c30b05c5e15068271e1564569be0315f4c9abeead94aa98681b234" Apr 24 17:16:21.436398 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:16:21.436375 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28cd01ede7c30b05c5e15068271e1564569be0315f4c9abeead94aa98681b234\": container with ID starting with 28cd01ede7c30b05c5e15068271e1564569be0315f4c9abeead94aa98681b234 not found: ID does not exist" containerID="28cd01ede7c30b05c5e15068271e1564569be0315f4c9abeead94aa98681b234" Apr 24 17:16:21.436454 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:21.436408 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28cd01ede7c30b05c5e15068271e1564569be0315f4c9abeead94aa98681b234"} err="failed to get container status \"28cd01ede7c30b05c5e15068271e1564569be0315f4c9abeead94aa98681b234\": rpc error: code = NotFound desc = could not find container \"28cd01ede7c30b05c5e15068271e1564569be0315f4c9abeead94aa98681b234\": container with ID starting with 28cd01ede7c30b05c5e15068271e1564569be0315f4c9abeead94aa98681b234 not found: ID does not exist" Apr 24 17:16:22.622014 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:22.621972 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="887a61f4-2698-47eb-bbe4-0e0c5e992a37" path="/var/lib/kubelet/pods/887a61f4-2698-47eb-bbe4-0e0c5e992a37/volumes" Apr 24 17:16:23.009656 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:16:23.009617 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 17:16:23.009866 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:16:23.009724 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs podName:e572e759-a847-4305-ada9-88093fdfc983 nodeName:}" failed. No retries permitted until 2026-04-24 17:16:27.00970153 +0000 UTC m=+2222.942770731 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "e572e759-a847-4305-ada9-88093fdfc983") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 17:16:27.051646 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:16:27.051608 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 17:16:27.052078 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:16:27.051713 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs podName:e572e759-a847-4305-ada9-88093fdfc983 nodeName:}" failed. No retries permitted until 2026-04-24 17:16:35.051697742 +0000 UTC m=+2230.984766920 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "e572e759-a847-4305-ada9-88093fdfc983") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 17:16:35.132488 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:16:35.132396 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 17:16:35.132983 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:16:35.132495 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs podName:e572e759-a847-4305-ada9-88093fdfc983 nodeName:}" failed. No retries permitted until 2026-04-24 17:16:51.132478249 +0000 UTC m=+2247.065547425 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "e572e759-a847-4305-ada9-88093fdfc983") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 17:16:51.183805 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:16:51.183772 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 17:16:51.184235 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:16:51.183839 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs podName:e572e759-a847-4305-ada9-88093fdfc983 nodeName:}" failed. 
No retries permitted until 2026-04-24 17:17:23.183824095 +0000 UTC m=+2279.116893272 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "e572e759-a847-4305-ada9-88093fdfc983") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 17:16:51.508913 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:51.508883 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_e572e759-a847-4305-ada9-88093fdfc983/main/0.log" Apr 24 17:16:51.509257 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:51.509231 2579 generic.go:358] "Generic (PLEG): container finished" podID="e572e759-a847-4305-ada9-88093fdfc983" containerID="7a561e5b5daea667a72e7ac1a58a2d87b53ad97f995eb8ce26e9cd7bdd082a68" exitCode=137 Apr 24 17:16:51.509335 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:51.509311 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"e572e759-a847-4305-ada9-88093fdfc983","Type":"ContainerDied","Data":"7a561e5b5daea667a72e7ac1a58a2d87b53ad97f995eb8ce26e9cd7bdd082a68"} Apr 24 17:16:52.054396 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.054374 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_e572e759-a847-4305-ada9-88093fdfc983/main/0.log" Apr 24 17:16:52.054763 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.054723 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:16:52.090958 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.090931 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jcr7\" (UniqueName: \"kubernetes.io/projected/e572e759-a847-4305-ada9-88093fdfc983-kube-api-access-2jcr7\") pod \"e572e759-a847-4305-ada9-88093fdfc983\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " Apr 24 17:16:52.091124 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.090998 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-tmp-dir\") pod \"e572e759-a847-4305-ada9-88093fdfc983\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " Apr 24 17:16:52.091124 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.091030 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-home\") pod \"e572e759-a847-4305-ada9-88093fdfc983\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " Apr 24 17:16:52.091244 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.091130 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-kserve-provision-location\") pod \"e572e759-a847-4305-ada9-88093fdfc983\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " Apr 24 17:16:52.091244 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.091191 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs\") pod \"e572e759-a847-4305-ada9-88093fdfc983\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " Apr 24 17:16:52.091244 ip-10-0-129-227 
kubenswrapper[2579]: I0424 17:16:52.091229 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-home" (OuterVolumeSpecName: "home") pod "e572e759-a847-4305-ada9-88093fdfc983" (UID: "e572e759-a847-4305-ada9-88093fdfc983"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:16:52.091408 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.091231 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-dshm\") pod \"e572e759-a847-4305-ada9-88093fdfc983\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " Apr 24 17:16:52.091408 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.091288 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-model-cache\") pod \"e572e759-a847-4305-ada9-88093fdfc983\" (UID: \"e572e759-a847-4305-ada9-88093fdfc983\") " Apr 24 17:16:52.091408 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.091289 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "e572e759-a847-4305-ada9-88093fdfc983" (UID: "e572e759-a847-4305-ada9-88093fdfc983"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:16:52.091595 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.091507 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-model-cache" (OuterVolumeSpecName: "model-cache") pod "e572e759-a847-4305-ada9-88093fdfc983" (UID: "e572e759-a847-4305-ada9-88093fdfc983"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:16:52.091654 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.091609 2579 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-tmp-dir\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:16:52.091654 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.091628 2579 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-home\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:16:52.091654 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.091642 2579 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-model-cache\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:16:52.093435 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.093406 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-dshm" (OuterVolumeSpecName: "dshm") pod "e572e759-a847-4305-ada9-88093fdfc983" (UID: "e572e759-a847-4305-ada9-88093fdfc983"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:16:52.093656 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.093638 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e572e759-a847-4305-ada9-88093fdfc983-kube-api-access-2jcr7" (OuterVolumeSpecName: "kube-api-access-2jcr7") pod "e572e759-a847-4305-ada9-88093fdfc983" (UID: "e572e759-a847-4305-ada9-88093fdfc983"). InnerVolumeSpecName "kube-api-access-2jcr7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:16:52.093813 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.093793 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e572e759-a847-4305-ada9-88093fdfc983" (UID: "e572e759-a847-4305-ada9-88093fdfc983"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:16:52.147378 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.147334 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e572e759-a847-4305-ada9-88093fdfc983" (UID: "e572e759-a847-4305-ada9-88093fdfc983"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:16:52.192146 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.192076 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-kserve-provision-location\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:16:52.192146 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.192102 2579 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e572e759-a847-4305-ada9-88093fdfc983-tls-certs\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:16:52.192146 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.192112 2579 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e572e759-a847-4305-ada9-88093fdfc983-dshm\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:16:52.192146 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.192121 
2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2jcr7\" (UniqueName: \"kubernetes.io/projected/e572e759-a847-4305-ada9-88093fdfc983-kube-api-access-2jcr7\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\"" Apr 24 17:16:52.514460 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.514429 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_e572e759-a847-4305-ada9-88093fdfc983/main/0.log" Apr 24 17:16:52.514837 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.514816 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"e572e759-a847-4305-ada9-88093fdfc983","Type":"ContainerDied","Data":"c971ef5a8a5ad31c92dfbebb03375731b48602dbd63f2eeee1a01264e0c68ecc"} Apr 24 17:16:52.514930 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.514843 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 17:16:52.514930 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.514855 2579 scope.go:117] "RemoveContainer" containerID="7a561e5b5daea667a72e7ac1a58a2d87b53ad97f995eb8ce26e9cd7bdd082a68" Apr 24 17:16:52.524379 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.524361 2579 scope.go:117] "RemoveContainer" containerID="827c71718cb317b89b33bbc0e63e974afc4ef4df99581d49b4950799e2d57f40" Apr 24 17:16:52.537622 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.537596 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 17:16:52.541606 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.541578 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 17:16:52.620179 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:16:52.620145 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e572e759-a847-4305-ada9-88093fdfc983" path="/var/lib/kubelet/pods/e572e759-a847-4305-ada9-88093fdfc983/volumes" Apr 24 17:19:24.777667 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:19:24.777638 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log" Apr 24 17:19:24.785917 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:19:24.785898 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log" Apr 24 17:24:24.814181 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:24:24.814093 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log" Apr 24 17:24:24.823109 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:24:24.823083 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log" Apr 24 17:28:12.534585 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.534551 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-64756dc76b-pfqh6"] Apr 24 17:28:12.535109 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.534935 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="887a61f4-2698-47eb-bbe4-0e0c5e992a37" containerName="main" Apr 24 17:28:12.535109 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.534946 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="887a61f4-2698-47eb-bbe4-0e0c5e992a37" containerName="main" Apr 24 17:28:12.535109 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.534958 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e572e759-a847-4305-ada9-88093fdfc983" containerName="storage-initializer" Apr 24 17:28:12.535109 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.534964 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e572e759-a847-4305-ada9-88093fdfc983" containerName="storage-initializer" Apr 24 17:28:12.535109 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.534971 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="887a61f4-2698-47eb-bbe4-0e0c5e992a37" containerName="tokenizer" Apr 24 17:28:12.535109 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.534977 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="887a61f4-2698-47eb-bbe4-0e0c5e992a37" containerName="tokenizer" Apr 24 17:28:12.535109 ip-10-0-129-227 kubenswrapper[2579]: I0424 
17:28:12.534991 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e572e759-a847-4305-ada9-88093fdfc983" containerName="main" Apr 24 17:28:12.535109 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.534996 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e572e759-a847-4305-ada9-88093fdfc983" containerName="main" Apr 24 17:28:12.535109 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.535011 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="887a61f4-2698-47eb-bbe4-0e0c5e992a37" containerName="storage-initializer" Apr 24 17:28:12.535109 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.535016 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="887a61f4-2698-47eb-bbe4-0e0c5e992a37" containerName="storage-initializer" Apr 24 17:28:12.535109 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.535074 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="887a61f4-2698-47eb-bbe4-0e0c5e992a37" containerName="tokenizer" Apr 24 17:28:12.535109 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.535083 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e572e759-a847-4305-ada9-88093fdfc983" containerName="main" Apr 24 17:28:12.535109 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.535092 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="887a61f4-2698-47eb-bbe4-0e0c5e992a37" containerName="main" Apr 24 17:28:12.538140 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.538124 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-64756dc76b-pfqh6" Apr 24 17:28:12.540448 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.540422 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 17:28:12.541149 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.541130 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-nrxvf\"" Apr 24 17:28:12.541223 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.541156 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 17:28:12.541223 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.541163 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 17:28:12.555246 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.555223 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-64756dc76b-pfqh6"] Apr 24 17:28:12.617957 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.617920 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7zgh\" (UniqueName: \"kubernetes.io/projected/cfb9553a-a40a-4a30-8058-bed4e841e974-kube-api-access-g7zgh\") pod \"llmisvc-controller-manager-64756dc76b-pfqh6\" (UID: \"cfb9553a-a40a-4a30-8058-bed4e841e974\") " pod="kserve/llmisvc-controller-manager-64756dc76b-pfqh6" Apr 24 17:28:12.617957 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.617960 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfb9553a-a40a-4a30-8058-bed4e841e974-cert\") pod \"llmisvc-controller-manager-64756dc76b-pfqh6\" (UID: \"cfb9553a-a40a-4a30-8058-bed4e841e974\") " pod="kserve/llmisvc-controller-manager-64756dc76b-pfqh6" 
Apr 24 17:28:12.718854 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.718821 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7zgh\" (UniqueName: \"kubernetes.io/projected/cfb9553a-a40a-4a30-8058-bed4e841e974-kube-api-access-g7zgh\") pod \"llmisvc-controller-manager-64756dc76b-pfqh6\" (UID: \"cfb9553a-a40a-4a30-8058-bed4e841e974\") " pod="kserve/llmisvc-controller-manager-64756dc76b-pfqh6"
Apr 24 17:28:12.719018 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.718864 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfb9553a-a40a-4a30-8058-bed4e841e974-cert\") pod \"llmisvc-controller-manager-64756dc76b-pfqh6\" (UID: \"cfb9553a-a40a-4a30-8058-bed4e841e974\") " pod="kserve/llmisvc-controller-manager-64756dc76b-pfqh6"
Apr 24 17:28:12.721417 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.721389 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfb9553a-a40a-4a30-8058-bed4e841e974-cert\") pod \"llmisvc-controller-manager-64756dc76b-pfqh6\" (UID: \"cfb9553a-a40a-4a30-8058-bed4e841e974\") " pod="kserve/llmisvc-controller-manager-64756dc76b-pfqh6"
Apr 24 17:28:12.726782 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.726757 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7zgh\" (UniqueName: \"kubernetes.io/projected/cfb9553a-a40a-4a30-8058-bed4e841e974-kube-api-access-g7zgh\") pod \"llmisvc-controller-manager-64756dc76b-pfqh6\" (UID: \"cfb9553a-a40a-4a30-8058-bed4e841e974\") " pod="kserve/llmisvc-controller-manager-64756dc76b-pfqh6"
Apr 24 17:28:12.848537 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:12.848421 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-64756dc76b-pfqh6"
Apr 24 17:28:13.187374 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:13.187328 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-64756dc76b-pfqh6"]
Apr 24 17:28:13.188473 ip-10-0-129-227 kubenswrapper[2579]: W0424 17:28:13.188446 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcfb9553a_a40a_4a30_8058_bed4e841e974.slice/crio-7dd2dab833de6b6f3ee478535e964c9515a13fac90c4455cdeb76c96166c7401 WatchSource:0}: Error finding container 7dd2dab833de6b6f3ee478535e964c9515a13fac90c4455cdeb76c96166c7401: Status 404 returned error can't find the container with id 7dd2dab833de6b6f3ee478535e964c9515a13fac90c4455cdeb76c96166c7401
Apr 24 17:28:13.189648 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:13.189628 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 17:28:13.207251 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:13.207227 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-64756dc76b-pfqh6" event={"ID":"cfb9553a-a40a-4a30-8058-bed4e841e974","Type":"ContainerStarted","Data":"7dd2dab833de6b6f3ee478535e964c9515a13fac90c4455cdeb76c96166c7401"}
Apr 24 17:28:17.225914 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:17.225879 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-64756dc76b-pfqh6" event={"ID":"cfb9553a-a40a-4a30-8058-bed4e841e974","Type":"ContainerStarted","Data":"a2176d49e0d291b154140ede96c10ca35fdba17c542a7f921e9ca4da6b692722"}
Apr 24 17:28:17.226296 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:17.225995 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-64756dc76b-pfqh6"
Apr 24 17:28:17.249791 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:17.249723 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-64756dc76b-pfqh6" podStartSLOduration=1.4272736670000001 podStartE2EDuration="5.249709334s" podCreationTimestamp="2026-04-24 17:28:12 +0000 UTC" firstStartedPulling="2026-04-24 17:28:13.189801772 +0000 UTC m=+2929.122870948" lastFinishedPulling="2026-04-24 17:28:17.012237426 +0000 UTC m=+2932.945306615" observedRunningTime="2026-04-24 17:28:17.247146482 +0000 UTC m=+2933.180215680" watchObservedRunningTime="2026-04-24 17:28:17.249709334 +0000 UTC m=+2933.182778532"
Apr 24 17:28:48.231765 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:28:48.231660 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-64756dc76b-pfqh6"
Apr 24 17:29:19.421518 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:19.421482 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vlv4f/must-gather-s4nq9"]
Apr 24 17:29:19.425076 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:19.425060 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vlv4f/must-gather-s4nq9"
Apr 24 17:29:19.427604 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:19.427586 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vlv4f\"/\"kube-root-ca.crt\""
Apr 24 17:29:19.428296 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:19.428263 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vlv4f\"/\"default-dockercfg-jqbx2\""
Apr 24 17:29:19.428296 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:19.428277 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vlv4f\"/\"openshift-service-ca.crt\""
Apr 24 17:29:19.434529 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:19.434508 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vlv4f/must-gather-s4nq9"]
Apr 24 17:29:19.498056 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:19.498019 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdb4f\" (UniqueName: \"kubernetes.io/projected/5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc-kube-api-access-hdb4f\") pod \"must-gather-s4nq9\" (UID: \"5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc\") " pod="openshift-must-gather-vlv4f/must-gather-s4nq9"
Apr 24 17:29:19.498239 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:19.498081 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc-must-gather-output\") pod \"must-gather-s4nq9\" (UID: \"5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc\") " pod="openshift-must-gather-vlv4f/must-gather-s4nq9"
Apr 24 17:29:19.599294 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:19.599259 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdb4f\" (UniqueName: \"kubernetes.io/projected/5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc-kube-api-access-hdb4f\") pod \"must-gather-s4nq9\" (UID: \"5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc\") " pod="openshift-must-gather-vlv4f/must-gather-s4nq9"
Apr 24 17:29:19.599516 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:19.599307 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc-must-gather-output\") pod \"must-gather-s4nq9\" (UID: \"5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc\") " pod="openshift-must-gather-vlv4f/must-gather-s4nq9"
Apr 24 17:29:19.599602 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:19.599585 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc-must-gather-output\") pod \"must-gather-s4nq9\" (UID: \"5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc\") " pod="openshift-must-gather-vlv4f/must-gather-s4nq9"
Apr 24 17:29:19.608257 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:19.608225 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdb4f\" (UniqueName: \"kubernetes.io/projected/5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc-kube-api-access-hdb4f\") pod \"must-gather-s4nq9\" (UID: \"5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc\") " pod="openshift-must-gather-vlv4f/must-gather-s4nq9"
Apr 24 17:29:19.735010 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:19.734909 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vlv4f/must-gather-s4nq9"
Apr 24 17:29:19.866436 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:19.866409 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vlv4f/must-gather-s4nq9"]
Apr 24 17:29:19.867894 ip-10-0-129-227 kubenswrapper[2579]: W0424 17:29:19.867867 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5773b75f_e5b7_4325_a0e9_fea7d6b8c4dc.slice/crio-25e67bce1e807fe8729dd2c7e27bfabe758aff72f452585183ac84d7e220f42f WatchSource:0}: Error finding container 25e67bce1e807fe8729dd2c7e27bfabe758aff72f452585183ac84d7e220f42f: Status 404 returned error can't find the container with id 25e67bce1e807fe8729dd2c7e27bfabe758aff72f452585183ac84d7e220f42f
Apr 24 17:29:20.474190 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:20.474157 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vlv4f/must-gather-s4nq9" event={"ID":"5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc","Type":"ContainerStarted","Data":"25e67bce1e807fe8729dd2c7e27bfabe758aff72f452585183ac84d7e220f42f"}
Apr 24 17:29:24.905620 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:24.905513 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log"
Apr 24 17:29:24.914454 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:24.905651 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log"
Apr 24 17:29:25.499716 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:25.499679 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vlv4f/must-gather-s4nq9" event={"ID":"5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc","Type":"ContainerStarted","Data":"7587965de0cd1453016fd050cf46814c080e201920e31ae41510a1b2f5489298"}
Apr 24 17:29:25.500023 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:25.500004 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vlv4f/must-gather-s4nq9" event={"ID":"5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc","Type":"ContainerStarted","Data":"87998d36200ee93759cb82ece69c2d5f45ba2164211a75009022a16acef21436"}
Apr 24 17:29:25.514102 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:25.514002 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vlv4f/must-gather-s4nq9" podStartSLOduration=1.437326782 podStartE2EDuration="6.513987326s" podCreationTimestamp="2026-04-24 17:29:19 +0000 UTC" firstStartedPulling="2026-04-24 17:29:19.869930147 +0000 UTC m=+2995.802999326" lastFinishedPulling="2026-04-24 17:29:24.946590689 +0000 UTC m=+3000.879659870" observedRunningTime="2026-04-24 17:29:25.513815794 +0000 UTC m=+3001.446884992" watchObservedRunningTime="2026-04-24 17:29:25.513987326 +0000 UTC m=+3001.447056525"
Apr 24 17:29:51.008707 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:51.008671 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-hj6z7_378ae908-8a8e-4bcc-b882-05884531ffe9/kuadrant-console-plugin/0.log"
Apr 24 17:29:51.067536 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:51.067505 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-q4wx6_9dcab55d-db4d-417d-85ce-f5473dbd2ac5/limitador/0.log"
Apr 24 17:29:51.914212 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:51.914183 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-hj6z7_378ae908-8a8e-4bcc-b882-05884531ffe9/kuadrant-console-plugin/0.log"
Apr 24 17:29:51.959936 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:51.959909 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-q4wx6_9dcab55d-db4d-417d-85ce-f5473dbd2ac5/limitador/0.log"
Apr 24 17:29:52.801561 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:52.801531 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-hj6z7_378ae908-8a8e-4bcc-b882-05884531ffe9/kuadrant-console-plugin/0.log"
Apr 24 17:29:52.845902 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:52.845872 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-q4wx6_9dcab55d-db4d-417d-85ce-f5473dbd2ac5/limitador/0.log"
Apr 24 17:29:53.679505 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:53.679479 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-hj6z7_378ae908-8a8e-4bcc-b882-05884531ffe9/kuadrant-console-plugin/0.log"
Apr 24 17:29:53.721440 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:53.721410 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-q4wx6_9dcab55d-db4d-417d-85ce-f5473dbd2ac5/limitador/0.log"
Apr 24 17:29:54.554536 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:54.554507 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-hj6z7_378ae908-8a8e-4bcc-b882-05884531ffe9/kuadrant-console-plugin/0.log"
Apr 24 17:29:54.596978 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:54.596945 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-q4wx6_9dcab55d-db4d-417d-85ce-f5473dbd2ac5/limitador/0.log"
Apr 24 17:29:55.628747 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:55.628706 2579 generic.go:358] "Generic (PLEG): container finished" podID="5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc" containerID="87998d36200ee93759cb82ece69c2d5f45ba2164211a75009022a16acef21436" exitCode=0
Apr 24 17:29:55.629154 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:55.628760 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vlv4f/must-gather-s4nq9" event={"ID":"5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc","Type":"ContainerDied","Data":"87998d36200ee93759cb82ece69c2d5f45ba2164211a75009022a16acef21436"}
Apr 24 17:29:55.629154 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:55.629091 2579 scope.go:117] "RemoveContainer" containerID="87998d36200ee93759cb82ece69c2d5f45ba2164211a75009022a16acef21436"
Apr 24 17:29:56.235017 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:56.234992 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vlv4f_must-gather-s4nq9_5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc/gather/0.log"
Apr 24 17:29:56.855720 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:56.855683 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-47mg2/must-gather-c9qsm"]
Apr 24 17:29:56.860751 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:56.860705 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-47mg2/must-gather-c9qsm"
Apr 24 17:29:56.863208 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:56.863186 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-47mg2\"/\"kube-root-ca.crt\""
Apr 24 17:29:56.863312 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:56.863230 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-47mg2\"/\"openshift-service-ca.crt\""
Apr 24 17:29:56.863903 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:56.863886 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-47mg2\"/\"default-dockercfg-bn9xv\""
Apr 24 17:29:56.867524 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:56.867504 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-47mg2/must-gather-c9qsm"]
Apr 24 17:29:56.948847 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:56.948797 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6013abf9-bf71-417b-8114-deeb1a492cbc-must-gather-output\") pod \"must-gather-c9qsm\" (UID: \"6013abf9-bf71-417b-8114-deeb1a492cbc\") " pod="openshift-must-gather-47mg2/must-gather-c9qsm"
Apr 24 17:29:56.949042 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:56.948866 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pwv6\" (UniqueName: \"kubernetes.io/projected/6013abf9-bf71-417b-8114-deeb1a492cbc-kube-api-access-4pwv6\") pod \"must-gather-c9qsm\" (UID: \"6013abf9-bf71-417b-8114-deeb1a492cbc\") " pod="openshift-must-gather-47mg2/must-gather-c9qsm"
Apr 24 17:29:57.049854 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:57.049818 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6013abf9-bf71-417b-8114-deeb1a492cbc-must-gather-output\") pod \"must-gather-c9qsm\" (UID: \"6013abf9-bf71-417b-8114-deeb1a492cbc\") " pod="openshift-must-gather-47mg2/must-gather-c9qsm"
Apr 24 17:29:57.049854 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:57.049860 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pwv6\" (UniqueName: \"kubernetes.io/projected/6013abf9-bf71-417b-8114-deeb1a492cbc-kube-api-access-4pwv6\") pod \"must-gather-c9qsm\" (UID: \"6013abf9-bf71-417b-8114-deeb1a492cbc\") " pod="openshift-must-gather-47mg2/must-gather-c9qsm"
Apr 24 17:29:57.050180 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:57.050161 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6013abf9-bf71-417b-8114-deeb1a492cbc-must-gather-output\") pod \"must-gather-c9qsm\" (UID: \"6013abf9-bf71-417b-8114-deeb1a492cbc\") " pod="openshift-must-gather-47mg2/must-gather-c9qsm"
Apr 24 17:29:57.064456 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:57.064426 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pwv6\" (UniqueName: \"kubernetes.io/projected/6013abf9-bf71-417b-8114-deeb1a492cbc-kube-api-access-4pwv6\") pod \"must-gather-c9qsm\" (UID: \"6013abf9-bf71-417b-8114-deeb1a492cbc\") " pod="openshift-must-gather-47mg2/must-gather-c9qsm"
Apr 24 17:29:57.171882 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:57.171793 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-47mg2/must-gather-c9qsm"
Apr 24 17:29:57.301009 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:57.300980 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-47mg2/must-gather-c9qsm"]
Apr 24 17:29:57.302491 ip-10-0-129-227 kubenswrapper[2579]: W0424 17:29:57.302466 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6013abf9_bf71_417b_8114_deeb1a492cbc.slice/crio-582420d7d178553afe08b312a692edd931101d123521ed14e68232872893aaa5 WatchSource:0}: Error finding container 582420d7d178553afe08b312a692edd931101d123521ed14e68232872893aaa5: Status 404 returned error can't find the container with id 582420d7d178553afe08b312a692edd931101d123521ed14e68232872893aaa5
Apr 24 17:29:57.637925 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:57.637887 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-47mg2/must-gather-c9qsm" event={"ID":"6013abf9-bf71-417b-8114-deeb1a492cbc","Type":"ContainerStarted","Data":"582420d7d178553afe08b312a692edd931101d123521ed14e68232872893aaa5"}
Apr 24 17:29:59.652402 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:59.652355 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-47mg2/must-gather-c9qsm" event={"ID":"6013abf9-bf71-417b-8114-deeb1a492cbc","Type":"ContainerStarted","Data":"ad4bd73bd32a46acb6d59ff6755e0c1d17b2e3a66b44a765b1ccc9668ad5d8c9"}
Apr 24 17:29:59.653021 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:59.652970 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-47mg2/must-gather-c9qsm" event={"ID":"6013abf9-bf71-417b-8114-deeb1a492cbc","Type":"ContainerStarted","Data":"c9c9f40e9fb99de167afc4138d8c412d83607d3adf18c11d1d52ee5822ee0114"}
Apr 24 17:29:59.671189 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:29:59.671132 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-47mg2/must-gather-c9qsm" podStartSLOduration=2.442235443 podStartE2EDuration="3.6711136s" podCreationTimestamp="2026-04-24 17:29:56 +0000 UTC" firstStartedPulling="2026-04-24 17:29:57.304261009 +0000 UTC m=+3033.237330186" lastFinishedPulling="2026-04-24 17:29:58.533139163 +0000 UTC m=+3034.466208343" observedRunningTime="2026-04-24 17:29:59.66713252 +0000 UTC m=+3035.600201722" watchObservedRunningTime="2026-04-24 17:29:59.6711136 +0000 UTC m=+3035.604182800"
Apr 24 17:30:00.106884 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:00.106851 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dpptg_756e5676-cd3e-4b45-abfe-777889ee4865/global-pull-secret-syncer/0.log"
Apr 24 17:30:00.174205 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:00.174155 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-b565t_d7c17281-bc8e-4ae5-bf1c-eaf465abf88b/konnectivity-agent/0.log"
Apr 24 17:30:00.257978 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:00.257946 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-227.ec2.internal_abe1697a38d64560f353e38f63759680/haproxy/0.log"
Apr 24 17:30:01.709503 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:01.709462 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vlv4f/must-gather-s4nq9"]
Apr 24 17:30:01.710112 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:01.709767 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-vlv4f/must-gather-s4nq9" podUID="5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc" containerName="copy" containerID="cri-o://7587965de0cd1453016fd050cf46814c080e201920e31ae41510a1b2f5489298" gracePeriod=2
Apr 24 17:30:01.711826 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:01.711790 2579 status_manager.go:895] "Failed to get status for pod" podUID="5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc" pod="openshift-must-gather-vlv4f/must-gather-s4nq9" err="pods \"must-gather-s4nq9\" is forbidden: User \"system:node:ip-10-0-129-227.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vlv4f\": no relationship found between node 'ip-10-0-129-227.ec2.internal' and this object"
Apr 24 17:30:01.715274 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:01.715245 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vlv4f/must-gather-s4nq9"]
Apr 24 17:30:02.121481 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.121454 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vlv4f_must-gather-s4nq9_5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc/copy/0.log"
Apr 24 17:30:02.122201 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.122182 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vlv4f/must-gather-s4nq9"
Apr 24 17:30:02.125074 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.125045 2579 status_manager.go:895] "Failed to get status for pod" podUID="5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc" pod="openshift-must-gather-vlv4f/must-gather-s4nq9" err="pods \"must-gather-s4nq9\" is forbidden: User \"system:node:ip-10-0-129-227.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vlv4f\": no relationship found between node 'ip-10-0-129-227.ec2.internal' and this object"
Apr 24 17:30:02.204978 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.204933 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdb4f\" (UniqueName: \"kubernetes.io/projected/5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc-kube-api-access-hdb4f\") pod \"5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc\" (UID: \"5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc\") "
Apr 24 17:30:02.205269 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.205236 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc-must-gather-output\") pod \"5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc\" (UID: \"5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc\") "
Apr 24 17:30:02.214983 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.214282 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc" (UID: "5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 17:30:02.220849 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.217290 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc-kube-api-access-hdb4f" (OuterVolumeSpecName: "kube-api-access-hdb4f") pod "5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc" (UID: "5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc"). InnerVolumeSpecName "kube-api-access-hdb4f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 17:30:02.316049 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.316005 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hdb4f\" (UniqueName: \"kubernetes.io/projected/5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc-kube-api-access-hdb4f\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\""
Apr 24 17:30:02.316049 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.316051 2579 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc-must-gather-output\") on node \"ip-10-0-129-227.ec2.internal\" DevicePath \"\""
Apr 24 17:30:02.625113 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.625070 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc" path="/var/lib/kubelet/pods/5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc/volumes"
Apr 24 17:30:02.694462 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.694306 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vlv4f_must-gather-s4nq9_5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc/copy/0.log"
Apr 24 17:30:02.695966 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.694757 2579 generic.go:358] "Generic (PLEG): container finished" podID="5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc" containerID="7587965de0cd1453016fd050cf46814c080e201920e31ae41510a1b2f5489298" exitCode=143
Apr 24 17:30:02.695966 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.694819 2579 scope.go:117] "RemoveContainer" containerID="7587965de0cd1453016fd050cf46814c080e201920e31ae41510a1b2f5489298"
Apr 24 17:30:02.695966 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.694961 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vlv4f/must-gather-s4nq9"
Apr 24 17:30:02.722392 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.721412 2579 scope.go:117] "RemoveContainer" containerID="87998d36200ee93759cb82ece69c2d5f45ba2164211a75009022a16acef21436"
Apr 24 17:30:02.767245 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.767218 2579 scope.go:117] "RemoveContainer" containerID="7587965de0cd1453016fd050cf46814c080e201920e31ae41510a1b2f5489298"
Apr 24 17:30:02.768219 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:30:02.767899 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7587965de0cd1453016fd050cf46814c080e201920e31ae41510a1b2f5489298\": container with ID starting with 7587965de0cd1453016fd050cf46814c080e201920e31ae41510a1b2f5489298 not found: ID does not exist" containerID="7587965de0cd1453016fd050cf46814c080e201920e31ae41510a1b2f5489298"
Apr 24 17:30:02.768219 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.767939 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7587965de0cd1453016fd050cf46814c080e201920e31ae41510a1b2f5489298"} err="failed to get container status \"7587965de0cd1453016fd050cf46814c080e201920e31ae41510a1b2f5489298\": rpc error: code = NotFound desc = could not find container \"7587965de0cd1453016fd050cf46814c080e201920e31ae41510a1b2f5489298\": container with ID starting with 7587965de0cd1453016fd050cf46814c080e201920e31ae41510a1b2f5489298 not found: ID does not exist"
Apr 24 17:30:02.768219 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.767966 2579 scope.go:117] "RemoveContainer" containerID="87998d36200ee93759cb82ece69c2d5f45ba2164211a75009022a16acef21436"
Apr 24 17:30:02.768601 ip-10-0-129-227 kubenswrapper[2579]: E0424 17:30:02.768527 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87998d36200ee93759cb82ece69c2d5f45ba2164211a75009022a16acef21436\": container with ID starting with 87998d36200ee93759cb82ece69c2d5f45ba2164211a75009022a16acef21436 not found: ID does not exist" containerID="87998d36200ee93759cb82ece69c2d5f45ba2164211a75009022a16acef21436"
Apr 24 17:30:02.768601 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:02.768567 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87998d36200ee93759cb82ece69c2d5f45ba2164211a75009022a16acef21436"} err="failed to get container status \"87998d36200ee93759cb82ece69c2d5f45ba2164211a75009022a16acef21436\": rpc error: code = NotFound desc = could not find container \"87998d36200ee93759cb82ece69c2d5f45ba2164211a75009022a16acef21436\": container with ID starting with 87998d36200ee93759cb82ece69c2d5f45ba2164211a75009022a16acef21436 not found: ID does not exist"
Apr 24 17:30:04.738947 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:04.738918 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-hj6z7_378ae908-8a8e-4bcc-b882-05884531ffe9/kuadrant-console-plugin/0.log"
Apr 24 17:30:04.816883 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:04.816855 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-q4wx6_9dcab55d-db4d-417d-85ce-f5473dbd2ac5/limitador/0.log"
Apr 24 17:30:06.039445 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:06.039286 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-5nnkx_a32d7d16-b384-47fc-a565-2b51f6f8c945/cluster-monitoring-operator/0.log"
Apr 24 17:30:06.150100 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:06.149995 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7468bc468d-h9m7f_1892d90d-bef5-49d7-87af-a352bf60c2a3/metrics-server/0.log"
Apr 24 17:30:06.179305 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:06.179278 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-rpv8t_d732ca6b-ee1c-4dac-8c6b-17d88a89520a/monitoring-plugin/0.log"
Apr 24 17:30:06.222442 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:06.222413 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2tmrq_e68c601e-ace9-461f-8270-139af643bb24/node-exporter/0.log"
Apr 24 17:30:06.245068 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:06.245037 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2tmrq_e68c601e-ace9-461f-8270-139af643bb24/kube-rbac-proxy/0.log"
Apr 24 17:30:06.266785 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:06.266722 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2tmrq_e68c601e-ace9-461f-8270-139af643bb24/init-textfile/0.log"
Apr 24 17:30:06.443201 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:06.443111 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-cfv6k_08d16e0f-be67-4d98-93a9-25a13038cbb7/kube-rbac-proxy-main/0.log"
Apr 24 17:30:06.470582 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:06.470557 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-cfv6k_08d16e0f-be67-4d98-93a9-25a13038cbb7/kube-rbac-proxy-self/0.log"
Apr 24 17:30:06.506591 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:06.506564 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-cfv6k_08d16e0f-be67-4d98-93a9-25a13038cbb7/openshift-state-metrics/0.log"
Apr 24 17:30:06.739808 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:06.739682 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-chvg7_fe961a5e-d33f-4c72-a253-e83a37663457/prometheus-operator/0.log"
Apr 24 17:30:06.758262 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:06.758223 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-chvg7_fe961a5e-d33f-4c72-a253-e83a37663457/kube-rbac-proxy/0.log"
Apr 24 17:30:06.892951 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:06.892922 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65f49765b8-dzndp_6c27e1fe-fd53-4300-9942-8c630cdcafe5/thanos-query/0.log"
Apr 24 17:30:06.928614 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:06.928587 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65f49765b8-dzndp_6c27e1fe-fd53-4300-9942-8c630cdcafe5/kube-rbac-proxy-web/0.log"
Apr 24 17:30:06.949939 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:06.949888 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65f49765b8-dzndp_6c27e1fe-fd53-4300-9942-8c630cdcafe5/kube-rbac-proxy/0.log"
Apr 24 17:30:06.970542 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:06.970502 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65f49765b8-dzndp_6c27e1fe-fd53-4300-9942-8c630cdcafe5/prom-label-proxy/0.log"
Apr 24 17:30:06.989724 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:06.989689 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65f49765b8-dzndp_6c27e1fe-fd53-4300-9942-8c630cdcafe5/kube-rbac-proxy-rules/0.log"
Apr 24 17:30:07.012846 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:07.012818 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65f49765b8-dzndp_6c27e1fe-fd53-4300-9942-8c630cdcafe5/kube-rbac-proxy-metrics/0.log"
Apr 24 17:30:08.649484 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:08.649450 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/2.log"
Apr 24 17:30:08.656120 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:08.656096 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-52mlp_9a9d0510-e8e5-4e24-bd5a-c4e88d66aa80/console-operator/3.log"
Apr 24 17:30:09.098535 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.098507 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-n8jmb_9d205aa0-a444-4516-a2f2-f9d5e15b0a24/download-server/0.log"
Apr 24 17:30:09.366622 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.366535 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf"]
Apr 24 17:30:09.367117 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.367095 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc" containerName="copy"
Apr 24 17:30:09.367230 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.367120 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc" containerName="copy"
Apr 24 17:30:09.367230 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.367148 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc" containerName="gather"
Apr 24 17:30:09.367230 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.367155 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc" containerName="gather"
Apr 24 17:30:09.367401 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.367298 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc" containerName="gather"
Apr 24 17:30:09.367401 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.367316 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5773b75f-e5b7-4325-a0e9-fea7d6b8c4dc" containerName="copy"
Apr 24 17:30:09.371859 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.371834 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf"
Apr 24 17:30:09.382483 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.382449 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf"]
Apr 24 17:30:09.499028 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.498994 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qzvm\" (UniqueName: \"kubernetes.io/projected/e8da943c-25fb-4880-986e-d7631d15d89d-kube-api-access-8qzvm\") pod \"perf-node-gather-daemonset-z5rxf\" (UID: \"e8da943c-25fb-4880-986e-d7631d15d89d\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf"
Apr 24 17:30:09.499252 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.499130 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8da943c-25fb-4880-986e-d7631d15d89d-lib-modules\") pod \"perf-node-gather-daemonset-z5rxf\" (UID: \"e8da943c-25fb-4880-986e-d7631d15d89d\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf"
Apr 24 17:30:09.499252 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.499168 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8da943c-25fb-4880-986e-d7631d15d89d-sys\") pod \"perf-node-gather-daemonset-z5rxf\" (UID: \"e8da943c-25fb-4880-986e-d7631d15d89d\") " 
pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" Apr 24 17:30:09.499252 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.499232 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e8da943c-25fb-4880-986e-d7631d15d89d-proc\") pod \"perf-node-gather-daemonset-z5rxf\" (UID: \"e8da943c-25fb-4880-986e-d7631d15d89d\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" Apr 24 17:30:09.499443 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.499288 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e8da943c-25fb-4880-986e-d7631d15d89d-podres\") pod \"perf-node-gather-daemonset-z5rxf\" (UID: \"e8da943c-25fb-4880-986e-d7631d15d89d\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" Apr 24 17:30:09.571197 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.571170 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-84cbm_90bd04ca-0a65-47d9-a182-69a15ab185c9/volume-data-source-validator/0.log" Apr 24 17:30:09.600325 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.600290 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e8da943c-25fb-4880-986e-d7631d15d89d-podres\") pod \"perf-node-gather-daemonset-z5rxf\" (UID: \"e8da943c-25fb-4880-986e-d7631d15d89d\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" Apr 24 17:30:09.600325 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.600341 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qzvm\" (UniqueName: \"kubernetes.io/projected/e8da943c-25fb-4880-986e-d7631d15d89d-kube-api-access-8qzvm\") pod \"perf-node-gather-daemonset-z5rxf\" (UID: 
\"e8da943c-25fb-4880-986e-d7631d15d89d\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" Apr 24 17:30:09.600587 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.600382 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8da943c-25fb-4880-986e-d7631d15d89d-lib-modules\") pod \"perf-node-gather-daemonset-z5rxf\" (UID: \"e8da943c-25fb-4880-986e-d7631d15d89d\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" Apr 24 17:30:09.600587 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.600398 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8da943c-25fb-4880-986e-d7631d15d89d-sys\") pod \"perf-node-gather-daemonset-z5rxf\" (UID: \"e8da943c-25fb-4880-986e-d7631d15d89d\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" Apr 24 17:30:09.600587 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.600425 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e8da943c-25fb-4880-986e-d7631d15d89d-proc\") pod \"perf-node-gather-daemonset-z5rxf\" (UID: \"e8da943c-25fb-4880-986e-d7631d15d89d\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" Apr 24 17:30:09.600587 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.600469 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e8da943c-25fb-4880-986e-d7631d15d89d-podres\") pod \"perf-node-gather-daemonset-z5rxf\" (UID: \"e8da943c-25fb-4880-986e-d7631d15d89d\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" Apr 24 17:30:09.600587 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.600498 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/e8da943c-25fb-4880-986e-d7631d15d89d-sys\") pod \"perf-node-gather-daemonset-z5rxf\" (UID: \"e8da943c-25fb-4880-986e-d7631d15d89d\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" Apr 24 17:30:09.600587 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.600513 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e8da943c-25fb-4880-986e-d7631d15d89d-proc\") pod \"perf-node-gather-daemonset-z5rxf\" (UID: \"e8da943c-25fb-4880-986e-d7631d15d89d\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" Apr 24 17:30:09.600587 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.600541 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8da943c-25fb-4880-986e-d7631d15d89d-lib-modules\") pod \"perf-node-gather-daemonset-z5rxf\" (UID: \"e8da943c-25fb-4880-986e-d7631d15d89d\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" Apr 24 17:30:09.612332 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.612306 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qzvm\" (UniqueName: \"kubernetes.io/projected/e8da943c-25fb-4880-986e-d7631d15d89d-kube-api-access-8qzvm\") pod \"perf-node-gather-daemonset-z5rxf\" (UID: \"e8da943c-25fb-4880-986e-d7631d15d89d\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" Apr 24 17:30:09.685300 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.685209 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" Apr 24 17:30:09.837923 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:09.837889 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf"] Apr 24 17:30:09.843185 ip-10-0-129-227 kubenswrapper[2579]: W0424 17:30:09.843157 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode8da943c_25fb_4880_986e_d7631d15d89d.slice/crio-3d5986ffdfd5eed773c940e6b21ff6f8c23228f409cffa0b5b2c9c9cba37622d WatchSource:0}: Error finding container 3d5986ffdfd5eed773c940e6b21ff6f8c23228f409cffa0b5b2c9c9cba37622d: Status 404 returned error can't find the container with id 3d5986ffdfd5eed773c940e6b21ff6f8c23228f409cffa0b5b2c9c9cba37622d Apr 24 17:30:10.361845 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:10.361815 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pkmpl_fc3c012b-d60b-47fa-aa82-a2d3bb5649b3/dns/0.log" Apr 24 17:30:10.382564 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:10.382540 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pkmpl_fc3c012b-d60b-47fa-aa82-a2d3bb5649b3/kube-rbac-proxy/0.log" Apr 24 17:30:10.475197 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:10.475173 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7grsg_53e5c63d-2459-4e1a-be74-d81070bbbeff/dns-node-resolver/0.log" Apr 24 17:30:10.738047 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:10.737963 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" event={"ID":"e8da943c-25fb-4880-986e-d7631d15d89d","Type":"ContainerStarted","Data":"6bd8ccfed17a0815dce4880151da6e8e54c606898406a18a45ab8276ed4120b5"} Apr 24 17:30:10.738852 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:10.738826 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" event={"ID":"e8da943c-25fb-4880-986e-d7631d15d89d","Type":"ContainerStarted","Data":"3d5986ffdfd5eed773c940e6b21ff6f8c23228f409cffa0b5b2c9c9cba37622d"} Apr 24 17:30:10.738852 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:10.738854 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" Apr 24 17:30:10.756323 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:10.756265 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" podStartSLOduration=1.7562474959999999 podStartE2EDuration="1.756247496s" podCreationTimestamp="2026-04-24 17:30:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:30:10.755175858 +0000 UTC m=+3046.688245098" watchObservedRunningTime="2026-04-24 17:30:10.756247496 +0000 UTC m=+3046.689316689" Apr 24 17:30:11.042507 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:11.042475 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kdlh8_801dc599-67b9-400f-a14c-835900dba24e/node-ca/0.log" Apr 24 17:30:12.376302 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:12.376273 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rgpmf_4b53d555-6ece-4be6-a70d-30c64956654b/serve-healthcheck-canary/0.log" Apr 24 17:30:12.788807 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:12.788775 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-4wsnt_07d4b91a-a227-4b22-8f87-40c4e9c8139c/insights-operator/0.log" Apr 24 17:30:12.791080 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:12.791051 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-4wsnt_07d4b91a-a227-4b22-8f87-40c4e9c8139c/insights-operator/1.log" Apr 24 17:30:12.812201 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:12.812167 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d7n6q_15b2786f-0566-4570-a7f7-f09fd69c6f54/kube-rbac-proxy/0.log" Apr 24 17:30:12.835596 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:12.835572 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d7n6q_15b2786f-0566-4570-a7f7-f09fd69c6f54/exporter/0.log" Apr 24 17:30:12.856950 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:12.856924 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d7n6q_15b2786f-0566-4570-a7f7-f09fd69c6f54/extractor/0.log" Apr 24 17:30:15.567141 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:15.567101 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-75b5bf9f6d-splcs_6065d775-116b-4301-9a90-beb7dc443688/manager/0.log" Apr 24 17:30:15.615242 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:15.615204 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-ht659_d95d03e0-6d52-4fe0-bc24-a0bc74394c12/openshift-lws-operator/0.log" Apr 24 17:30:16.138636 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:16.138605 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-64756dc76b-pfqh6_cfb9553a-a40a-4a30-8058-bed4e841e974/manager/0.log" Apr 24 17:30:16.365263 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:16.365238 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-r9wbd_d69a0a16-2739-489b-8fe1-10faa61cb21e/s3-init/0.log" Apr 24 17:30:16.756805 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:16.756779 2579 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-z5rxf" Apr 24 17:30:21.102338 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:21.102308 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-c59xq_16107c19-f352-4fb7-b6d8-6604b6f5ceb1/migrator/0.log" Apr 24 17:30:21.122961 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:21.122883 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-c59xq_16107c19-f352-4fb7-b6d8-6604b6f5ceb1/graceful-termination/0.log" Apr 24 17:30:21.502120 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:21.502081 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-dghtm_d6ed4479-aced-4d35-9249-229096300dc7/kube-storage-version-migrator-operator/1.log" Apr 24 17:30:21.503190 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:21.503161 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-dghtm_d6ed4479-aced-4d35-9249-229096300dc7/kube-storage-version-migrator-operator/0.log" Apr 24 17:30:22.558293 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:22.558267 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92bp8_54f6ae1b-e6ac-48cd-92cc-1c9ae843b609/kube-multus-additional-cni-plugins/0.log" Apr 24 17:30:22.581945 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:22.581920 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92bp8_54f6ae1b-e6ac-48cd-92cc-1c9ae843b609/egress-router-binary-copy/0.log" Apr 24 17:30:22.611664 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:22.611636 2579 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92bp8_54f6ae1b-e6ac-48cd-92cc-1c9ae843b609/cni-plugins/0.log" Apr 24 17:30:22.634455 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:22.634422 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92bp8_54f6ae1b-e6ac-48cd-92cc-1c9ae843b609/bond-cni-plugin/0.log" Apr 24 17:30:22.654770 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:22.654708 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92bp8_54f6ae1b-e6ac-48cd-92cc-1c9ae843b609/routeoverride-cni/0.log" Apr 24 17:30:22.676200 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:22.676169 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92bp8_54f6ae1b-e6ac-48cd-92cc-1c9ae843b609/whereabouts-cni-bincopy/0.log" Apr 24 17:30:22.696654 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:22.696604 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-92bp8_54f6ae1b-e6ac-48cd-92cc-1c9ae843b609/whereabouts-cni/0.log" Apr 24 17:30:23.047289 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:23.047260 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ng4dd_9afa95f3-5eb1-47b0-b00d-a487cb566ffc/kube-multus/0.log" Apr 24 17:30:23.163683 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:23.163647 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kmh29_4af5b09f-ceea-413a-bec5-40a2b59c7ea3/network-metrics-daemon/0.log" Apr 24 17:30:23.182606 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:23.182576 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kmh29_4af5b09f-ceea-413a-bec5-40a2b59c7ea3/kube-rbac-proxy/0.log" Apr 24 17:30:24.035544 ip-10-0-129-227 kubenswrapper[2579]: 
I0424 17:30:24.035517 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7fpht_1e8f7734-976a-4f89-b273-9e519d952582/ovn-controller/0.log" Apr 24 17:30:24.066557 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:24.066521 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7fpht_1e8f7734-976a-4f89-b273-9e519d952582/ovn-acl-logging/0.log" Apr 24 17:30:24.086823 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:24.086794 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7fpht_1e8f7734-976a-4f89-b273-9e519d952582/kube-rbac-proxy-node/0.log" Apr 24 17:30:24.108569 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:24.108539 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7fpht_1e8f7734-976a-4f89-b273-9e519d952582/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 17:30:24.129384 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:24.129355 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7fpht_1e8f7734-976a-4f89-b273-9e519d952582/northd/0.log" Apr 24 17:30:24.152158 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:24.152135 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7fpht_1e8f7734-976a-4f89-b273-9e519d952582/nbdb/0.log" Apr 24 17:30:24.174562 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:24.174533 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7fpht_1e8f7734-976a-4f89-b273-9e519d952582/sbdb/0.log" Apr 24 17:30:24.295612 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:24.295530 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7fpht_1e8f7734-976a-4f89-b273-9e519d952582/ovnkube-controller/0.log" Apr 24 17:30:25.952852 ip-10-0-129-227 kubenswrapper[2579]: I0424 
17:30:25.952818 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-nfd4n_0e87d719-99e3-4983-8268-513fafb03033/check-endpoints/0.log" Apr 24 17:30:25.983117 ip-10-0-129-227 kubenswrapper[2579]: I0424 17:30:25.983087 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-hknkz_28a21434-57fa-420d-a7e4-3b011eb1dc0e/network-check-target-container/0.log"