Apr 22 18:44:06.913312 ip-10-0-130-234 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:44:06.913323 ip-10-0-130-234 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:44:06.913330 ip-10-0-130-234 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:44:06.913579 ip-10-0-130-234 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:44:17.070800 ip-10-0-130-234 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:44:17.070818 ip-10-0-130-234 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 97ff6e0b994c44578945594e5934e301 --
Apr 22 18:46:38.105243 ip-10-0-130-234 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:46:38.553037 ip-10-0-130-234 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:38.553037 ip-10-0-130-234 kubenswrapper[2566]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:46:38.553037 ip-10-0-130-234 kubenswrapper[2566]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:38.553037 ip-10-0-130-234 kubenswrapper[2566]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:46:38.553037 ip-10-0-130-234 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:38.555391 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.555308 2566 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:46:38.558327 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558312 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:38.558327 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558327 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558331 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558334 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558337 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558340 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558343 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558347 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558350 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558353 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558355 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558358 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558361 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558364 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558367 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558370 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558373 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558376 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558378 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558381 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558383 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:38.558391 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558386 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558389 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558391 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558394 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558397 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558399 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558402 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558405 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558408 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558412 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558416 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558418 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558421 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558424 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558427 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558429 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558432 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558434 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558437 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:38.558897 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558439 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558442 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558444 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558447 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558449 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558452 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558454 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558457 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558461 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558464 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558466 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558469 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558472 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558474 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558478 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558483 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558485 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558488 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558491 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:38.559400 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558493 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558496 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558499 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558501 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558504 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558507 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558509 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558512 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558515 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558517 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558521 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558523 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558526 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558528 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558531 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558534 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558536 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558539 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558541 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558544 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:38.559881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558547 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558549 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558552 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558555 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558557 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558560 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558562 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558959 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558966 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558969 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558973 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558975 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558978 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558981 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558984 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558986 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558989 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558992 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558994 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558997 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:38.560405 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.558999 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559002 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559004 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559007 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559010 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559012 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559015 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559017 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559020 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559022 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559025 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559027 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559030 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559033 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559041 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559044 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559047 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559049 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559052 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559054 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:38.560933 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559057 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559060 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559062 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559065 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559068 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559071 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559074 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559076 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559079 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559081 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559084 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559087 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559089 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559093 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559097 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559100 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559102 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559105 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559107 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:38.561446 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559111 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559115 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559117 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559120 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559123 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559126 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559128 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559131 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559134 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559137 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559139 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559142 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559145 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559148 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559151 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559153 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559156 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559159 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559161 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:38.562039 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559164 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559166 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559169 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559171 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559174 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559176 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559179 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559182 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559185 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559188 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559190 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559193 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559195 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559198 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.559200 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.560923 2566 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.560931 2566 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.560949 2566 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.560954 2566 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.560959 2566 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.560962 2566 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:46:38.562823 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.560966 2566 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.560971 2566 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.560975 2566 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.560978 2566 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.560981 2566 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.560985 2566 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.560988 2566 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.560992 2566 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.560995 2566 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.560998 2566 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561001 2566 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561003 2566 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561007 2566 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561011 2566 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561014 2566 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561017 2566 flags.go:64] FLAG: --config-dir=""
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561020 2566 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561023 2566 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561027 2566 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561030 2566 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561033 2566 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561037 2566 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561040 2566 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561042 2566 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:46:38.563430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561045 2566 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561049 2566 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561051 2566 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561056 2566 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22
18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561059 2566 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561062 2566 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561065 2566 flags.go:64] FLAG: --enable-load-reader="false" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561068 2566 flags.go:64] FLAG: --enable-server="true" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561071 2566 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561076 2566 flags.go:64] FLAG: --event-burst="100" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561080 2566 flags.go:64] FLAG: --event-qps="50" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561083 2566 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561086 2566 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561089 2566 flags.go:64] FLAG: --eviction-hard="" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561093 2566 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561096 2566 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561099 2566 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561102 2566 flags.go:64] FLAG: --eviction-soft="" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561105 2566 
flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561108 2566 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561111 2566 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561113 2566 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561116 2566 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561119 2566 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561122 2566 flags.go:64] FLAG: --feature-gates="" Apr 22 18:46:38.564033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561126 2566 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561129 2566 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561132 2566 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561135 2566 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561138 2566 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561141 2566 flags.go:64] FLAG: --help="false" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561144 2566 flags.go:64] FLAG: --hostname-override="ip-10-0-130-234.ec2.internal" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561147 2566 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:46:38.564635 
ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561150 2566 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561153 2566 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561156 2566 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561159 2566 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561163 2566 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561165 2566 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561168 2566 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561171 2566 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561174 2566 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561177 2566 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561180 2566 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561183 2566 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561186 2566 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561189 2566 flags.go:64] 
FLAG: --kubelet-cgroups="" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561192 2566 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561195 2566 flags.go:64] FLAG: --lock-file="" Apr 22 18:46:38.564635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561198 2566 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561201 2566 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561204 2566 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561209 2566 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561212 2566 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561215 2566 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561218 2566 flags.go:64] FLAG: --logging-format="text" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561221 2566 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561225 2566 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561228 2566 flags.go:64] FLAG: --manifest-url="" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561231 2566 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561240 2566 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:46:38.565264 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:46:38.561243 2566 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561247 2566 flags.go:64] FLAG: --max-pods="110" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561250 2566 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561253 2566 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561256 2566 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561259 2566 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561262 2566 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561264 2566 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561267 2566 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561275 2566 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561278 2566 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561281 2566 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:46:38.565264 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561283 2566 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561286 2566 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561292 2566 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561295 2566 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561298 2566 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561301 2566 flags.go:64] FLAG: --port="10250" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561305 2566 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561308 2566 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06442347e28a7f48f" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561311 2566 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561314 2566 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561317 2566 flags.go:64] FLAG: --register-node="true" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561320 2566 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561323 2566 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561326 2566 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561329 2566 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561332 2566 flags.go:64] FLAG: --reserved-cpus="" Apr 22 
18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561335 2566 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561338 2566 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561341 2566 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561344 2566 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561347 2566 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561350 2566 flags.go:64] FLAG: --runonce="false" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561352 2566 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561355 2566 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561358 2566 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:46:38.565848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561361 2566 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561364 2566 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561367 2566 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561372 2566 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561375 2566 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 
18:46:38.561378 2566 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561381 2566 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561384 2566 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561387 2566 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561390 2566 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561393 2566 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561396 2566 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561405 2566 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561408 2566 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561411 2566 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561416 2566 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561419 2566 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561421 2566 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561424 2566 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561427 2566 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 
18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561430 2566 flags.go:64] FLAG: --v="2" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561434 2566 flags.go:64] FLAG: --version="false" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561438 2566 flags.go:64] FLAG: --vmodule="" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561443 2566 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.561446 2566 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:46:38.566544 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561537 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561540 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561543 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561546 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561549 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561551 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561554 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561556 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561559 2566 feature_gate.go:328] 
unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561561 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561565 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561568 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561571 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561573 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561576 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561578 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561581 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561584 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561587 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561591 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561593 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:38.567176 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561596 2566 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561599 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561601 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561604 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561607 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561609 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561612 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561615 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561618 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561620 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561623 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561627 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561630 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561633 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561635 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561638 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561641 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561644 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561647 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:38.567744 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561650 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561652 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561657 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561660 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561663 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561665 2566 
feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561668 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561671 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561673 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561676 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561678 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561682 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561685 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561688 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561692 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561695 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561697 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561700 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561702 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:38.568243 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561705 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561707 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561710 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561712 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561715 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561717 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561720 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 
18:46:38.561722 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561724 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561727 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561730 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561732 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561734 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561737 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561739 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561743 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561746 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561748 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561751 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:38.568723 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561754 2566 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Apr 22 18:46:38.569220 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561757 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:38.569220 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561759 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:38.569220 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561761 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:38.569220 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561764 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:38.569220 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561767 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:38.569220 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561770 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:38.569220 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.561814 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:38.569220 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.564689 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:46:38.572060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.572040 2566 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 18:46:38.572097 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.572060 2566 server.go:532] "Golang 
settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 18:46:38.572125 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572108 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:38.572125 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572114 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:38.572125 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572117 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:38.572125 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572120 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:38.572125 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572123 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:38.572125 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572126 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572129 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572133 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572136 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572140 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572143 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572146 2566 feature_gate.go:328] unrecognized feature gate: 
NewOLMPreflightPermissionChecks Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572149 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572151 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572154 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572157 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572160 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572163 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572165 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572168 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572171 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572173 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572176 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:38.572270 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572179 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:38.572270 ip-10-0-130-234 
kubenswrapper[2566]: W0422 18:46:38.572181 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572184 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572186 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572188 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572191 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572194 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572197 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572200 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572202 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572205 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572208 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572210 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572213 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:38.572750 
ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572215 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572218 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572221 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572224 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572226 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572229 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572232 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:38.572750 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572235 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572238 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572240 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572243 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572246 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572250 2566 feature_gate.go:349] Setting 
deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572253 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572256 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572259 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572262 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572265 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572267 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572270 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572272 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572275 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572278 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572281 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572284 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:38.573272 ip-10-0-130-234 
kubenswrapper[2566]: W0422 18:46:38.572286 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572289 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:38.573272 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572291 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572294 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572297 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572299 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572302 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572304 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572307 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572310 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572313 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572317 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572321 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572324 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572327 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572330 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572333 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572335 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572338 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572341 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572343 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572346 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:38.573767 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572348 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:38.574386 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572350 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:38.574386 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.572355 2566 feature_gate.go:384] feature 
gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:46:38.574386 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572462 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:38.574386 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572466 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:38.574386 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572470 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:38.574386 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572473 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:38.574386 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572476 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:38.574386 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572479 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:38.574386 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572481 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:38.574386 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572485 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
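The long runs of `unrecognized feature gate` warnings above are emitted by feature_gate.go when OpenShift-specific gate names (ExternalOIDC, NewOLM, ClusterMonitoringConfig, and so on) are passed to a parser that only recognizes upstream Kubernetes gates; they are noisy but do not prevent kubelet startup. To see which gates repeat and how often, a filter like the following can be applied to the log. The sample lines below are a stand-in for real input; in practice you would pipe `journalctl -u kubelet` (or the captured log file) into the same pipeline:

```shell
# Summarize repeated "unrecognized feature gate" warnings by gate name.
# The printf lines are sample input standing in for `journalctl -u kubelet`.
printf '%s\n' \
  'W0422 feature_gate.go:328] unrecognized feature gate: ExternalOIDC' \
  'W0422 feature_gate.go:328] unrecognized feature gate: NewOLM' \
  'W0422 feature_gate.go:328] unrecognized feature gate: ExternalOIDC' |
  grep -o 'unrecognized feature gate: [A-Za-z0-9]*' |  # keep only the warning text
  sed 's/unrecognized feature gate: //' |              # strip down to the gate name
  sort | uniq -c | sort -rn                            # count and rank by frequency
```

For the sample input this reports ExternalOIDC twice and NewOLM once; against the full log it gives a deduplicated inventory of every unrecognized gate, which is far easier to audit than the raw warning stream.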
Apr 22 18:46:38.574386 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572489 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:38.574386 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572492 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:38.574386 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572495 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:38.574386 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572497 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:38.574386 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572500 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572503 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572506 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572509 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572511 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572514 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572517 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572519 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572522 2566 
feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572524 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572527 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572530 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572532 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572535 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572538 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572540 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572543 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572545 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572548 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572550 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:38.574757 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572553 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572555 
2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572558 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572560 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572563 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572565 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572567 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572570 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572572 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572575 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572578 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572580 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572583 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572585 2566 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572588 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572590 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572594 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572597 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572600 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:38.575255 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572603 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572606 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572609 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572612 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572615 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572617 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572620 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 
18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572623 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572626 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572628 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572631 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572634 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572636 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572639 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572642 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572644 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572647 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572649 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572651 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572654 2566 feature_gate.go:328] 
unrecognized feature gate: PinnedImages Apr 22 18:46:38.575716 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572656 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:38.576223 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572659 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:38.576223 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572661 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:38.576223 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572664 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:38.576223 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572666 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:38.576223 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572669 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:38.576223 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572671 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:38.576223 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572674 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:38.576223 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572676 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:38.576223 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572679 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:38.576223 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572682 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:38.576223 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572684 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:38.576223 ip-10-0-130-234 kubenswrapper[2566]: W0422 
18:46:38.572687 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:38.576223 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572689 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:38.576223 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:38.572692 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:38.576223 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.572697 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:46:38.576223 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.573365 2566 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 18:46:38.577549 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.577536 2566 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 18:46:38.578474 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.578463 2566 server.go:1019] "Starting client certificate rotation" Apr 22 18:46:38.578569 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.578554 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:46:38.578598 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.578591 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:46:38.603245 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.603226 2566 
dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:46:38.606933 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.606908 2566 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:46:38.625502 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.625485 2566 log.go:25] "Validated CRI v1 runtime API" Apr 22 18:46:38.632917 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.632902 2566 log.go:25] "Validated CRI v1 image API" Apr 22 18:46:38.634742 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.634723 2566 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 18:46:38.635259 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.635240 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:46:38.639278 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.639256 2566 fs.go:135] Filesystem UUIDs: map[0a98638f-3852-4b2c-b595-d3587228fc02:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 7d8e4a4f-bd1d-49fa-a88e-0d9e7883166f:/dev/nvme0n1p3] Apr 22 18:46:38.639364 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.639278 2566 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 18:46:38.644829 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.644725 2566 manager.go:217] Machine: {Timestamp:2026-04-22 18:46:38.642924304 
+0000 UTC m=+0.421757782 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100355 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec248eea7190fdbe1dfc3b5bc9e50334 SystemUUID:ec248eea-7190-fdbe-1dfc-3b5bc9e50334 BootID:97ff6e0b-994c-4457-8945-594e5934e301 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:28:0d:64:16:df Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:28:0d:64:16:df Speed:0 Mtu:9001} {Name:ovs-system MacAddress:92:c5:26:f2:64:c5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 
Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 18:46:38.645332 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.645322 2566 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 22 18:46:38.645421 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.645409 2566 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 18:46:38.646436 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.646408 2566 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 18:46:38.646597 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.646439 2566 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-130-234.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:46:38.646671 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.646611 2566 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:46:38.646671 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.646621 2566 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:46:38.646671 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.646640 
2566 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:46:38.646671 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.646660 2566 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:46:38.648157 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.648144 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:46:38.648289 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.648278 2566 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:46:38.651272 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.651260 2566 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:46:38.651330 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.651278 2566 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:46:38.651330 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.651299 2566 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:46:38.651330 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.651312 2566 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:46:38.651330 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.651324 2566 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 18:46:38.652471 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.652458 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:38.652543 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.652480 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:38.655089 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.655074 2566 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:46:38.656492 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:46:38.656480 2566 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:46:38.658589 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.658578 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:46:38.658647 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.658594 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:46:38.658647 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.658600 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:46:38.658647 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.658605 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:46:38.658647 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.658612 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:46:38.658647 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.658617 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:46:38.658647 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.658623 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:46:38.658647 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.658628 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:46:38.658647 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.658635 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:46:38.658647 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.658641 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:46:38.658647 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.658649 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 
18:46:38.659061 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.658657 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:46:38.659819 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.659809 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:46:38.659819 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.659821 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:46:38.663311 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.663297 2566 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:46:38.663386 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.663334 2566 server.go:1295] "Started kubelet" Apr 22 18:46:38.663439 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.663414 2566 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:46:38.663565 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.663506 2566 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:46:38.663616 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.663593 2566 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:46:38.664029 ip-10-0-130-234 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 18:46:38.664384 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.664360 2566 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-234.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:46:38.664596 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:38.664435 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-234.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:46:38.664596 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:38.664517 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:46:38.664799 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.664707 2566 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:46:38.666156 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.666144 2566 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:46:38.670130 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.670111 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:38.670713 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.670698 2566 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:46:38.671806 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.671786 2566 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:46:38.673108 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:46:38.673088 2566 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:46:38.673187 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.671983 2566 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:46:38.673187 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.672021 2566 factory.go:55] Registering systemd factory Apr 22 18:46:38.673287 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.673245 2566 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:46:38.673460 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:38.671831 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-234.ec2.internal\" not found" Apr 22 18:46:38.673516 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.673471 2566 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:46:38.673516 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.673489 2566 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:46:38.673954 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.673919 2566 factory.go:153] Registering CRI-O factory Apr 22 18:46:38.673954 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.673953 2566 factory.go:223] Registration of the crio container factory successfully Apr 22 18:46:38.674078 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.674031 2566 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:46:38.674078 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.674057 2566 factory.go:103] Registering Raw factory Apr 22 18:46:38.674078 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.674078 2566 manager.go:1196] Started watching for new ooms in manager Apr 22 18:46:38.674521 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:46:38.674474 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4l4jx" Apr 22 18:46:38.675119 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.675103 2566 manager.go:319] Starting recovery of all containers Apr 22 18:46:38.675204 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:38.675158 2566 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:46:38.678181 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:38.678074 2566 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-234.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 18:46:38.679351 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:38.678113 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-234.ec2.internal.18a8c237f98b55ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-234.ec2.internal,UID:ip-10-0-130-234.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-234.ec2.internal,},FirstTimestamp:2026-04-22 18:46:38.663308782 +0000 UTC m=+0.442142259,LastTimestamp:2026-04-22 18:46:38.663308782 +0000 UTC m=+0.442142259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-234.ec2.internal,}" Apr 22 18:46:38.680159 
ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:38.680133 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 18:46:38.682355 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.682335 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4l4jx" Apr 22 18:46:38.686629 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.686474 2566 manager.go:324] Recovery completed Apr 22 18:46:38.691066 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.691052 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:38.693823 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.693804 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:38.693896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.693840 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:38.693896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.693856 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:38.694332 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.694317 2566 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:46:38.694332 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.694330 2566 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:46:38.694451 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.694343 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 22 
18:46:38.695449 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:38.695391 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-234.ec2.internal.18a8c237fb5cf4f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-234.ec2.internal,UID:ip-10-0-130-234.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-234.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-234.ec2.internal,},FirstTimestamp:2026-04-22 18:46:38.693823736 +0000 UTC m=+0.472657213,LastTimestamp:2026-04-22 18:46:38.693823736 +0000 UTC m=+0.472657213,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-234.ec2.internal,}" Apr 22 18:46:38.696741 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.696728 2566 policy_none.go:49] "None policy: Start" Apr 22 18:46:38.696795 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.696744 2566 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:46:38.696795 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.696754 2566 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:46:38.736529 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.736504 2566 manager.go:341] "Starting Device Plugin manager" Apr 22 18:46:38.736529 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:38.736537 2566 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:46:38.746952 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.736547 2566 server.go:85] "Starting device plugin registration server" Apr 22 18:46:38.746952 
ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.736775 2566 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:46:38.746952 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.736789 2566 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:46:38.746952 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.736890 2566 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:46:38.746952 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.736990 2566 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:46:38.746952 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.736999 2566 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:46:38.746952 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:38.737458 2566 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:46:38.746952 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:38.737485 2566 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-234.ec2.internal\" not found" Apr 22 18:46:38.823491 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.823425 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:46:38.824560 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.824536 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:46:38.824560 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.824560 2566 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:46:38.824678 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.824575 2566 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 18:46:38.824678 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.824581 2566 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:46:38.824678 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:38.824614 2566 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:46:38.827328 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.827309 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:38.836877 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.836855 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:38.837731 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.837716 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:38.837788 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.837743 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:38.837788 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.837765 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:38.837788 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.837788 2566 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-234.ec2.internal" Apr 22 18:46:38.843352 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.843339 2566 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-234.ec2.internal" Apr 22 18:46:38.843399 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:38.843358 2566 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-234.ec2.internal\": node \"ip-10-0-130-234.ec2.internal\" not found" Apr 22 
18:46:38.854117 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:38.854101 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-234.ec2.internal\" not found" Apr 22 18:46:38.925509 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.925482 2566 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-130-234.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal"] Apr 22 18:46:38.925613 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.925540 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:38.926467 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.926453 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:38.926547 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.926487 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:38.926547 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.926497 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:38.927896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.927884 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:38.928119 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.928104 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-234.ec2.internal" Apr 22 18:46:38.928177 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.928135 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:38.928664 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.928647 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:38.928744 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.928681 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:38.928744 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.928691 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:38.928744 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.928716 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:38.928744 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.928696 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:38.928744 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.928730 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:38.930354 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.930341 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal" Apr 22 18:46:38.930413 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.930370 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:38.931075 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.931057 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:38.931149 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.931088 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:38.931149 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.931109 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:38.954195 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:38.954173 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-234.ec2.internal\" not found" Apr 22 18:46:38.956573 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:38.956560 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-234.ec2.internal\" not found" node="ip-10-0-130-234.ec2.internal" Apr 22 18:46:38.961031 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:38.961015 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-234.ec2.internal\" not found" node="ip-10-0-130-234.ec2.internal" Apr 22 18:46:38.974671 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:38.974649 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cd060d06ccaccd228d0dac1aa712346b-config\") pod 
\"kube-apiserver-proxy-ip-10-0-130-234.ec2.internal\" (UID: \"cd060d06ccaccd228d0dac1aa712346b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-234.ec2.internal" Apr 22 18:46:39.054748 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:39.054726 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-234.ec2.internal\" not found" Apr 22 18:46:39.075183 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.075139 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b362398358ad211b5d4965dd1cf35b04-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal\" (UID: \"b362398358ad211b5d4965dd1cf35b04\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal" Apr 22 18:46:39.075183 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.075164 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b362398358ad211b5d4965dd1cf35b04-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal\" (UID: \"b362398358ad211b5d4965dd1cf35b04\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal" Apr 22 18:46:39.075278 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.075210 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cd060d06ccaccd228d0dac1aa712346b-config\") pod \"kube-apiserver-proxy-ip-10-0-130-234.ec2.internal\" (UID: \"cd060d06ccaccd228d0dac1aa712346b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-234.ec2.internal" Apr 22 18:46:39.075278 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.075251 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/cd060d06ccaccd228d0dac1aa712346b-config\") pod \"kube-apiserver-proxy-ip-10-0-130-234.ec2.internal\" (UID: \"cd060d06ccaccd228d0dac1aa712346b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-234.ec2.internal"
Apr 22 18:46:39.155492 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:39.155464 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-234.ec2.internal\" not found"
Apr 22 18:46:39.175964 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.175920 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b362398358ad211b5d4965dd1cf35b04-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal\" (UID: \"b362398358ad211b5d4965dd1cf35b04\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal"
Apr 22 18:46:39.175964 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.175969 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b362398358ad211b5d4965dd1cf35b04-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal\" (UID: \"b362398358ad211b5d4965dd1cf35b04\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal"
Apr 22 18:46:39.176089 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.176038 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b362398358ad211b5d4965dd1cf35b04-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal\" (UID: \"b362398358ad211b5d4965dd1cf35b04\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal"
Apr 22 18:46:39.176129 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.176085 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b362398358ad211b5d4965dd1cf35b04-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal\" (UID: \"b362398358ad211b5d4965dd1cf35b04\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal"
Apr 22 18:46:39.256084 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:39.256064 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-234.ec2.internal\" not found"
Apr 22 18:46:39.259245 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.259233 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-234.ec2.internal"
Apr 22 18:46:39.264425 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.264405 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal"
Apr 22 18:46:39.356732 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:39.356656 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-234.ec2.internal\" not found"
Apr 22 18:46:39.457292 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:39.457266 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-234.ec2.internal\" not found"
Apr 22 18:46:39.557897 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:39.557871 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-234.ec2.internal\" not found"
Apr 22 18:46:39.578324 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.578304 2566 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 18:46:39.578440 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.578425 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:46:39.621309 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.621288 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:46:39.658289 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:39.658266 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-234.ec2.internal\" not found"
Apr 22 18:46:39.670432 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.670415 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 18:46:39.681388 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.681370 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:46:39.684993 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.684966 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:41:38 +0000 UTC" deadline="2027-12-14 07:58:28.215285915 +0000 UTC"
Apr 22 18:46:39.685085 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.684998 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14413h11m48.530290796s"
Apr 22 18:46:39.697638 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.697619 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9x4tg"
Apr 22 18:46:39.704409 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.704384 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9x4tg"
Apr 22 18:46:39.758782 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:39.758761 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-234.ec2.internal\" not found"
Apr 22 18:46:39.759222 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:39.759200 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd060d06ccaccd228d0dac1aa712346b.slice/crio-1ac05f751bcc683d47bc62c651d3998e3a915af3c31c03b849ba3a6ec06527f5 WatchSource:0}: Error finding container 1ac05f751bcc683d47bc62c651d3998e3a915af3c31c03b849ba3a6ec06527f5: Status 404 returned error can't find the container with id 1ac05f751bcc683d47bc62c651d3998e3a915af3c31c03b849ba3a6ec06527f5
Apr 22 18:46:39.759599 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:39.759581 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb362398358ad211b5d4965dd1cf35b04.slice/crio-b05f1fbf3caac5910626f16355e1c046d55d6feaad4ff00be8e4b9b4859e61c2 WatchSource:0}: Error finding container b05f1fbf3caac5910626f16355e1c046d55d6feaad4ff00be8e4b9b4859e61c2: Status 404 returned error can't find the container with id b05f1fbf3caac5910626f16355e1c046d55d6feaad4ff00be8e4b9b4859e61c2
Apr 22 18:46:39.765317 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.765303 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:46:39.827251 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.827204 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-234.ec2.internal" event={"ID":"cd060d06ccaccd228d0dac1aa712346b","Type":"ContainerStarted","Data":"1ac05f751bcc683d47bc62c651d3998e3a915af3c31c03b849ba3a6ec06527f5"}
Apr 22 18:46:39.827581 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.827565 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:46:39.828190 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.828164 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal" event={"ID":"b362398358ad211b5d4965dd1cf35b04","Type":"ContainerStarted","Data":"b05f1fbf3caac5910626f16355e1c046d55d6feaad4ff00be8e4b9b4859e61c2"}
Apr 22 18:46:39.870891 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.870866 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-234.ec2.internal"
Apr 22 18:46:39.880564 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.880518 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:46:39.882108 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.882097 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal"
Apr 22 18:46:39.891453 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:39.891433 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:46:40.113518 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.113491 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:46:40.566861 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.566837 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:46:40.652385 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.652345 2566 apiserver.go:52] "Watching apiserver"
Apr 22 18:46:40.661365 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.661341 2566 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 18:46:40.662530 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.662510 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-xwhq6","openshift-dns/node-resolver-6xjch","openshift-image-registry/node-ca-wj7nl","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal","openshift-network-operator/iptables-alerter-94k7v","openshift-ovn-kubernetes/ovnkube-node-s8qfk","kube-system/konnectivity-agent-x8klk","kube-system/kube-apiserver-proxy-ip-10-0-130-234.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc","openshift-multus/multus-additional-cni-plugins-bt425","openshift-multus/multus-gw5cp","openshift-multus/network-metrics-daemon-7zj27","openshift-network-diagnostics/network-check-target-6cpjm"]
Apr 22 18:46:40.664336 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.664315 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bt425"
Apr 22 18:46:40.666424 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.666402 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6xjch"
Apr 22 18:46:40.666529 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.666457 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 18:46:40.666529 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.666458 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 18:46:40.666767 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.666744 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 18:46:40.666917 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.666900 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zh5qh\""
Apr 22 18:46:40.667008 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.666963 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 18:46:40.667008 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.666999 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 18:46:40.667915 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.667893 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-x8klk"
Apr 22 18:46:40.667915 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.667906 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.670026 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.669272 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 18:46:40.670026 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.669832 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 18:46:40.670026 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.670015 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-bnk4n\""
Apr 22 18:46:40.670755 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.670738 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 18:46:40.671141 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.671122 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:46:40.671860 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.671238 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mgd5m\""
Apr 22 18:46:40.671860 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:40.671235 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zj27" podUID="66e2806a-82a6-444a-a19a-13641a426bed"
Apr 22 18:46:40.671860 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.671457 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 18:46:40.671860 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.671531 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 18:46:40.671860 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.671570 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-pkl5k\""
Apr 22 18:46:40.672596 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.672467 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wj7nl"
Apr 22 18:46:40.674315 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.674294 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 18:46:40.675079 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.674563 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 18:46:40.675079 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.674774 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 18:46:40.675079 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.674831 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-94k7v"
Apr 22 18:46:40.675079 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.674992 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.675327 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.675235 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wg8t6\""
Apr 22 18:46:40.676231 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.676212 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc"
Apr 22 18:46:40.676868 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.676851 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:46:40.677080 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.677063 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 18:46:40.677291 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.677274 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-j7zqv\""
Apr 22 18:46:40.677389 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.677303 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kbwq8\""
Apr 22 18:46:40.677652 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.677635 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xwhq6"
Apr 22 18:46:40.677799 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.677781 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 18:46:40.678002 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.677959 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 18:46:40.678482 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.678352 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 18:46:40.678482 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.678375 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 18:46:40.678482 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.678404 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 18:46:40.678482 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.678441 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 18:46:40.678482 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.678374 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 18:46:40.678795 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.678407 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 18:46:40.678847 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.678815 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 18:46:40.678847 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.678843 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rqfjz\""
Apr 22 18:46:40.678951 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.678869 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 18:46:40.679169 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.679150 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm"
Apr 22 18:46:40.679259 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:40.679208 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cpjm" podUID="85d15dde-b7d7-4309-980e-49ce8977c219"
Apr 22 18:46:40.679633 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.679594 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dn7rk\""
Apr 22 18:46:40.679979 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.679960 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:46:40.680079 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.679983 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 18:46:40.686909 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.686890 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-var-lib-cni-bin\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.687088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.686916 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-hostroot\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.687088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.686955 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-slash\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.687088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.686977 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-etc-kubernetes\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.687088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687002 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2780ff15-4023-4008-904e-d48820d8e1d9-iptables-alerter-script\") pod \"iptables-alerter-94k7v\" (UID: \"2780ff15-4023-4008-904e-d48820d8e1d9\") " pod="openshift-network-operator/iptables-alerter-94k7v"
Apr 22 18:46:40.687088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687047 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f-hosts-file\") pod \"node-resolver-6xjch\" (UID: \"e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f\") " pod="openshift-dns/node-resolver-6xjch"
Apr 22 18:46:40.687423 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687089 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-log-socket\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.687423 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687118 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-multus-conf-dir\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.687423 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687164 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-os-release\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425"
Apr 22 18:46:40.687423 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687189 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-run-netns\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.687423 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687211 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f-tmp-dir\") pod \"node-resolver-6xjch\" (UID: \"e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f\") " pod="openshift-dns/node-resolver-6xjch"
Apr 22 18:46:40.687423 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687248 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-etc-openvswitch\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.687423 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687272 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-cni-binary-copy\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425"
Apr 22 18:46:40.687423 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687294 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3952e485-121f-46cf-820b-24e6a13ec3bc-multus-daemon-config\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.687423 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687308 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-cni-bin\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.687423 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687328 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-run-ovn-kubernetes\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.687423 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687367 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-cnibin\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.687423 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687381 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-os-release\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.687423 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687422 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77wvj\" (UniqueName: \"kubernetes.io/projected/66e2806a-82a6-444a-a19a-13641a426bed-kube-api-access-77wvj\") pod \"network-metrics-daemon-7zj27\" (UID: \"66e2806a-82a6-444a-a19a-13641a426bed\") " pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:46:40.688150 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687454 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-run-multus-certs\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.688150 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687475 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-multus-cni-dir\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.688150 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687489 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs\") pod \"network-metrics-daemon-7zj27\" (UID: \"66e2806a-82a6-444a-a19a-13641a426bed\") " pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:46:40.688150 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687503 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn9sz\" (UniqueName: \"kubernetes.io/projected/462b5fca-7ee5-444f-af12-38942635aa2f-kube-api-access-rn9sz\") pod \"node-ca-wj7nl\" (UID: \"462b5fca-7ee5-444f-af12-38942635aa2f\") " pod="openshift-image-registry/node-ca-wj7nl"
Apr 22 18:46:40.688150 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687525 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-ovnkube-script-lib\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.688150 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687548 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4m9x\" (UniqueName: \"kubernetes.io/projected/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-kube-api-access-h4m9x\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.688150 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687564 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b76rw\" (UniqueName: \"kubernetes.io/projected/3952e485-121f-46cf-820b-24e6a13ec3bc-kube-api-access-b76rw\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.688150 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687591 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425"
Apr 22 18:46:40.688150 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687612 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425"
Apr 22 18:46:40.688150 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687632 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-system-cni-dir\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.688150 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687655 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-var-lib-cni-multus\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.688150 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687678 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-kubelet\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.688150 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687725 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-systemd-units\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.688150 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687763 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-var-lib-kubelet\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.688150 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687795 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-system-cni-dir\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425"
Apr 22 18:46:40.688150 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687855 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2780ff15-4023-4008-904e-d48820d8e1d9-host-slash\") pod \"iptables-alerter-94k7v\" (UID: \"2780ff15-4023-4008-904e-d48820d8e1d9\") " pod="openshift-network-operator/iptables-alerter-94k7v"
Apr 22 18:46:40.689064 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687874 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/284cb099-6d1a-47cf-883b-5ada5cd11fa0-agent-certs\") pod \"konnectivity-agent-x8klk\" (UID: \"284cb099-6d1a-47cf-883b-5ada5cd11fa0\") " pod="kube-system/konnectivity-agent-x8klk"
Apr 22 18:46:40.689064 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687888 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/462b5fca-7ee5-444f-af12-38942635aa2f-serviceca\") pod \"node-ca-wj7nl\" (UID: \"462b5fca-7ee5-444f-af12-38942635aa2f\") " pod="openshift-image-registry/node-ca-wj7nl"
Apr 22 18:46:40.689064 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687906 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.689064 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687955 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-env-overrides\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.689064 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.687984 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-ovn-node-metrics-cert\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.689064 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688011 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpbhw\" (UniqueName: \"kubernetes.io/projected/e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f-kube-api-access-lpbhw\") pod \"node-resolver-6xjch\" (UID: \"e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f\") " pod="openshift-dns/node-resolver-6xjch"
Apr 22 18:46:40.689064 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688051 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-run-ovn\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.689064 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688078 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-node-log\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.689064 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688104 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/284cb099-6d1a-47cf-883b-5ada5cd11fa0-konnectivity-ca\") pod \"konnectivity-agent-x8klk\" (UID: \"284cb099-6d1a-47cf-883b-5ada5cd11fa0\") " pod="kube-system/konnectivity-agent-x8klk"
Apr 22 18:46:40.689064 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688136 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-ovnkube-config\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.689064 
ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688159 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-cni-netd\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.689064 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688186 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425" Apr 22 18:46:40.689064 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688213 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crwg8\" (UniqueName: \"kubernetes.io/projected/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-kube-api-access-crwg8\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425" Apr 22 18:46:40.689064 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688235 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3952e485-121f-46cf-820b-24e6a13ec3bc-cni-binary-copy\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.689064 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688258 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-multus-socket-dir-parent\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.689064 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688282 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-run-netns\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.690057 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688318 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/462b5fca-7ee5-444f-af12-38942635aa2f-host\") pod \"node-ca-wj7nl\" (UID: \"462b5fca-7ee5-444f-af12-38942635aa2f\") " pod="openshift-image-registry/node-ca-wj7nl" Apr 22 18:46:40.690057 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688352 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-run-systemd\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.690057 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688393 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-var-lib-openvswitch\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.690057 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688434 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz95r\" (UniqueName: \"kubernetes.io/projected/2780ff15-4023-4008-904e-d48820d8e1d9-kube-api-access-gz95r\") pod \"iptables-alerter-94k7v\" (UID: \"2780ff15-4023-4008-904e-d48820d8e1d9\") " pod="openshift-network-operator/iptables-alerter-94k7v" Apr 22 18:46:40.690057 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688465 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-run-k8s-cni-cncf-io\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.690057 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688497 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-run-openvswitch\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.690057 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.688540 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-cnibin\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425" Apr 22 18:46:40.705596 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.705561 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:39 +0000 UTC" deadline="2028-01-18 13:07:21.624124376 +0000 UTC" Apr 22 18:46:40.705684 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.705597 2566 
certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15258h20m40.918531504s" Apr 22 18:46:40.774210 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.774183 2566 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:46:40.789312 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789275 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-sysconfig\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.789462 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789319 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b6911a95-1c4c-4513-b688-98a7456d5642-etc-tuned\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.789462 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789354 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-run-k8s-cni-cncf-io\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.789462 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789379 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-run-openvswitch\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.789462 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:46:40.789406 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm486\" (UniqueName: \"kubernetes.io/projected/bbe988ee-649d-4eed-9871-40def68d89af-kube-api-access-mm486\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.789462 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789434 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-sysctl-d\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.789462 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789459 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-cnibin\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425" Apr 22 18:46:40.789709 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789470 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-run-k8s-cni-cncf-io\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.789709 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789477 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-run-openvswitch\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.789709 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789484 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-var-lib-cni-bin\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.789709 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789527 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-hostroot\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.789709 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789537 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-var-lib-cni-bin\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.789709 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789517 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-cnibin\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425" Apr 22 18:46:40.789709 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789554 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-slash\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.789709 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789562 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-hostroot\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.789709 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789582 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-systemd\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.789709 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789605 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-sys\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.789709 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789634 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-lib-modules\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.789709 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789675 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-slash\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789725 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6911a95-1c4c-4513-b688-98a7456d5642-tmp\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789772 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-etc-kubernetes\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789804 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2780ff15-4023-4008-904e-d48820d8e1d9-iptables-alerter-script\") pod \"iptables-alerter-94k7v\" (UID: \"2780ff15-4023-4008-904e-d48820d8e1d9\") " pod="openshift-network-operator/iptables-alerter-94k7v" Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789830 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f-hosts-file\") pod \"node-resolver-6xjch\" (UID: \"e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f\") " pod="openshift-dns/node-resolver-6xjch" Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789853 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-log-socket\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789881 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-var-lib-kubelet\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789905 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-multus-conf-dir\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789924 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-etc-kubernetes\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789959 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-multus-conf-dir\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789973 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-log-socket\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" 
Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.789996 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-os-release\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425"
Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790032 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-run-netns\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790048 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f-hosts-file\") pod \"node-resolver-6xjch\" (UID: \"e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f\") " pod="openshift-dns/node-resolver-6xjch"
Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790056 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-sysctl-conf\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6"
Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790078 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f-tmp-dir\") pod \"node-resolver-6xjch\" (UID: \"e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f\") " pod="openshift-dns/node-resolver-6xjch"
Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790096 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-etc-openvswitch\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790103 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-run-netns\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.790416 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790120 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-modprobe-d\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6"
Apr 22 18:46:40.791345 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790142 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-run\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6"
Apr 22 18:46:40.791345 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790167 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-cni-binary-copy\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425"
Apr 22 18:46:40.791345 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790180 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-etc-openvswitch\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.791345 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790196 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3952e485-121f-46cf-820b-24e6a13ec3bc-multus-daemon-config\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.791345 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790199 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-os-release\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425"
Apr 22 18:46:40.791345 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790250 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-cni-bin\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.791345 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790277 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-run-ovn-kubernetes\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.791345 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790304 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-cnibin\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.791345 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790316 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-cni-bin\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.791345 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790328 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-os-release\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.791345 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790360 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-run-ovn-kubernetes\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.791345 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790351 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77wvj\" (UniqueName: \"kubernetes.io/projected/66e2806a-82a6-444a-a19a-13641a426bed-kube-api-access-77wvj\") pod \"network-metrics-daemon-7zj27\" (UID: \"66e2806a-82a6-444a-a19a-13641a426bed\") " pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:46:40.791345 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790399 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-run-multus-certs\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.791345 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790425 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-multus-cni-dir\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.791345 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790452 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs\") pod \"network-metrics-daemon-7zj27\" (UID: \"66e2806a-82a6-444a-a19a-13641a426bed\") " pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:46:40.791345 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790467 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f-tmp-dir\") pod \"node-resolver-6xjch\" (UID: \"e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f\") " pod="openshift-dns/node-resolver-6xjch"
Apr 22 18:46:40.791345 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790476 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rn9sz\" (UniqueName: \"kubernetes.io/projected/462b5fca-7ee5-444f-af12-38942635aa2f-kube-api-access-rn9sz\") pod \"node-ca-wj7nl\" (UID: \"462b5fca-7ee5-444f-af12-38942635aa2f\") " pod="openshift-image-registry/node-ca-wj7nl"
Apr 22 18:46:40.792060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790506 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-sys-fs\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc"
Apr 22 18:46:40.792060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790531 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-kubernetes\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6"
Apr 22 18:46:40.792060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790561 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2780ff15-4023-4008-904e-d48820d8e1d9-iptables-alerter-script\") pod \"iptables-alerter-94k7v\" (UID: \"2780ff15-4023-4008-904e-d48820d8e1d9\") " pod="openshift-network-operator/iptables-alerter-94k7v"
Apr 22 18:46:40.792060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790563 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-cnibin\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.792060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790602 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-multus-cni-dir\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.792060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790606 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-ovnkube-script-lib\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.792060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790618 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-run-multus-certs\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.792060 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:40.790621 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:40.792060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790638 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-host\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6"
Apr 22 18:46:40.792060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790660 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-os-release\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.792060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790670 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4m9x\" (UniqueName: \"kubernetes.io/projected/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-kube-api-access-h4m9x\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:46:40.792060 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:40.790697 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs podName:66e2806a-82a6-444a-a19a-13641a426bed nodeName:}" failed. No retries permitted until 2026-04-22 18:46:41.290667135 +0000 UTC m=+3.069500619 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs") pod "network-metrics-daemon-7zj27" (UID: "66e2806a-82a6-444a-a19a-13641a426bed") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:40.792060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790717 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3952e485-121f-46cf-820b-24e6a13ec3bc-multus-daemon-config\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp"
Apr 22 18:46:40.792060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790776 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts99x\" (UniqueName: \"kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x\") pod \"network-check-target-6cpjm\" (UID: \"85d15dde-b7d7-4309-980e-49ce8977c219\") " pod="openshift-network-diagnostics/network-check-target-6cpjm"
Apr 22
18:46:40.792060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790800 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-cni-binary-copy\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425" Apr 22 18:46:40.792060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790812 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b76rw\" (UniqueName: \"kubernetes.io/projected/3952e485-121f-46cf-820b-24e6a13ec3bc-kube-api-access-b76rw\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.792060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790849 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425" Apr 22 18:46:40.792905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790878 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425" Apr 22 18:46:40.792905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790908 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-system-cni-dir\") pod \"multus-gw5cp\" (UID: 
\"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.792905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790953 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-var-lib-cni-multus\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.792905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790978 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-kubelet\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.792905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.790988 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-system-cni-dir\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.792905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791001 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-systemd-units\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.792905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791016 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bt425\" (UID: 
\"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425" Apr 22 18:46:40.792905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791029 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-var-lib-cni-multus\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.792905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791026 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-var-lib-kubelet\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.792905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791054 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-kubelet\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.792905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791068 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-system-cni-dir\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425" Apr 22 18:46:40.792905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791070 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-systemd-units\") pod 
\"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.792905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791056 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-var-lib-kubelet\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.792905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791095 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2780ff15-4023-4008-904e-d48820d8e1d9-host-slash\") pod \"iptables-alerter-94k7v\" (UID: \"2780ff15-4023-4008-904e-d48820d8e1d9\") " pod="openshift-network-operator/iptables-alerter-94k7v" Apr 22 18:46:40.792905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791121 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/284cb099-6d1a-47cf-883b-5ada5cd11fa0-agent-certs\") pod \"konnectivity-agent-x8klk\" (UID: \"284cb099-6d1a-47cf-883b-5ada5cd11fa0\") " pod="kube-system/konnectivity-agent-x8klk" Apr 22 18:46:40.792905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791126 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-system-cni-dir\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425" Apr 22 18:46:40.792905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791144 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/462b5fca-7ee5-444f-af12-38942635aa2f-serviceca\") pod 
\"node-ca-wj7nl\" (UID: \"462b5fca-7ee5-444f-af12-38942635aa2f\") " pod="openshift-image-registry/node-ca-wj7nl" Apr 22 18:46:40.793608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791170 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.793608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791194 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-ovnkube-script-lib\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.793608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791173 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2780ff15-4023-4008-904e-d48820d8e1d9-host-slash\") pod \"iptables-alerter-94k7v\" (UID: \"2780ff15-4023-4008-904e-d48820d8e1d9\") " pod="openshift-network-operator/iptables-alerter-94k7v" Apr 22 18:46:40.793608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791199 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-env-overrides\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.793608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791216 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.793608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791251 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-ovn-node-metrics-cert\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.793608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791274 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpbhw\" (UniqueName: \"kubernetes.io/projected/e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f-kube-api-access-lpbhw\") pod \"node-resolver-6xjch\" (UID: \"e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f\") " pod="openshift-dns/node-resolver-6xjch" Apr 22 18:46:40.793608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791297 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-run-ovn\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.793608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791319 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-node-log\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.793608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791363 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-device-dir\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.793608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791388 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-etc-selinux\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.793608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791397 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425" Apr 22 18:46:40.793608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791413 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7k96\" (UniqueName: \"kubernetes.io/projected/b6911a95-1c4c-4513-b688-98a7456d5642-kube-api-access-s7k96\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.793608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791449 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/284cb099-6d1a-47cf-883b-5ada5cd11fa0-konnectivity-ca\") pod \"konnectivity-agent-x8klk\" (UID: \"284cb099-6d1a-47cf-883b-5ada5cd11fa0\") " pod="kube-system/konnectivity-agent-x8klk" Apr 
22 18:46:40.793608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791471 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-ovnkube-config\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.793608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791473 2566 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:46:40.793608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791492 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-socket-dir\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.794294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791507 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-run-ovn\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.794294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791517 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-registration-dir\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.794294 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:46:40.791536 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-cni-netd\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.794294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791579 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425" Apr 22 18:46:40.794294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791610 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crwg8\" (UniqueName: \"kubernetes.io/projected/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-kube-api-access-crwg8\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425" Apr 22 18:46:40.794294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791635 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3952e485-121f-46cf-820b-24e6a13ec3bc-cni-binary-copy\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.794294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791658 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-multus-socket-dir-parent\") pod \"multus-gw5cp\" (UID: 
\"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.794294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791685 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-run-netns\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.794294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791709 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/462b5fca-7ee5-444f-af12-38942635aa2f-host\") pod \"node-ca-wj7nl\" (UID: \"462b5fca-7ee5-444f-af12-38942635aa2f\") " pod="openshift-image-registry/node-ca-wj7nl" Apr 22 18:46:40.794294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791735 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-run-systemd\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.794294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791768 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-var-lib-openvswitch\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.794294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791795 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gz95r\" (UniqueName: \"kubernetes.io/projected/2780ff15-4023-4008-904e-d48820d8e1d9-kube-api-access-gz95r\") pod \"iptables-alerter-94k7v\" (UID: 
\"2780ff15-4023-4008-904e-d48820d8e1d9\") " pod="openshift-network-operator/iptables-alerter-94k7v" Apr 22 18:46:40.794294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.791842 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-kubelet-dir\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.794294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.792010 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/284cb099-6d1a-47cf-883b-5ada5cd11fa0-konnectivity-ca\") pod \"konnectivity-agent-x8klk\" (UID: \"284cb099-6d1a-47cf-883b-5ada5cd11fa0\") " pod="kube-system/konnectivity-agent-x8klk" Apr 22 18:46:40.794294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.792017 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-ovnkube-config\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.794294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.792092 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425" Apr 22 18:46:40.794294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.792136 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-run-systemd\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.795088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.792136 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-node-log\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.795088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.792174 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/462b5fca-7ee5-444f-af12-38942635aa2f-host\") pod \"node-ca-wj7nl\" (UID: \"462b5fca-7ee5-444f-af12-38942635aa2f\") " pod="openshift-image-registry/node-ca-wj7nl" Apr 22 18:46:40.795088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.792181 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-host-run-netns\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.795088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.792193 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-env-overrides\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.795088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.792206 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-var-lib-openvswitch\") 
pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.795088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.792225 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-host-cni-netd\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.795088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.792256 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3952e485-121f-46cf-820b-24e6a13ec3bc-multus-socket-dir-parent\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.795088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.792383 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/462b5fca-7ee5-444f-af12-38942635aa2f-serviceca\") pod \"node-ca-wj7nl\" (UID: \"462b5fca-7ee5-444f-af12-38942635aa2f\") " pod="openshift-image-registry/node-ca-wj7nl" Apr 22 18:46:40.795088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.792614 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3952e485-121f-46cf-820b-24e6a13ec3bc-cni-binary-copy\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.795088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.795040 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-ovn-node-metrics-cert\") pod \"ovnkube-node-s8qfk\" (UID: 
\"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.795481 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.795192 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/284cb099-6d1a-47cf-883b-5ada5cd11fa0-agent-certs\") pod \"konnectivity-agent-x8klk\" (UID: \"284cb099-6d1a-47cf-883b-5ada5cd11fa0\") " pod="kube-system/konnectivity-agent-x8klk" Apr 22 18:46:40.799289 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.799262 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn9sz\" (UniqueName: \"kubernetes.io/projected/462b5fca-7ee5-444f-af12-38942635aa2f-kube-api-access-rn9sz\") pod \"node-ca-wj7nl\" (UID: \"462b5fca-7ee5-444f-af12-38942635aa2f\") " pod="openshift-image-registry/node-ca-wj7nl" Apr 22 18:46:40.800111 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.800066 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpbhw\" (UniqueName: \"kubernetes.io/projected/e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f-kube-api-access-lpbhw\") pod \"node-resolver-6xjch\" (UID: \"e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f\") " pod="openshift-dns/node-resolver-6xjch" Apr 22 18:46:40.800716 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.800693 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crwg8\" (UniqueName: \"kubernetes.io/projected/ffe6e393-6faa-47fe-a818-c1ffc6dda35c-kube-api-access-crwg8\") pod \"multus-additional-cni-plugins-bt425\" (UID: \"ffe6e393-6faa-47fe-a818-c1ffc6dda35c\") " pod="openshift-multus/multus-additional-cni-plugins-bt425" Apr 22 18:46:40.801030 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.800924 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77wvj\" (UniqueName: \"kubernetes.io/projected/66e2806a-82a6-444a-a19a-13641a426bed-kube-api-access-77wvj\") pod 
\"network-metrics-daemon-7zj27\" (UID: \"66e2806a-82a6-444a-a19a-13641a426bed\") " pod="openshift-multus/network-metrics-daemon-7zj27" Apr 22 18:46:40.801372 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.801351 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b76rw\" (UniqueName: \"kubernetes.io/projected/3952e485-121f-46cf-820b-24e6a13ec3bc-kube-api-access-b76rw\") pod \"multus-gw5cp\" (UID: \"3952e485-121f-46cf-820b-24e6a13ec3bc\") " pod="openshift-multus/multus-gw5cp" Apr 22 18:46:40.801513 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.801496 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz95r\" (UniqueName: \"kubernetes.io/projected/2780ff15-4023-4008-904e-d48820d8e1d9-kube-api-access-gz95r\") pod \"iptables-alerter-94k7v\" (UID: \"2780ff15-4023-4008-904e-d48820d8e1d9\") " pod="openshift-network-operator/iptables-alerter-94k7v" Apr 22 18:46:40.802491 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.802468 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4m9x\" (UniqueName: \"kubernetes.io/projected/3c183a68-147a-4cc9-8e01-6e6df3a46a7b-kube-api-access-h4m9x\") pod \"ovnkube-node-s8qfk\" (UID: \"3c183a68-147a-4cc9-8e01-6e6df3a46a7b\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:40.892617 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.892579 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-kubelet-dir\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.892799 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.892627 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-sysconfig\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.892799 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.892654 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b6911a95-1c4c-4513-b688-98a7456d5642-etc-tuned\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.892799 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.892681 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mm486\" (UniqueName: \"kubernetes.io/projected/bbe988ee-649d-4eed-9871-40def68d89af-kube-api-access-mm486\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.892799 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.892704 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-sysctl-d\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.892799 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.892730 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-systemd\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.892799 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.892752 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-sys\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.892799 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.892775 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-lib-modules\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.892799 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.892796 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6911a95-1c4c-4513-b688-98a7456d5642-tmp\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893180 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.892824 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-var-lib-kubelet\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893180 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.892849 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-sysctl-conf\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893180 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893020 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-var-lib-kubelet\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893180 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893024 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-lib-modules\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893180 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893031 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-sysctl-conf\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893180 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893080 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-sysconfig\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893444 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893181 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-kubelet-dir\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.893444 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893236 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-sys\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893444 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893279 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-sysctl-d\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893444 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893294 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-systemd\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893444 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893300 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-modprobe-d\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893444 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893334 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-run\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893444 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893382 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-sys-fs\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.893444 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893402 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-run\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893444 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893405 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-kubernetes\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893444 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893431 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-modprobe-d\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893444 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893439 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-host\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893444 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893443 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-etc-kubernetes\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893464 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6911a95-1c4c-4513-b688-98a7456d5642-host\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893481 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ts99x\" (UniqueName: \"kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x\") pod \"network-check-target-6cpjm\" (UID: \"85d15dde-b7d7-4309-980e-49ce8977c219\") " pod="openshift-network-diagnostics/network-check-target-6cpjm" Apr 22 18:46:40.893848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893484 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-sys-fs\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.893848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893533 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-device-dir\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.893848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893559 2566 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-etc-selinux\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.893848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893583 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7k96\" (UniqueName: \"kubernetes.io/projected/b6911a95-1c4c-4513-b688-98a7456d5642-kube-api-access-s7k96\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.893848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893608 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-socket-dir\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.893848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893632 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-registration-dir\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.893848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893679 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-etc-selinux\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.893848 
ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893699 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-registration-dir\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.893848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893727 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-device-dir\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.893848 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.893800 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bbe988ee-649d-4eed-9871-40def68d89af-socket-dir\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.895153 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.895129 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6911a95-1c4c-4513-b688-98a7456d5642-tmp\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.895424 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.895405 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b6911a95-1c4c-4513-b688-98a7456d5642-etc-tuned\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 
18:46:40.898487 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:40.898463 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:40.898487 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:40.898489 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:40.898646 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:40.898504 2566 projected.go:194] Error preparing data for projected volume kube-api-access-ts99x for pod openshift-network-diagnostics/network-check-target-6cpjm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:40.898646 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:40.898620 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x podName:85d15dde-b7d7-4309-980e-49ce8977c219 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:41.39859682 +0000 UTC m=+3.177430317 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ts99x" (UniqueName: "kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x") pod "network-check-target-6cpjm" (UID: "85d15dde-b7d7-4309-980e-49ce8977c219") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:40.900596 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.900527 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm486\" (UniqueName: \"kubernetes.io/projected/bbe988ee-649d-4eed-9871-40def68d89af-kube-api-access-mm486\") pod \"aws-ebs-csi-driver-node-p6bnc\" (UID: \"bbe988ee-649d-4eed-9871-40def68d89af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:40.900596 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.900557 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7k96\" (UniqueName: \"kubernetes.io/projected/b6911a95-1c4c-4513-b688-98a7456d5642-kube-api-access-s7k96\") pod \"tuned-xwhq6\" (UID: \"b6911a95-1c4c-4513-b688-98a7456d5642\") " pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:40.977782 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.977745 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bt425" Apr 22 18:46:40.985773 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.985749 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-x8klk" Apr 22 18:46:40.993351 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.993327 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6xjch" Apr 22 18:46:40.998911 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:40.998889 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-gw5cp" Apr 22 18:46:41.004491 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.004472 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wj7nl" Apr 22 18:46:41.012012 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.011994 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-94k7v" Apr 22 18:46:41.018583 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.018563 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:46:41.025093 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.025077 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" Apr 22 18:46:41.028665 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.028647 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" Apr 22 18:46:41.295595 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.295510 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs\") pod \"network-metrics-daemon-7zj27\" (UID: \"66e2806a-82a6-444a-a19a-13641a426bed\") " pod="openshift-multus/network-metrics-daemon-7zj27" Apr 22 18:46:41.295744 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:41.295679 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:41.295810 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:41.295752 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs podName:66e2806a-82a6-444a-a19a-13641a426bed nodeName:}" failed. No retries permitted until 2026-04-22 18:46:42.295729473 +0000 UTC m=+4.074562937 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs") pod "network-metrics-daemon-7zj27" (UID: "66e2806a-82a6-444a-a19a-13641a426bed") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:41.396011 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:41.395971 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbe988ee_649d_4eed_9871_40def68d89af.slice/crio-afc8d22e6c8dc88c8661edf2ff4af27a1184895578ec71ef961bfd24395f2bd0 WatchSource:0}: Error finding container afc8d22e6c8dc88c8661edf2ff4af27a1184895578ec71ef961bfd24395f2bd0: Status 404 returned error can't find the container with id afc8d22e6c8dc88c8661edf2ff4af27a1184895578ec71ef961bfd24395f2bd0 Apr 22 18:46:41.397482 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:41.397407 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffe6e393_6faa_47fe_a818_c1ffc6dda35c.slice/crio-51de28d8fb7408cdd5aa8e9417ad34982abcb6233ad45cdad3fc304734552b25 WatchSource:0}: Error finding container 51de28d8fb7408cdd5aa8e9417ad34982abcb6233ad45cdad3fc304734552b25: Status 404 returned error can't find the container with id 51de28d8fb7408cdd5aa8e9417ad34982abcb6233ad45cdad3fc304734552b25 Apr 22 18:46:41.400493 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:41.400468 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6911a95_1c4c_4513_b688_98a7456d5642.slice/crio-7e1b5b28ba78b124b1575191e565c18f86b779601414fec880d6bb8a9114ec17 WatchSource:0}: Error finding container 7e1b5b28ba78b124b1575191e565c18f86b779601414fec880d6bb8a9114ec17: Status 404 returned error can't find the container with id 7e1b5b28ba78b124b1575191e565c18f86b779601414fec880d6bb8a9114ec17 Apr 22 18:46:41.401190 
ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:41.401154 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2780ff15_4023_4008_904e_d48820d8e1d9.slice/crio-09ab300ba01e07d2ffce5dd701a826d9560b5aea7cbff155a07da5cec2972a6c WatchSource:0}: Error finding container 09ab300ba01e07d2ffce5dd701a826d9560b5aea7cbff155a07da5cec2972a6c: Status 404 returned error can't find the container with id 09ab300ba01e07d2ffce5dd701a826d9560b5aea7cbff155a07da5cec2972a6c Apr 22 18:46:41.402079 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:41.402046 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode92d5c0b_3e38_4729_bb74_4ca59dfb7f0f.slice/crio-6876ff2471a7ad79c0173bfa52c1ad1a07de799a0b244aaf236b66b89a592015 WatchSource:0}: Error finding container 6876ff2471a7ad79c0173bfa52c1ad1a07de799a0b244aaf236b66b89a592015: Status 404 returned error can't find the container with id 6876ff2471a7ad79c0173bfa52c1ad1a07de799a0b244aaf236b66b89a592015 Apr 22 18:46:41.403737 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:41.403713 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod284cb099_6d1a_47cf_883b_5ada5cd11fa0.slice/crio-96e362e0cbb075fc7ebbfeaeb51ff0b768526393a1e84bdf499a8eed3dac5130 WatchSource:0}: Error finding container 96e362e0cbb075fc7ebbfeaeb51ff0b768526393a1e84bdf499a8eed3dac5130: Status 404 returned error can't find the container with id 96e362e0cbb075fc7ebbfeaeb51ff0b768526393a1e84bdf499a8eed3dac5130 Apr 22 18:46:41.404596 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:41.404577 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3952e485_121f_46cf_820b_24e6a13ec3bc.slice/crio-5c420faed819da697576e57e8a970aa5e69e9db7a5613aaa4c2f845e81f4f64a WatchSource:0}: 
Error finding container 5c420faed819da697576e57e8a970aa5e69e9db7a5613aaa4c2f845e81f4f64a: Status 404 returned error can't find the container with id 5c420faed819da697576e57e8a970aa5e69e9db7a5613aaa4c2f845e81f4f64a Apr 22 18:46:41.407396 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:41.407368 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c183a68_147a_4cc9_8e01_6e6df3a46a7b.slice/crio-b68bab11716206011d61023f12a19a91a6d807cbd4c7d63ba5b32892ea7b5e46 WatchSource:0}: Error finding container b68bab11716206011d61023f12a19a91a6d807cbd4c7d63ba5b32892ea7b5e46: Status 404 returned error can't find the container with id b68bab11716206011d61023f12a19a91a6d807cbd4c7d63ba5b32892ea7b5e46 Apr 22 18:46:41.408603 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:46:41.408579 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod462b5fca_7ee5_444f_af12_38942635aa2f.slice/crio-7ae34038adafc34db2a09828bd25553c03615ed814373f31b45147ee2f8fda27 WatchSource:0}: Error finding container 7ae34038adafc34db2a09828bd25553c03615ed814373f31b45147ee2f8fda27: Status 404 returned error can't find the container with id 7ae34038adafc34db2a09828bd25553c03615ed814373f31b45147ee2f8fda27 Apr 22 18:46:41.496350 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.496323 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ts99x\" (UniqueName: \"kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x\") pod \"network-check-target-6cpjm\" (UID: \"85d15dde-b7d7-4309-980e-49ce8977c219\") " pod="openshift-network-diagnostics/network-check-target-6cpjm" Apr 22 18:46:41.496500 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:41.496481 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:41.496576 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:41.496505 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:41.496576 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:41.496515 2566 projected.go:194] Error preparing data for projected volume kube-api-access-ts99x for pod openshift-network-diagnostics/network-check-target-6cpjm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:41.496686 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:41.496615 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x podName:85d15dde-b7d7-4309-980e-49ce8977c219 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:42.496595405 +0000 UTC m=+4.275428890 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ts99x" (UniqueName: "kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x") pod "network-check-target-6cpjm" (UID: "85d15dde-b7d7-4309-980e-49ce8977c219") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:41.705953 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.705903 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:39 +0000 UTC" deadline="2028-01-08 09:58:53.21257394 +0000 UTC" Apr 22 18:46:41.705953 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.705929 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15015h12m11.506647737s" Apr 22 18:46:41.832662 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.832568 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gw5cp" event={"ID":"3952e485-121f-46cf-820b-24e6a13ec3bc","Type":"ContainerStarted","Data":"5c420faed819da697576e57e8a970aa5e69e9db7a5613aaa4c2f845e81f4f64a"} Apr 22 18:46:41.834028 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.833989 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-x8klk" event={"ID":"284cb099-6d1a-47cf-883b-5ada5cd11fa0","Type":"ContainerStarted","Data":"96e362e0cbb075fc7ebbfeaeb51ff0b768526393a1e84bdf499a8eed3dac5130"} Apr 22 18:46:41.836274 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.836251 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6xjch" event={"ID":"e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f","Type":"ContainerStarted","Data":"6876ff2471a7ad79c0173bfa52c1ad1a07de799a0b244aaf236b66b89a592015"} Apr 22 18:46:41.837930 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.837907 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-94k7v" event={"ID":"2780ff15-4023-4008-904e-d48820d8e1d9","Type":"ContainerStarted","Data":"09ab300ba01e07d2ffce5dd701a826d9560b5aea7cbff155a07da5cec2972a6c"}
Apr 22 18:46:41.841204 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.840743 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" event={"ID":"bbe988ee-649d-4eed-9871-40def68d89af","Type":"ContainerStarted","Data":"afc8d22e6c8dc88c8661edf2ff4af27a1184895578ec71ef961bfd24395f2bd0"}
Apr 22 18:46:41.846485 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.846436 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" event={"ID":"3c183a68-147a-4cc9-8e01-6e6df3a46a7b","Type":"ContainerStarted","Data":"b68bab11716206011d61023f12a19a91a6d807cbd4c7d63ba5b32892ea7b5e46"}
Apr 22 18:46:41.852849 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.852779 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" event={"ID":"b6911a95-1c4c-4513-b688-98a7456d5642","Type":"ContainerStarted","Data":"7e1b5b28ba78b124b1575191e565c18f86b779601414fec880d6bb8a9114ec17"}
Apr 22 18:46:41.860154 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.860129 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bt425" event={"ID":"ffe6e393-6faa-47fe-a818-c1ffc6dda35c","Type":"ContainerStarted","Data":"51de28d8fb7408cdd5aa8e9417ad34982abcb6233ad45cdad3fc304734552b25"}
Apr 22 18:46:41.864536 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.864514 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-234.ec2.internal" event={"ID":"cd060d06ccaccd228d0dac1aa712346b","Type":"ContainerStarted","Data":"a6ddcd9197b709460250c33649289eaaef27fc7f19fc708994a389c4d73cc11b"}
Apr 22 18:46:41.867917 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:41.867877 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wj7nl" event={"ID":"462b5fca-7ee5-444f-af12-38942635aa2f","Type":"ContainerStarted","Data":"7ae34038adafc34db2a09828bd25553c03615ed814373f31b45147ee2f8fda27"}
Apr 22 18:46:42.303538 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:42.303501 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs\") pod \"network-metrics-daemon-7zj27\" (UID: \"66e2806a-82a6-444a-a19a-13641a426bed\") " pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:46:42.303684 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:42.303664 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:42.303994 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:42.303976 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs podName:66e2806a-82a6-444a-a19a-13641a426bed nodeName:}" failed. No retries permitted until 2026-04-22 18:46:44.30370572 +0000 UTC m=+6.082539185 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs") pod "network-metrics-daemon-7zj27" (UID: "66e2806a-82a6-444a-a19a-13641a426bed") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:42.505977 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:42.505304 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ts99x\" (UniqueName: \"kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x\") pod \"network-check-target-6cpjm\" (UID: \"85d15dde-b7d7-4309-980e-49ce8977c219\") " pod="openshift-network-diagnostics/network-check-target-6cpjm"
Apr 22 18:46:42.505977 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:42.505477 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:42.505977 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:42.505496 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:42.505977 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:42.505508 2566 projected.go:194] Error preparing data for projected volume kube-api-access-ts99x for pod openshift-network-diagnostics/network-check-target-6cpjm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:42.505977 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:42.505568 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x podName:85d15dde-b7d7-4309-980e-49ce8977c219 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:44.505549522 +0000 UTC m=+6.284382998 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ts99x" (UniqueName: "kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x") pod "network-check-target-6cpjm" (UID: "85d15dde-b7d7-4309-980e-49ce8977c219") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:42.825855 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:42.825773 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm"
Apr 22 18:46:42.826315 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:42.825905 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cpjm" podUID="85d15dde-b7d7-4309-980e-49ce8977c219"
Apr 22 18:46:42.826387 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:42.826355 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:46:42.826475 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:42.826454 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zj27" podUID="66e2806a-82a6-444a-a19a-13641a426bed"
Apr 22 18:46:42.879703 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:42.878554 2566 generic.go:358] "Generic (PLEG): container finished" podID="b362398358ad211b5d4965dd1cf35b04" containerID="ba0636352dacf612d2f30e2b968f44a36c6f3689449c74da83f100dece29987e" exitCode=0
Apr 22 18:46:42.879703 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:42.879402 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal" event={"ID":"b362398358ad211b5d4965dd1cf35b04","Type":"ContainerDied","Data":"ba0636352dacf612d2f30e2b968f44a36c6f3689449c74da83f100dece29987e"}
Apr 22 18:46:42.892668 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:42.892623 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-234.ec2.internal" podStartSLOduration=3.892608508 podStartE2EDuration="3.892608508s" podCreationTimestamp="2026-04-22 18:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:41.876212246 +0000 UTC m=+3.655045735" watchObservedRunningTime="2026-04-22 18:46:42.892608508 +0000 UTC m=+4.671441995"
Apr 22 18:46:43.888245 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:43.888177 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal" event={"ID":"b362398358ad211b5d4965dd1cf35b04","Type":"ContainerStarted","Data":"328c707e680f2086412debee65fc758aef40c4a93f2934d892c01e0a5de41411"}
Apr 22 18:46:43.901725 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:43.901672 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-234.ec2.internal" podStartSLOduration=4.901654192 podStartE2EDuration="4.901654192s" podCreationTimestamp="2026-04-22 18:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:43.900890893 +0000 UTC m=+5.679724382" watchObservedRunningTime="2026-04-22 18:46:43.901654192 +0000 UTC m=+5.680487680"
Apr 22 18:46:44.320918 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:44.320835 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs\") pod \"network-metrics-daemon-7zj27\" (UID: \"66e2806a-82a6-444a-a19a-13641a426bed\") " pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:46:44.321118 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:44.321061 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:44.321181 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:44.321130 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs podName:66e2806a-82a6-444a-a19a-13641a426bed nodeName:}" failed. No retries permitted until 2026-04-22 18:46:48.321109413 +0000 UTC m=+10.099942877 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs") pod "network-metrics-daemon-7zj27" (UID: "66e2806a-82a6-444a-a19a-13641a426bed") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:44.521777 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:44.521738 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ts99x\" (UniqueName: \"kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x\") pod \"network-check-target-6cpjm\" (UID: \"85d15dde-b7d7-4309-980e-49ce8977c219\") " pod="openshift-network-diagnostics/network-check-target-6cpjm"
Apr 22 18:46:44.521976 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:44.521910 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:44.521976 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:44.521928 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:44.521976 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:44.521953 2566 projected.go:194] Error preparing data for projected volume kube-api-access-ts99x for pod openshift-network-diagnostics/network-check-target-6cpjm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:44.522166 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:44.522010 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x podName:85d15dde-b7d7-4309-980e-49ce8977c219 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:48.521991635 +0000 UTC m=+10.300825108 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ts99x" (UniqueName: "kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x") pod "network-check-target-6cpjm" (UID: "85d15dde-b7d7-4309-980e-49ce8977c219") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:44.828534 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:44.828505 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm"
Apr 22 18:46:44.828720 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:44.828620 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cpjm" podUID="85d15dde-b7d7-4309-980e-49ce8977c219"
Apr 22 18:46:44.828865 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:44.828728 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:46:44.828865 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:44.828818 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zj27" podUID="66e2806a-82a6-444a-a19a-13641a426bed"
Apr 22 18:46:45.562606 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:45.562233 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xtfvj"]
Apr 22 18:46:45.565075 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:45.565049 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:46:45.565213 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:45.565150 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xtfvj" podUID="55779771-0d58-4918-9769-95bf4525dc2d"
Apr 22 18:46:45.632781 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:45.632741 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/55779771-0d58-4918-9769-95bf4525dc2d-kubelet-config\") pod \"global-pull-secret-syncer-xtfvj\" (UID: \"55779771-0d58-4918-9769-95bf4525dc2d\") " pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:46:45.632984 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:45.632791 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret\") pod \"global-pull-secret-syncer-xtfvj\" (UID: \"55779771-0d58-4918-9769-95bf4525dc2d\") " pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:46:45.632984 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:45.632831 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/55779771-0d58-4918-9769-95bf4525dc2d-dbus\") pod \"global-pull-secret-syncer-xtfvj\" (UID: \"55779771-0d58-4918-9769-95bf4525dc2d\") " pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:46:45.733698 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:45.733621 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/55779771-0d58-4918-9769-95bf4525dc2d-kubelet-config\") pod \"global-pull-secret-syncer-xtfvj\" (UID: \"55779771-0d58-4918-9769-95bf4525dc2d\") " pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:46:45.733698 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:45.733675 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret\") pod \"global-pull-secret-syncer-xtfvj\" (UID: \"55779771-0d58-4918-9769-95bf4525dc2d\") " pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:46:45.734074 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:45.733711 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/55779771-0d58-4918-9769-95bf4525dc2d-dbus\") pod \"global-pull-secret-syncer-xtfvj\" (UID: \"55779771-0d58-4918-9769-95bf4525dc2d\") " pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:46:45.734074 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:45.733736 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/55779771-0d58-4918-9769-95bf4525dc2d-kubelet-config\") pod \"global-pull-secret-syncer-xtfvj\" (UID: \"55779771-0d58-4918-9769-95bf4525dc2d\") " pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:46:45.734074 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:45.733884 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/55779771-0d58-4918-9769-95bf4525dc2d-dbus\") pod \"global-pull-secret-syncer-xtfvj\" (UID: \"55779771-0d58-4918-9769-95bf4525dc2d\") " pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:46:45.734074 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:45.733889 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:45.734074 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:45.733977 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret podName:55779771-0d58-4918-9769-95bf4525dc2d nodeName:}" failed. No retries permitted until 2026-04-22 18:46:46.233958252 +0000 UTC m=+8.012791720 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret") pod "global-pull-secret-syncer-xtfvj" (UID: "55779771-0d58-4918-9769-95bf4525dc2d") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:46.239840 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:46.239239 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret\") pod \"global-pull-secret-syncer-xtfvj\" (UID: \"55779771-0d58-4918-9769-95bf4525dc2d\") " pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:46:46.239840 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:46.239407 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:46.239840 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:46.239474 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret podName:55779771-0d58-4918-9769-95bf4525dc2d nodeName:}" failed. No retries permitted until 2026-04-22 18:46:47.239455705 +0000 UTC m=+9.018289174 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret") pod "global-pull-secret-syncer-xtfvj" (UID: "55779771-0d58-4918-9769-95bf4525dc2d") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:46.826161 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:46.825424 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:46:46.826161 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:46.825567 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zj27" podUID="66e2806a-82a6-444a-a19a-13641a426bed"
Apr 22 18:46:46.826161 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:46.826023 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm"
Apr 22 18:46:46.826161 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:46.826115 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cpjm" podUID="85d15dde-b7d7-4309-980e-49ce8977c219"
Apr 22 18:46:47.247103 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:47.247060 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret\") pod \"global-pull-secret-syncer-xtfvj\" (UID: \"55779771-0d58-4918-9769-95bf4525dc2d\") " pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:46:47.247302 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:47.247276 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:47.247405 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:47.247393 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret podName:55779771-0d58-4918-9769-95bf4525dc2d nodeName:}" failed. No retries permitted until 2026-04-22 18:46:49.247370683 +0000 UTC m=+11.026204169 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret") pod "global-pull-secret-syncer-xtfvj" (UID: "55779771-0d58-4918-9769-95bf4525dc2d") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:47.825315 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:47.825286 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:46:47.825503 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:47.825422 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xtfvj" podUID="55779771-0d58-4918-9769-95bf4525dc2d"
Apr 22 18:46:48.356246 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:48.356209 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs\") pod \"network-metrics-daemon-7zj27\" (UID: \"66e2806a-82a6-444a-a19a-13641a426bed\") " pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:46:48.356722 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:48.356419 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:48.356722 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:48.356488 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs podName:66e2806a-82a6-444a-a19a-13641a426bed nodeName:}" failed. No retries permitted until 2026-04-22 18:46:56.356466324 +0000 UTC m=+18.135299788 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs") pod "network-metrics-daemon-7zj27" (UID: "66e2806a-82a6-444a-a19a-13641a426bed") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:48.558294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:48.558238 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ts99x\" (UniqueName: \"kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x\") pod \"network-check-target-6cpjm\" (UID: \"85d15dde-b7d7-4309-980e-49ce8977c219\") " pod="openshift-network-diagnostics/network-check-target-6cpjm"
Apr 22 18:46:48.558515 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:48.558412 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:48.558515 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:48.558434 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:48.558515 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:48.558449 2566 projected.go:194] Error preparing data for projected volume kube-api-access-ts99x for pod openshift-network-diagnostics/network-check-target-6cpjm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:48.558628 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:48.558515 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x podName:85d15dde-b7d7-4309-980e-49ce8977c219 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:56.5584888 +0000 UTC m=+18.337322269 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ts99x" (UniqueName: "kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x") pod "network-check-target-6cpjm" (UID: "85d15dde-b7d7-4309-980e-49ce8977c219") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:48.830012 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:48.828086 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:46:48.830012 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:48.828260 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zj27" podUID="66e2806a-82a6-444a-a19a-13641a426bed"
Apr 22 18:46:48.830012 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:48.828323 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm"
Apr 22 18:46:48.830012 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:48.828424 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cpjm" podUID="85d15dde-b7d7-4309-980e-49ce8977c219"
Apr 22 18:46:49.263542 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:49.263506 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret\") pod \"global-pull-secret-syncer-xtfvj\" (UID: \"55779771-0d58-4918-9769-95bf4525dc2d\") " pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:46:49.263726 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:49.263679 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:49.263789 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:49.263761 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret podName:55779771-0d58-4918-9769-95bf4525dc2d nodeName:}" failed. No retries permitted until 2026-04-22 18:46:53.263736829 +0000 UTC m=+15.042570301 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret") pod "global-pull-secret-syncer-xtfvj" (UID: "55779771-0d58-4918-9769-95bf4525dc2d") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:49.825457 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:49.825422 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:46:49.825893 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:49.825561 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xtfvj" podUID="55779771-0d58-4918-9769-95bf4525dc2d"
Apr 22 18:46:50.825081 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:50.824991 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:46:50.825246 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:50.824991 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm"
Apr 22 18:46:50.825246 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:50.825132 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zj27" podUID="66e2806a-82a6-444a-a19a-13641a426bed"
Apr 22 18:46:50.825246 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:50.825186 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cpjm" podUID="85d15dde-b7d7-4309-980e-49ce8977c219"
Apr 22 18:46:51.825323 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:51.825294 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:46:51.825737 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:51.825411 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xtfvj" podUID="55779771-0d58-4918-9769-95bf4525dc2d"
Apr 22 18:46:52.825411 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:52.825375 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:46:52.825881 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:52.825384 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm"
Apr 22 18:46:52.825881 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:52.825566 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zj27" podUID="66e2806a-82a6-444a-a19a-13641a426bed"
Apr 22 18:46:52.825881 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:52.825644 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cpjm" podUID="85d15dde-b7d7-4309-980e-49ce8977c219"
Apr 22 18:46:53.293446 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:53.293373 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret\") pod \"global-pull-secret-syncer-xtfvj\" (UID: \"55779771-0d58-4918-9769-95bf4525dc2d\") " pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:46:53.293624 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:53.293504 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:53.293624 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:53.293574 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret podName:55779771-0d58-4918-9769-95bf4525dc2d nodeName:}" failed. No retries permitted until 2026-04-22 18:47:01.293555918 +0000 UTC m=+23.072389409 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret") pod "global-pull-secret-syncer-xtfvj" (UID: "55779771-0d58-4918-9769-95bf4525dc2d") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:53.825466 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:53.825437 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:46:53.825812 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:53.825525 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xtfvj" podUID="55779771-0d58-4918-9769-95bf4525dc2d"
Apr 22 18:46:54.825347 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:54.825310 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:46:54.825523 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:54.825348 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm"
Apr 22 18:46:54.825523 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:54.825451 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-7zj27" podUID="66e2806a-82a6-444a-a19a-13641a426bed" Apr 22 18:46:54.825891 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:54.825570 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cpjm" podUID="85d15dde-b7d7-4309-980e-49ce8977c219" Apr 22 18:46:55.825386 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:55.825353 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xtfvj" Apr 22 18:46:55.825612 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:55.825458 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xtfvj" podUID="55779771-0d58-4918-9769-95bf4525dc2d" Apr 22 18:46:56.416465 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:56.416424 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs\") pod \"network-metrics-daemon-7zj27\" (UID: \"66e2806a-82a6-444a-a19a-13641a426bed\") " pod="openshift-multus/network-metrics-daemon-7zj27" Apr 22 18:46:56.416633 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:56.416567 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:56.416712 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:56.416642 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs podName:66e2806a-82a6-444a-a19a-13641a426bed nodeName:}" failed. No retries permitted until 2026-04-22 18:47:12.416620179 +0000 UTC m=+34.195453642 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs") pod "network-metrics-daemon-7zj27" (UID: "66e2806a-82a6-444a-a19a-13641a426bed") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:56.618343 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:56.618302 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ts99x\" (UniqueName: \"kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x\") pod \"network-check-target-6cpjm\" (UID: \"85d15dde-b7d7-4309-980e-49ce8977c219\") " pod="openshift-network-diagnostics/network-check-target-6cpjm" Apr 22 18:46:56.618515 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:56.618467 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:56.618515 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:56.618492 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:56.618515 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:56.618505 2566 projected.go:194] Error preparing data for projected volume kube-api-access-ts99x for pod openshift-network-diagnostics/network-check-target-6cpjm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:56.618628 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:56.618558 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x podName:85d15dde-b7d7-4309-980e-49ce8977c219 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:47:12.618544099 +0000 UTC m=+34.397377562 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ts99x" (UniqueName: "kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x") pod "network-check-target-6cpjm" (UID: "85d15dde-b7d7-4309-980e-49ce8977c219") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:56.827435 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:56.827362 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27" Apr 22 18:46:56.827846 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:56.827364 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm" Apr 22 18:46:56.827846 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:56.827461 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zj27" podUID="66e2806a-82a6-444a-a19a-13641a426bed" Apr 22 18:46:56.827846 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:56.827550 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6cpjm" podUID="85d15dde-b7d7-4309-980e-49ce8977c219" Apr 22 18:46:57.825790 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:57.825754 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xtfvj" Apr 22 18:46:57.825982 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:57.825895 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xtfvj" podUID="55779771-0d58-4918-9769-95bf4525dc2d" Apr 22 18:46:58.829202 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:58.828324 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm" Apr 22 18:46:58.829202 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:58.828623 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cpjm" podUID="85d15dde-b7d7-4309-980e-49ce8977c219" Apr 22 18:46:58.829202 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:58.829021 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27" Apr 22 18:46:58.829202 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:58.829145 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zj27" podUID="66e2806a-82a6-444a-a19a-13641a426bed" Apr 22 18:46:58.915432 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:58.915085 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gw5cp" event={"ID":"3952e485-121f-46cf-820b-24e6a13ec3bc","Type":"ContainerStarted","Data":"de9c18420496d3a4d4460c4c34278574e6a5e0cc79735db918cdf7e804caf267"} Apr 22 18:46:58.916746 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:58.916681 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-x8klk" event={"ID":"284cb099-6d1a-47cf-883b-5ada5cd11fa0","Type":"ContainerStarted","Data":"69aaf0ee0f0f87a0c0febdeb104321aafa9667f243b737803de41eec27f48f7b"} Apr 22 18:46:58.918849 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:58.918827 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" event={"ID":"b6911a95-1c4c-4513-b688-98a7456d5642","Type":"ContainerStarted","Data":"ff125b2e55fd2f7d32252c732679ee2ea26aa8750cd57bd3b3a851eea7292666"} Apr 22 18:46:58.921165 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:58.921103 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bt425" event={"ID":"ffe6e393-6faa-47fe-a818-c1ffc6dda35c","Type":"ContainerStarted","Data":"5fa1b2d28945b8934e61fc5e3fe2fa2aa1d1537c14e047ad6dea661685fe3a2b"} Apr 22 18:46:58.945980 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:58.945918 2566 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-x8klk" podStartSLOduration=8.436642182 podStartE2EDuration="20.945906958s" podCreationTimestamp="2026-04-22 18:46:38 +0000 UTC" firstStartedPulling="2026-04-22 18:46:41.406179785 +0000 UTC m=+3.185013256" lastFinishedPulling="2026-04-22 18:46:53.915444568 +0000 UTC m=+15.694278032" observedRunningTime="2026-04-22 18:46:58.932020679 +0000 UTC m=+20.710854166" watchObservedRunningTime="2026-04-22 18:46:58.945906958 +0000 UTC m=+20.724740443" Apr 22 18:46:58.963640 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:58.963246 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xwhq6" podStartSLOduration=2.721238412 podStartE2EDuration="19.963235055s" podCreationTimestamp="2026-04-22 18:46:39 +0000 UTC" firstStartedPulling="2026-04-22 18:46:41.402011369 +0000 UTC m=+3.180844836" lastFinishedPulling="2026-04-22 18:46:58.644008011 +0000 UTC m=+20.422841479" observedRunningTime="2026-04-22 18:46:58.945799012 +0000 UTC m=+20.724632498" watchObservedRunningTime="2026-04-22 18:46:58.963235055 +0000 UTC m=+20.742068540" Apr 22 18:46:59.570856 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.570692 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-x8klk" Apr 22 18:46:59.825715 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.825692 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xtfvj" Apr 22 18:46:59.825839 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:46:59.825797 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xtfvj" podUID="55779771-0d58-4918-9769-95bf4525dc2d" Apr 22 18:46:59.852127 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.852082 2566 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:46:59.924097 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.924067 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wj7nl" event={"ID":"462b5fca-7ee5-444f-af12-38942635aa2f","Type":"ContainerStarted","Data":"53e658a04516c900815954930e48ca5bbc3a90ccc7ad222ef0584fc854bd0a85"} Apr 22 18:46:59.925251 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.925229 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6xjch" event={"ID":"e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f","Type":"ContainerStarted","Data":"f09ecbec1f2e2d25c97dc80f3f6eb201ddcad5ae30c466a00d4414a1985c23b8"} Apr 22 18:46:59.926426 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.926384 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-94k7v" event={"ID":"2780ff15-4023-4008-904e-d48820d8e1d9","Type":"ContainerStarted","Data":"0f541c8f5f305abda9ee06ce19e11bfea7616dc9efa4946f4116c6dc507c8d75"} Apr 22 18:46:59.927804 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.927778 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" event={"ID":"bbe988ee-649d-4eed-9871-40def68d89af","Type":"ContainerStarted","Data":"075d85f7f451ff26d688e16e993777e083dc5e905751373c87bc80f91d74d280"} Apr 22 18:46:59.927804 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.927803 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" 
event={"ID":"bbe988ee-649d-4eed-9871-40def68d89af","Type":"ContainerStarted","Data":"99c0ef8e7fb879e6d445ecbc5e5862eb586ebe8e8dfff55a2f3ad6aa5392033a"} Apr 22 18:46:59.930186 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.930166 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" event={"ID":"3c183a68-147a-4cc9-8e01-6e6df3a46a7b","Type":"ContainerStarted","Data":"fd8ab6ebabc0cf83232a490ac46815cc374fcdd928e7b0bcd928ee4b3076efd3"} Apr 22 18:46:59.930272 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.930194 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" event={"ID":"3c183a68-147a-4cc9-8e01-6e6df3a46a7b","Type":"ContainerStarted","Data":"bfeffc9d5c75cdd5bb9b5f3edcd41d5460ad80c56bab7c62368fdeef08e856e1"} Apr 22 18:46:59.930272 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.930208 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" event={"ID":"3c183a68-147a-4cc9-8e01-6e6df3a46a7b","Type":"ContainerStarted","Data":"0daa4d4dbece810bcccd39ba75281e76943ee9bca096dcb9480daf6d630896e3"} Apr 22 18:46:59.930272 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.930220 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" event={"ID":"3c183a68-147a-4cc9-8e01-6e6df3a46a7b","Type":"ContainerStarted","Data":"7429425098d686bf87d82dc10631a7346eec969614f41672862286ea244aa0c7"} Apr 22 18:46:59.930272 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.930231 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" event={"ID":"3c183a68-147a-4cc9-8e01-6e6df3a46a7b","Type":"ContainerStarted","Data":"ddf95bbabb5552ed23f0e28c109775fedfab127e8b23aad84bcd484a9f40b6b9"} Apr 22 18:46:59.930272 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.930242 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" event={"ID":"3c183a68-147a-4cc9-8e01-6e6df3a46a7b","Type":"ContainerStarted","Data":"dfc306d8a420a0e69c4b641560e9ba54f454e8e088b619565296a9b710c63609"} Apr 22 18:46:59.931381 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.931360 2566 generic.go:358] "Generic (PLEG): container finished" podID="ffe6e393-6faa-47fe-a818-c1ffc6dda35c" containerID="5fa1b2d28945b8934e61fc5e3fe2fa2aa1d1537c14e047ad6dea661685fe3a2b" exitCode=0 Apr 22 18:46:59.931473 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.931451 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bt425" event={"ID":"ffe6e393-6faa-47fe-a818-c1ffc6dda35c","Type":"ContainerDied","Data":"5fa1b2d28945b8934e61fc5e3fe2fa2aa1d1537c14e047ad6dea661685fe3a2b"} Apr 22 18:46:59.947744 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.947680 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wj7nl" podStartSLOduration=4.081312265 podStartE2EDuration="20.947669037s" podCreationTimestamp="2026-04-22 18:46:39 +0000 UTC" firstStartedPulling="2026-04-22 18:46:41.410367953 +0000 UTC m=+3.189201430" lastFinishedPulling="2026-04-22 18:46:58.276724725 +0000 UTC m=+20.055558202" observedRunningTime="2026-04-22 18:46:59.935654698 +0000 UTC m=+21.714488183" watchObservedRunningTime="2026-04-22 18:46:59.947669037 +0000 UTC m=+21.726502523" Apr 22 18:46:59.947821 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.947773 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-94k7v" podStartSLOduration=3.7066169860000002 podStartE2EDuration="20.947769341s" podCreationTimestamp="2026-04-22 18:46:39 +0000 UTC" firstStartedPulling="2026-04-22 18:46:41.402860182 +0000 UTC m=+3.181693663" lastFinishedPulling="2026-04-22 18:46:58.64401254 +0000 UTC m=+20.422846018" observedRunningTime="2026-04-22 
18:46:59.947511751 +0000 UTC m=+21.726345237" watchObservedRunningTime="2026-04-22 18:46:59.947769341 +0000 UTC m=+21.726602850" Apr 22 18:46:59.998831 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.998795 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gw5cp" podStartSLOduration=4.73435237 podStartE2EDuration="21.998784808s" podCreationTimestamp="2026-04-22 18:46:38 +0000 UTC" firstStartedPulling="2026-04-22 18:46:41.408474532 +0000 UTC m=+3.187308012" lastFinishedPulling="2026-04-22 18:46:58.672906971 +0000 UTC m=+20.451740450" observedRunningTime="2026-04-22 18:46:59.998715738 +0000 UTC m=+21.777549224" watchObservedRunningTime="2026-04-22 18:46:59.998784808 +0000 UTC m=+21.777618295" Apr 22 18:46:59.999034 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:46:59.999014 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6xjch" podStartSLOduration=4.758812114 podStartE2EDuration="21.999008907s" podCreationTimestamp="2026-04-22 18:46:38 +0000 UTC" firstStartedPulling="2026-04-22 18:46:41.403887851 +0000 UTC m=+3.182721322" lastFinishedPulling="2026-04-22 18:46:58.644084639 +0000 UTC m=+20.422918115" observedRunningTime="2026-04-22 18:46:59.979577627 +0000 UTC m=+21.758411113" watchObservedRunningTime="2026-04-22 18:46:59.999008907 +0000 UTC m=+21.777842395" Apr 22 18:47:00.212468 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:00.212395 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-x8klk" Apr 22 18:47:00.213071 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:00.213051 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-x8klk" Apr 22 18:47:00.748088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:00.747772 2566 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:46:59.852098933Z","UUID":"2d21fcc5-b835-4f7f-8ccc-d516c7eec04c","Handler":null,"Name":"","Endpoint":""} Apr 22 18:47:00.750846 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:00.750822 2566 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:47:00.750983 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:00.750855 2566 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:47:00.827668 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:00.827643 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27" Apr 22 18:47:00.827826 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:00.827643 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm" Apr 22 18:47:00.827826 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:00.827748 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zj27" podUID="66e2806a-82a6-444a-a19a-13641a426bed" Apr 22 18:47:00.827962 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:00.827856 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cpjm" podUID="85d15dde-b7d7-4309-980e-49ce8977c219" Apr 22 18:47:00.938999 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:00.938961 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" event={"ID":"bbe988ee-649d-4eed-9871-40def68d89af","Type":"ContainerStarted","Data":"dfd1068f18999e662b70fb4f6bcc9fa1bf484a9e95fda91a2ed04009c1eeb108"} Apr 22 18:47:00.939903 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:00.939882 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-x8klk" Apr 22 18:47:00.957087 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:00.957015 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p6bnc" podStartSLOduration=2.773254382 podStartE2EDuration="21.957002964s" podCreationTimestamp="2026-04-22 18:46:39 +0000 UTC" firstStartedPulling="2026-04-22 18:46:41.39835354 +0000 UTC m=+3.177187005" lastFinishedPulling="2026-04-22 18:47:00.582102123 +0000 UTC m=+22.360935587" observedRunningTime="2026-04-22 18:47:00.956794731 +0000 UTC m=+22.735628218" watchObservedRunningTime="2026-04-22 18:47:00.957002964 +0000 UTC m=+22.735836450" Apr 22 18:47:01.355667 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:01.355585 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret\") pod \"global-pull-secret-syncer-xtfvj\" (UID: \"55779771-0d58-4918-9769-95bf4525dc2d\") " pod="kube-system/global-pull-secret-syncer-xtfvj" Apr 22 18:47:01.355809 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:01.355747 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 
22 18:47:01.355878 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:01.355812 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret podName:55779771-0d58-4918-9769-95bf4525dc2d nodeName:}" failed. No retries permitted until 2026-04-22 18:47:17.355795371 +0000 UTC m=+39.134628837 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret") pod "global-pull-secret-syncer-xtfvj" (UID: "55779771-0d58-4918-9769-95bf4525dc2d") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:47:01.824926 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:01.824893 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xtfvj" Apr 22 18:47:01.825101 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:01.825012 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xtfvj" podUID="55779771-0d58-4918-9769-95bf4525dc2d" Apr 22 18:47:01.944420 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:01.944384 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" event={"ID":"3c183a68-147a-4cc9-8e01-6e6df3a46a7b","Type":"ContainerStarted","Data":"2d19b6685f996ee954e414b5728395fad779dfb27595e99b35985abfba787ff4"} Apr 22 18:47:02.825009 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:02.824973 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27" Apr 22 18:47:02.825009 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:02.825002 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm" Apr 22 18:47:02.825261 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:02.825119 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zj27" podUID="66e2806a-82a6-444a-a19a-13641a426bed" Apr 22 18:47:02.825261 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:02.825238 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cpjm" podUID="85d15dde-b7d7-4309-980e-49ce8977c219" Apr 22 18:47:03.824847 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:03.824759 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xtfvj" Apr 22 18:47:03.825480 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:03.824891 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xtfvj" podUID="55779771-0d58-4918-9769-95bf4525dc2d"
Apr 22 18:47:04.828048 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:04.827867 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm"
Apr 22 18:47:04.828482 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:04.827872 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:47:04.828482 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:04.828119 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cpjm" podUID="85d15dde-b7d7-4309-980e-49ce8977c219"
Apr 22 18:47:04.828482 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:04.828193 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zj27" podUID="66e2806a-82a6-444a-a19a-13641a426bed"
Apr 22 18:47:04.952955 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:04.952911 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" event={"ID":"3c183a68-147a-4cc9-8e01-6e6df3a46a7b","Type":"ContainerStarted","Data":"ee91a12ce54cdb0b249f180428458dd3a846732a85f26463cc4f76bf93d86b3a"}
Apr 22 18:47:04.953315 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:04.953273 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:47:04.954693 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:04.954654 2566 generic.go:358] "Generic (PLEG): container finished" podID="ffe6e393-6faa-47fe-a818-c1ffc6dda35c" containerID="f328d363063e091deb99e20113299223b485203d9e143e01df341bcedc98cfb0" exitCode=0
Apr 22 18:47:04.954792 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:04.954702 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bt425" event={"ID":"ffe6e393-6faa-47fe-a818-c1ffc6dda35c","Type":"ContainerDied","Data":"f328d363063e091deb99e20113299223b485203d9e143e01df341bcedc98cfb0"}
Apr 22 18:47:04.968198 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:04.968179 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:47:04.977997 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:04.977963 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" podStartSLOduration=8.354665899 podStartE2EDuration="25.977932546s" podCreationTimestamp="2026-04-22 18:46:39 +0000 UTC" firstStartedPulling="2026-04-22 18:46:41.410376546 +0000 UTC m=+3.189210010" lastFinishedPulling="2026-04-22 18:46:59.033643193 +0000 UTC m=+20.812476657" observedRunningTime="2026-04-22 18:47:04.976451876 +0000 UTC m=+26.755285363" watchObservedRunningTime="2026-04-22 18:47:04.977932546 +0000 UTC m=+26.756766040"
Apr 22 18:47:05.825581 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:05.825550 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:47:05.825711 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:05.825643 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xtfvj" podUID="55779771-0d58-4918-9769-95bf4525dc2d"
Apr 22 18:47:05.958492 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:05.958463 2566 generic.go:358] "Generic (PLEG): container finished" podID="ffe6e393-6faa-47fe-a818-c1ffc6dda35c" containerID="0b390e95e1692c150ad2b096e5ef124e67b02b8ad6a87b5579d3c9d22652b7db" exitCode=0
Apr 22 18:47:05.958956 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:05.958551 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bt425" event={"ID":"ffe6e393-6faa-47fe-a818-c1ffc6dda35c","Type":"ContainerDied","Data":"0b390e95e1692c150ad2b096e5ef124e67b02b8ad6a87b5579d3c9d22652b7db"}
Apr 22 18:47:05.958956 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:05.958726 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:47:05.958956 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:05.958782 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:47:05.977919 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:05.976508 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xtfvj"]
Apr 22 18:47:05.977919 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:05.976645 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:47:05.977919 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:05.976781 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xtfvj" podUID="55779771-0d58-4918-9769-95bf4525dc2d"
Apr 22 18:47:05.978476 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:05.978446 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk"
Apr 22 18:47:05.979285 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:05.979264 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7zj27"]
Apr 22 18:47:05.979372 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:05.979365 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:47:05.979494 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:05.979469 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zj27" podUID="66e2806a-82a6-444a-a19a-13641a426bed"
Apr 22 18:47:05.979933 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:05.979915 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6cpjm"]
Apr 22 18:47:05.980038 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:05.980019 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm"
Apr 22 18:47:05.980101 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:05.980081 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cpjm" podUID="85d15dde-b7d7-4309-980e-49ce8977c219"
Apr 22 18:47:06.961873 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:06.961840 2566 generic.go:358] "Generic (PLEG): container finished" podID="ffe6e393-6faa-47fe-a818-c1ffc6dda35c" containerID="694595506b2f58052bd9f15ee101c8dc4a27b3deef9ee6925e3c7794fd034d9f" exitCode=0
Apr 22 18:47:06.962290 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:06.961878 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bt425" event={"ID":"ffe6e393-6faa-47fe-a818-c1ffc6dda35c","Type":"ContainerDied","Data":"694595506b2f58052bd9f15ee101c8dc4a27b3deef9ee6925e3c7794fd034d9f"}
Apr 22 18:47:07.825132 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:07.825098 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:47:07.825276 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:07.825232 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm"
Apr 22 18:47:07.825360 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:07.825234 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xtfvj" podUID="55779771-0d58-4918-9769-95bf4525dc2d"
Apr 22 18:47:07.825431 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:07.825359 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:47:07.825487 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:07.825343 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cpjm" podUID="85d15dde-b7d7-4309-980e-49ce8977c219"
Apr 22 18:47:07.825487 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:07.825459 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zj27" podUID="66e2806a-82a6-444a-a19a-13641a426bed"
Apr 22 18:47:09.825633 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:09.825455 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xtfvj"
Apr 22 18:47:09.826184 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:09.825459 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm"
Apr 22 18:47:09.826184 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:09.825715 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xtfvj" podUID="55779771-0d58-4918-9769-95bf4525dc2d"
Apr 22 18:47:09.826184 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:09.825800 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cpjm" podUID="85d15dde-b7d7-4309-980e-49ce8977c219"
Apr 22 18:47:09.826184 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:09.825519 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:47:09.826184 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:09.825897 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zj27" podUID="66e2806a-82a6-444a-a19a-13641a426bed"
Apr 22 18:47:10.553190 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.553160 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-234.ec2.internal" event="NodeReady"
Apr 22 18:47:10.553330 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.553308 2566 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 18:47:10.587142 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.587117 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-77bfc79668-44qjh"]
Apr 22 18:47:10.605328 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.605294 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-77bfc79668-44qjh"]
Apr 22 18:47:10.605328 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.605324 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6czt4"]
Apr 22 18:47:10.605532 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.605467 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.607756 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.607734 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 18:47:10.607868 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.607779 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 18:47:10.607868 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.607732 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-c9m64\""
Apr 22 18:47:10.608024 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.608007 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 18:47:10.617472 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.617368 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zkddm"]
Apr 22 18:47:10.618404 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.618382 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6czt4"
Apr 22 18:47:10.618523 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.618495 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 18:47:10.620583 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.620562 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 18:47:10.620694 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.620604 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zg5rc\""
Apr 22 18:47:10.620801 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.620784 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 18:47:10.644739 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.644717 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6czt4"]
Apr 22 18:47:10.644853 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.644745 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zkddm"]
Apr 22 18:47:10.644853 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.644842 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zkddm"
Apr 22 18:47:10.646858 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.646838 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rxjsb\""
Apr 22 18:47:10.647410 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.647327 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 18:47:10.647537 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.647512 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 18:47:10.647623 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.647589 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 18:47:10.727075 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.727048 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-certificates\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.727210 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.727088 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-tmp-dir\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4"
Apr 22 18:47:10.727210 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.727146 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-ca-trust-extracted\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.727315 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.727215 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-image-registry-private-configuration\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.727315 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.727244 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4glwh\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-kube-api-access-4glwh\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.727315 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.727270 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-config-volume\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4"
Apr 22 18:47:10.727315 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.727296 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4"
Apr 22 18:47:10.727495 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.727327 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.727495 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.727353 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-installation-pull-secrets\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.727495 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.727385 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jg2g\" (UniqueName: \"kubernetes.io/projected/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-kube-api-access-9jg2g\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4"
Apr 22 18:47:10.727495 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.727418 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-bound-sa-token\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.727495 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.727441 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert\") pod \"ingress-canary-zkddm\" (UID: \"9a00536f-bef4-4ca5-bc76-1956a1edb1d1\") " pod="openshift-ingress-canary/ingress-canary-zkddm"
Apr 22 18:47:10.727495 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.727464 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldptj\" (UniqueName: \"kubernetes.io/projected/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-kube-api-access-ldptj\") pod \"ingress-canary-zkddm\" (UID: \"9a00536f-bef4-4ca5-bc76-1956a1edb1d1\") " pod="openshift-ingress-canary/ingress-canary-zkddm"
Apr 22 18:47:10.727706 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.727513 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-trusted-ca\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.827810 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.827779 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-certificates\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.827810 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.827814 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-tmp-dir\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4"
Apr 22 18:47:10.828370 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.828094 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-ca-trust-extracted\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.828370 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.828165 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-image-registry-private-configuration\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.828370 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.828175 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-tmp-dir\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4"
Apr 22 18:47:10.828370 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.828192 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4glwh\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-kube-api-access-4glwh\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.828370 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.828215 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-config-volume\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4"
Apr 22 18:47:10.828370 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.828234 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4"
Apr 22 18:47:10.828370 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.828259 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.828370 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.828283 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-installation-pull-secrets\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.828370 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.828314 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jg2g\" (UniqueName: \"kubernetes.io/projected/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-kube-api-access-9jg2g\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4"
Apr 22 18:47:10.828370 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.828359 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-bound-sa-token\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.828834 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.828383 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert\") pod \"ingress-canary-zkddm\" (UID: \"9a00536f-bef4-4ca5-bc76-1956a1edb1d1\") " pod="openshift-ingress-canary/ingress-canary-zkddm"
Apr 22 18:47:10.828834 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.828409 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldptj\" (UniqueName: \"kubernetes.io/projected/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-kube-api-access-ldptj\") pod \"ingress-canary-zkddm\" (UID: \"9a00536f-bef4-4ca5-bc76-1956a1edb1d1\") " pod="openshift-ingress-canary/ingress-canary-zkddm"
Apr 22 18:47:10.828834 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:10.828420 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:47:10.828834 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.828439 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-trusted-ca\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.828834 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:10.828442 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77bfc79668-44qjh: secret "image-registry-tls" not found
Apr 22 18:47:10.828834 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:10.828501 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:10.828834 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.828526 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-ca-trust-extracted\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.828834 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:10.828542 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls podName:5ceaf6dd-7ffb-4393-9076-3f02c3b84983 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:11.328521556 +0000 UTC m=+33.107355027 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls") pod "image-registry-77bfc79668-44qjh" (UID: "5ceaf6dd-7ffb-4393-9076-3f02c3b84983") : secret "image-registry-tls" not found
Apr 22 18:47:10.828834 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:10.828561 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls podName:8d6d09bf-c484-4e3a-8f6f-43ddb670c369 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:11.328550628 +0000 UTC m=+33.107384094 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls") pod "dns-default-6czt4" (UID: "8d6d09bf-c484-4e3a-8f6f-43ddb670c369") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:10.828834 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:10.828638 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:10.828834 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:10.828673 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert podName:9a00536f-bef4-4ca5-bc76-1956a1edb1d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:11.328662494 +0000 UTC m=+33.107495962 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert") pod "ingress-canary-zkddm" (UID: "9a00536f-bef4-4ca5-bc76-1956a1edb1d1") : secret "canary-serving-cert" not found
Apr 22 18:47:10.828834 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.828728 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-config-volume\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4"
Apr 22 18:47:10.832881 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.832860 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-image-registry-private-configuration\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.833011 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.832868 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-installation-pull-secrets\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.837396 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.837259 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jg2g\" (UniqueName: \"kubernetes.io/projected/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-kube-api-access-9jg2g\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4"
Apr 22 18:47:10.837523 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.837500 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-bound-sa-token\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.837576 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.837557 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4glwh\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-kube-api-access-4glwh\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.838049 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.838016 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldptj\" (UniqueName: \"kubernetes.io/projected/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-kube-api-access-ldptj\") pod \"ingress-canary-zkddm\" (UID: \"9a00536f-bef4-4ca5-bc76-1956a1edb1d1\") " pod="openshift-ingress-canary/ingress-canary-zkddm"
Apr 22 18:47:10.840733 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.840712 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-certificates\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:10.841020 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:10.840999 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-trusted-ca\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:11.332427 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:11.332396 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:47:11.332539 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:11.332451 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert\") pod \"ingress-canary-zkddm\" (UID: \"9a00536f-bef4-4ca5-bc76-1956a1edb1d1\") " pod="openshift-ingress-canary/ingress-canary-zkddm"
Apr 22 18:47:11.332586 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:11.332542 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4"
Apr 22 18:47:11.332586 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:11.332551 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:47:11.332586 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:11.332569 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77bfc79668-44qjh: secret "image-registry-tls" not found
Apr 22 18:47:11.332717 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:11.332632 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:11.332717 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:11.332667 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls podName:5ceaf6dd-7ffb-4393-9076-3f02c3b84983 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:12.33264433 +0000 UTC m=+34.111477814 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls") pod "image-registry-77bfc79668-44qjh" (UID: "5ceaf6dd-7ffb-4393-9076-3f02c3b84983") : secret "image-registry-tls" not found
Apr 22 18:47:11.332717 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:11.332684 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls podName:8d6d09bf-c484-4e3a-8f6f-43ddb670c369 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:12.332676865 +0000 UTC m=+34.111510332 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls") pod "dns-default-6czt4" (UID: "8d6d09bf-c484-4e3a-8f6f-43ddb670c369") : secret "dns-default-metrics-tls" not found Apr 22 18:47:11.332842 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:11.332663 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:11.332842 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:11.332789 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert podName:9a00536f-bef4-4ca5-bc76-1956a1edb1d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:12.332767187 +0000 UTC m=+34.111600663 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert") pod "ingress-canary-zkddm" (UID: "9a00536f-bef4-4ca5-bc76-1956a1edb1d1") : secret "canary-serving-cert" not found Apr 22 18:47:11.824839 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:11.824802 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xtfvj" Apr 22 18:47:11.824839 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:11.824823 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm" Apr 22 18:47:11.825120 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:11.824849 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27" Apr 22 18:47:11.827135 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:11.827113 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:47:11.828227 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:11.828062 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:47:11.828227 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:11.828101 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-m94d4\"" Apr 22 18:47:11.828227 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:11.828147 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:47:11.828784 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:11.828455 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5ggfs\"" Apr 22 18:47:11.828784 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:11.828528 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:47:12.340816 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:12.340642 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4" Apr 22 18:47:12.341062 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:12.340832 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh" Apr 22 18:47:12.341062 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:12.340791 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:12.341062 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:12.340861 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert\") pod \"ingress-canary-zkddm\" (UID: \"9a00536f-bef4-4ca5-bc76-1956a1edb1d1\") " pod="openshift-ingress-canary/ingress-canary-zkddm" Apr 22 18:47:12.341363 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:12.341335 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:12.341476 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:12.341335 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls podName:8d6d09bf-c484-4e3a-8f6f-43ddb670c369 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:14.341307845 +0000 UTC m=+36.120141323 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls") pod "dns-default-6czt4" (UID: "8d6d09bf-c484-4e3a-8f6f-43ddb670c369") : secret "dns-default-metrics-tls" not found Apr 22 18:47:12.341476 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:12.341437 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:47:12.341476 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:12.341440 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert podName:9a00536f-bef4-4ca5-bc76-1956a1edb1d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:14.341427103 +0000 UTC m=+36.120260573 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert") pod "ingress-canary-zkddm" (UID: "9a00536f-bef4-4ca5-bc76-1956a1edb1d1") : secret "canary-serving-cert" not found Apr 22 18:47:12.341476 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:12.341456 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77bfc79668-44qjh: secret "image-registry-tls" not found Apr 22 18:47:12.341688 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:12.341500 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls podName:5ceaf6dd-7ffb-4393-9076-3f02c3b84983 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:14.341481085 +0000 UTC m=+36.120314560 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls") pod "image-registry-77bfc79668-44qjh" (UID: "5ceaf6dd-7ffb-4393-9076-3f02c3b84983") : secret "image-registry-tls" not found Apr 22 18:47:12.441697 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:12.441665 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs\") pod \"network-metrics-daemon-7zj27\" (UID: \"66e2806a-82a6-444a-a19a-13641a426bed\") " pod="openshift-multus/network-metrics-daemon-7zj27" Apr 22 18:47:12.441876 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:12.441820 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:47:12.441961 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:12.441907 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs podName:66e2806a-82a6-444a-a19a-13641a426bed nodeName:}" failed. No retries permitted until 2026-04-22 18:47:44.441885918 +0000 UTC m=+66.220719388 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs") pod "network-metrics-daemon-7zj27" (UID: "66e2806a-82a6-444a-a19a-13641a426bed") : secret "metrics-daemon-secret" not found Apr 22 18:47:12.643058 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:12.643025 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ts99x\" (UniqueName: \"kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x\") pod \"network-check-target-6cpjm\" (UID: \"85d15dde-b7d7-4309-980e-49ce8977c219\") " pod="openshift-network-diagnostics/network-check-target-6cpjm" Apr 22 18:47:12.645912 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:12.645889 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts99x\" (UniqueName: \"kubernetes.io/projected/85d15dde-b7d7-4309-980e-49ce8977c219-kube-api-access-ts99x\") pod \"network-check-target-6cpjm\" (UID: \"85d15dde-b7d7-4309-980e-49ce8977c219\") " pod="openshift-network-diagnostics/network-check-target-6cpjm" Apr 22 18:47:12.742659 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:12.742627 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cpjm" Apr 22 18:47:12.898464 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:12.898438 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6cpjm"] Apr 22 18:47:12.901969 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:47:12.901925 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85d15dde_b7d7_4309_980e_49ce8977c219.slice/crio-3ee40063715bd148eaf564147de462c61efac1b0db50349e19231b407d01f0fc WatchSource:0}: Error finding container 3ee40063715bd148eaf564147de462c61efac1b0db50349e19231b407d01f0fc: Status 404 returned error can't find the container with id 3ee40063715bd148eaf564147de462c61efac1b0db50349e19231b407d01f0fc Apr 22 18:47:12.975583 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:12.975551 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bt425" event={"ID":"ffe6e393-6faa-47fe-a818-c1ffc6dda35c","Type":"ContainerStarted","Data":"33bb0179b75bb7fbd646c35956d31630ec9e8a69ba5f07c6df517602725a8020"} Apr 22 18:47:12.976624 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:12.976596 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6cpjm" event={"ID":"85d15dde-b7d7-4309-980e-49ce8977c219","Type":"ContainerStarted","Data":"3ee40063715bd148eaf564147de462c61efac1b0db50349e19231b407d01f0fc"} Apr 22 18:47:13.981826 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:13.981744 2566 generic.go:358] "Generic (PLEG): container finished" podID="ffe6e393-6faa-47fe-a818-c1ffc6dda35c" containerID="33bb0179b75bb7fbd646c35956d31630ec9e8a69ba5f07c6df517602725a8020" exitCode=0 Apr 22 18:47:13.981826 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:13.981795 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-bt425" event={"ID":"ffe6e393-6faa-47fe-a818-c1ffc6dda35c","Type":"ContainerDied","Data":"33bb0179b75bb7fbd646c35956d31630ec9e8a69ba5f07c6df517602725a8020"} Apr 22 18:47:14.356045 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:14.355965 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4" Apr 22 18:47:14.356045 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:14.356012 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh" Apr 22 18:47:14.356261 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:14.356136 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:14.356261 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:14.356162 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert\") pod \"ingress-canary-zkddm\" (UID: \"9a00536f-bef4-4ca5-bc76-1956a1edb1d1\") " pod="openshift-ingress-canary/ingress-canary-zkddm" Apr 22 18:47:14.356261 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:14.356168 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:47:14.356261 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:14.356224 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found Apr 22 18:47:14.356261 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:14.356234 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77bfc79668-44qjh: secret "image-registry-tls" not found Apr 22 18:47:14.356441 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:14.356215 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls podName:8d6d09bf-c484-4e3a-8f6f-43ddb670c369 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:18.356192649 +0000 UTC m=+40.135026133 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls") pod "dns-default-6czt4" (UID: "8d6d09bf-c484-4e3a-8f6f-43ddb670c369") : secret "dns-default-metrics-tls" not found Apr 22 18:47:14.356441 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:14.356309 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls podName:5ceaf6dd-7ffb-4393-9076-3f02c3b84983 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:18.356293058 +0000 UTC m=+40.135126527 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls") pod "image-registry-77bfc79668-44qjh" (UID: "5ceaf6dd-7ffb-4393-9076-3f02c3b84983") : secret "image-registry-tls" not found Apr 22 18:47:14.356441 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:14.356322 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert podName:9a00536f-bef4-4ca5-bc76-1956a1edb1d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:18.356316114 +0000 UTC m=+40.135149577 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert") pod "ingress-canary-zkddm" (UID: "9a00536f-bef4-4ca5-bc76-1956a1edb1d1") : secret "canary-serving-cert" not found Apr 22 18:47:14.986391 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:14.986361 2566 generic.go:358] "Generic (PLEG): container finished" podID="ffe6e393-6faa-47fe-a818-c1ffc6dda35c" containerID="49eb1888b066c2f3b5d7433988511dd857d1efda293e0883910e8d7e7c21b010" exitCode=0 Apr 22 18:47:14.986738 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:14.986418 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bt425" event={"ID":"ffe6e393-6faa-47fe-a818-c1ffc6dda35c","Type":"ContainerDied","Data":"49eb1888b066c2f3b5d7433988511dd857d1efda293e0883910e8d7e7c21b010"} Apr 22 18:47:16.992674 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:16.992515 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bt425" event={"ID":"ffe6e393-6faa-47fe-a818-c1ffc6dda35c","Type":"ContainerStarted","Data":"94ab1ee9a80a0a65e64d19cd19537402b6661085ce6500d153b206fefd07edbc"} Apr 22 18:47:16.993953 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:16.993918 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6cpjm" event={"ID":"85d15dde-b7d7-4309-980e-49ce8977c219","Type":"ContainerStarted","Data":"c96d9d98bcf45c48ef5f72d67ec6e16cd24ec3a91a70de7cbf0352b02ce8bd58"} Apr 22 18:47:16.994108 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:16.994046 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-6cpjm" Apr 22 18:47:17.012655 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:17.012612 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bt425" 
podStartSLOduration=7.635806923 podStartE2EDuration="39.012600607s" podCreationTimestamp="2026-04-22 18:46:38 +0000 UTC" firstStartedPulling="2026-04-22 18:46:41.399533142 +0000 UTC m=+3.178366605" lastFinishedPulling="2026-04-22 18:47:12.77632681 +0000 UTC m=+34.555160289" observedRunningTime="2026-04-22 18:47:17.011279028 +0000 UTC m=+38.790112514" watchObservedRunningTime="2026-04-22 18:47:17.012600607 +0000 UTC m=+38.791434089" Apr 22 18:47:17.025715 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:17.025675 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-6cpjm" podStartSLOduration=34.794133119 podStartE2EDuration="38.025664715s" podCreationTimestamp="2026-04-22 18:46:39 +0000 UTC" firstStartedPulling="2026-04-22 18:47:12.903878687 +0000 UTC m=+34.682712166" lastFinishedPulling="2026-04-22 18:47:16.135410298 +0000 UTC m=+37.914243762" observedRunningTime="2026-04-22 18:47:17.025170793 +0000 UTC m=+38.804004281" watchObservedRunningTime="2026-04-22 18:47:17.025664715 +0000 UTC m=+38.804498201" Apr 22 18:47:17.381731 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:17.381699 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret\") pod \"global-pull-secret-syncer-xtfvj\" (UID: \"55779771-0d58-4918-9769-95bf4525dc2d\") " pod="kube-system/global-pull-secret-syncer-xtfvj" Apr 22 18:47:17.385394 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:17.385363 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55779771-0d58-4918-9769-95bf4525dc2d-original-pull-secret\") pod \"global-pull-secret-syncer-xtfvj\" (UID: \"55779771-0d58-4918-9769-95bf4525dc2d\") " pod="kube-system/global-pull-secret-syncer-xtfvj" Apr 22 18:47:17.536363 ip-10-0-130-234 kubenswrapper[2566]: 
I0422 18:47:17.536323 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xtfvj" Apr 22 18:47:17.645506 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:17.645441 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xtfvj"] Apr 22 18:47:17.648646 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:47:17.648614 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55779771_0d58_4918_9769_95bf4525dc2d.slice/crio-bb7cf38fa46ef0e575229654366600347c92afc0463346d0d4a2db522f83da81 WatchSource:0}: Error finding container bb7cf38fa46ef0e575229654366600347c92afc0463346d0d4a2db522f83da81: Status 404 returned error can't find the container with id bb7cf38fa46ef0e575229654366600347c92afc0463346d0d4a2db522f83da81 Apr 22 18:47:17.997334 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:17.997246 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xtfvj" event={"ID":"55779771-0d58-4918-9769-95bf4525dc2d","Type":"ContainerStarted","Data":"bb7cf38fa46ef0e575229654366600347c92afc0463346d0d4a2db522f83da81"} Apr 22 18:47:18.387996 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:18.387950 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4" Apr 22 18:47:18.388168 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:18.388103 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:18.388168 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:18.388118 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh" Apr 22 18:47:18.388280 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:18.388180 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls podName:8d6d09bf-c484-4e3a-8f6f-43ddb670c369 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:26.388154447 +0000 UTC m=+48.166987914 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls") pod "dns-default-6czt4" (UID: "8d6d09bf-c484-4e3a-8f6f-43ddb670c369") : secret "dns-default-metrics-tls" not found Apr 22 18:47:18.388280 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:18.388206 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert\") pod \"ingress-canary-zkddm\" (UID: \"9a00536f-bef4-4ca5-bc76-1956a1edb1d1\") " pod="openshift-ingress-canary/ingress-canary-zkddm" Apr 22 18:47:18.388280 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:18.388230 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:47:18.388280 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:18.388244 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77bfc79668-44qjh: secret "image-registry-tls" not found Apr 22 18:47:18.388480 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:18.388293 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls 
podName:5ceaf6dd-7ffb-4393-9076-3f02c3b84983 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:26.388274978 +0000 UTC m=+48.167108461 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls") pod "image-registry-77bfc79668-44qjh" (UID: "5ceaf6dd-7ffb-4393-9076-3f02c3b84983") : secret "image-registry-tls" not found Apr 22 18:47:18.388480 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:18.388327 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:18.388480 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:18.388368 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert podName:9a00536f-bef4-4ca5-bc76-1956a1edb1d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:26.388356501 +0000 UTC m=+48.167189971 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert") pod "ingress-canary-zkddm" (UID: "9a00536f-bef4-4ca5-bc76-1956a1edb1d1") : secret "canary-serving-cert" not found Apr 22 18:47:24.011672 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:24.011636 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xtfvj" event={"ID":"55779771-0d58-4918-9769-95bf4525dc2d","Type":"ContainerStarted","Data":"81402958ba6c0fc6eb79571089977d307c8b0700b1f6c66c96661342edd9876c"} Apr 22 18:47:24.024514 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:24.024470 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xtfvj" podStartSLOduration=33.813423832 podStartE2EDuration="39.024454291s" podCreationTimestamp="2026-04-22 18:46:45 +0000 UTC" firstStartedPulling="2026-04-22 18:47:17.650375951 +0000 UTC m=+39.429209416" lastFinishedPulling="2026-04-22 18:47:22.861406412 +0000 UTC m=+44.640239875" observedRunningTime="2026-04-22 18:47:24.024068245 +0000 UTC m=+45.802901731" watchObservedRunningTime="2026-04-22 18:47:24.024454291 +0000 UTC m=+45.803287778" Apr 22 18:47:26.447704 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:26.447667 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4" Apr 22 18:47:26.447704 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:26.447710 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " 
pod="openshift-image-registry/image-registry-77bfc79668-44qjh" Apr 22 18:47:26.448186 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:26.447739 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert\") pod \"ingress-canary-zkddm\" (UID: \"9a00536f-bef4-4ca5-bc76-1956a1edb1d1\") " pod="openshift-ingress-canary/ingress-canary-zkddm" Apr 22 18:47:26.448186 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:26.447838 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:47:26.448186 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:26.447848 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:26.448186 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:26.447862 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77bfc79668-44qjh: secret "image-registry-tls" not found Apr 22 18:47:26.448186 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:26.447905 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls podName:5ceaf6dd-7ffb-4393-9076-3f02c3b84983 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:42.447886342 +0000 UTC m=+64.226719807 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls") pod "image-registry-77bfc79668-44qjh" (UID: "5ceaf6dd-7ffb-4393-9076-3f02c3b84983") : secret "image-registry-tls" not found Apr 22 18:47:26.448186 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:26.447920 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert podName:9a00536f-bef4-4ca5-bc76-1956a1edb1d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:42.447912266 +0000 UTC m=+64.226745730 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert") pod "ingress-canary-zkddm" (UID: "9a00536f-bef4-4ca5-bc76-1956a1edb1d1") : secret "canary-serving-cert" not found Apr 22 18:47:26.448186 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:26.447838 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:26.448186 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:26.447998 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls podName:8d6d09bf-c484-4e3a-8f6f-43ddb670c369 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:42.44798025 +0000 UTC m=+64.226813713 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls") pod "dns-default-6czt4" (UID: "8d6d09bf-c484-4e3a-8f6f-43ddb670c369") : secret "dns-default-metrics-tls" not found Apr 22 18:47:37.974972 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:37.974932 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s8qfk" Apr 22 18:47:42.463774 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:42.463724 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4" Apr 22 18:47:42.463774 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:42.463776 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh" Apr 22 18:47:42.464262 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:42.463864 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:47:42.464262 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:42.463876 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77bfc79668-44qjh: secret "image-registry-tls" not found Apr 22 18:47:42.464262 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:42.463873 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:42.464262 ip-10-0-130-234 
kubenswrapper[2566]: E0422 18:47:42.463928 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls podName:5ceaf6dd-7ffb-4393-9076-3f02c3b84983 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:14.463915131 +0000 UTC m=+96.242748595 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls") pod "image-registry-77bfc79668-44qjh" (UID: "5ceaf6dd-7ffb-4393-9076-3f02c3b84983") : secret "image-registry-tls" not found Apr 22 18:47:42.464262 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:42.463966 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls podName:8d6d09bf-c484-4e3a-8f6f-43ddb670c369 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:14.463954806 +0000 UTC m=+96.242788315 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls") pod "dns-default-6czt4" (UID: "8d6d09bf-c484-4e3a-8f6f-43ddb670c369") : secret "dns-default-metrics-tls" not found Apr 22 18:47:42.464262 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:42.463984 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert\") pod \"ingress-canary-zkddm\" (UID: \"9a00536f-bef4-4ca5-bc76-1956a1edb1d1\") " pod="openshift-ingress-canary/ingress-canary-zkddm" Apr 22 18:47:42.464262 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:42.464068 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:42.464262 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:42.464100 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert podName:9a00536f-bef4-4ca5-bc76-1956a1edb1d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:14.464088903 +0000 UTC m=+96.242922368 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert") pod "ingress-canary-zkddm" (UID: "9a00536f-bef4-4ca5-bc76-1956a1edb1d1") : secret "canary-serving-cert" not found Apr 22 18:47:44.477378 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:44.477342 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs\") pod \"network-metrics-daemon-7zj27\" (UID: \"66e2806a-82a6-444a-a19a-13641a426bed\") " pod="openshift-multus/network-metrics-daemon-7zj27" Apr 22 18:47:44.477764 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:44.477490 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:47:44.477764 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:47:44.477564 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs podName:66e2806a-82a6-444a-a19a-13641a426bed nodeName:}" failed. No retries permitted until 2026-04-22 18:48:48.477547382 +0000 UTC m=+130.256380846 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs") pod "network-metrics-daemon-7zj27" (UID: "66e2806a-82a6-444a-a19a-13641a426bed") : secret "metrics-daemon-secret" not found Apr 22 18:47:47.999775 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:47:47.999747 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6cpjm" Apr 22 18:48:14.478340 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:14.478212 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert\") pod \"ingress-canary-zkddm\" (UID: \"9a00536f-bef4-4ca5-bc76-1956a1edb1d1\") " pod="openshift-ingress-canary/ingress-canary-zkddm" Apr 22 18:48:14.478340 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:14.478293 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4" Apr 22 18:48:14.478340 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:14.478322 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh" Apr 22 18:48:14.478898 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:14.478358 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:48:14.478898 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:14.478422 2566 secret.go:189] Couldn't 
get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:48:14.478898 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:14.478431 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:48:14.478898 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:14.478443 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert podName:9a00536f-bef4-4ca5-bc76-1956a1edb1d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:18.47842493 +0000 UTC m=+160.257258395 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert") pod "ingress-canary-zkddm" (UID: "9a00536f-bef4-4ca5-bc76-1956a1edb1d1") : secret "canary-serving-cert" not found Apr 22 18:48:14.478898 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:14.478446 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77bfc79668-44qjh: secret "image-registry-tls" not found Apr 22 18:48:14.478898 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:14.478478 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls podName:8d6d09bf-c484-4e3a-8f6f-43ddb670c369 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:18.478465687 +0000 UTC m=+160.257299156 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls") pod "dns-default-6czt4" (UID: "8d6d09bf-c484-4e3a-8f6f-43ddb670c369") : secret "dns-default-metrics-tls" not found Apr 22 18:48:14.478898 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:14.478522 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls podName:5ceaf6dd-7ffb-4393-9076-3f02c3b84983 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:18.478509442 +0000 UTC m=+160.257342906 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls") pod "image-registry-77bfc79668-44qjh" (UID: "5ceaf6dd-7ffb-4393-9076-3f02c3b84983") : secret "image-registry-tls" not found Apr 22 18:48:48.517993 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:48.517954 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs\") pod \"network-metrics-daemon-7zj27\" (UID: \"66e2806a-82a6-444a-a19a-13641a426bed\") " pod="openshift-multus/network-metrics-daemon-7zj27" Apr 22 18:48:48.518471 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:48.518100 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:48:48.518471 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:48.518171 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs podName:66e2806a-82a6-444a-a19a-13641a426bed nodeName:}" failed. No retries permitted until 2026-04-22 18:50:50.518153513 +0000 UTC m=+252.296986992 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs") pod "network-metrics-daemon-7zj27" (UID: "66e2806a-82a6-444a-a19a-13641a426bed") : secret "metrics-daemon-secret" not found Apr 22 18:48:52.252697 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.252660 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t"] Apr 22 18:48:52.255426 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.255410 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t" Apr 22 18:48:52.258405 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.258381 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 18:48:52.258501 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.258381 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-hllh5\"" Apr 22 18:48:52.259048 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.259022 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:48:52.259144 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.259056 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 18:48:52.263155 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.263133 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t"] Apr 22 18:48:52.346178 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.346151 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8l99\" (UniqueName: \"kubernetes.io/projected/bc71e7c0-1858-4f34-ba30-e0864e6770bf-kube-api-access-w8l99\") pod \"cluster-samples-operator-6dc5bdb6b4-8jk7t\" (UID: \"bc71e7c0-1858-4f34-ba30-e0864e6770bf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t" Apr 22 18:48:52.346328 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.346187 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8jk7t\" (UID: \"bc71e7c0-1858-4f34-ba30-e0864e6770bf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t" Apr 22 18:48:52.366649 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.366620 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb"] Apr 22 18:48:52.369158 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.369140 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb" Apr 22 18:48:52.369667 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.369646 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4jzwx"] Apr 22 18:48:52.372152 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.372134 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:48:52.374120 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.374099 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 18:48:52.374120 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.374116 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:48:52.374609 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.374593 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 18:48:52.375101 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.375087 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 18:48:52.378212 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.378195 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 18:48:52.378305 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.378262 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-mvpwt\"" Apr 22 18:48:52.381140 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.381120 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 18:48:52.381361 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.381346 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:48:52.389103 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.389084 2566 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-5qx2r\"" Apr 22 18:48:52.389697 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.389685 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 18:48:52.405557 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.405540 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb"] Apr 22 18:48:52.406676 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.406658 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4jzwx"] Apr 22 18:48:52.418018 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.417998 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 18:48:52.446904 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.446884 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8l99\" (UniqueName: \"kubernetes.io/projected/bc71e7c0-1858-4f34-ba30-e0864e6770bf-kube-api-access-w8l99\") pod \"cluster-samples-operator-6dc5bdb6b4-8jk7t\" (UID: \"bc71e7c0-1858-4f34-ba30-e0864e6770bf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t" Apr 22 18:48:52.447021 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.446913 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8jk7t\" (UID: \"bc71e7c0-1858-4f34-ba30-e0864e6770bf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t" Apr 22 18:48:52.447064 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:52.447058 2566 secret.go:189] Couldn't 
get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:48:52.447119 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:52.447110 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls podName:bc71e7c0-1858-4f34-ba30-e0864e6770bf nodeName:}" failed. No retries permitted until 2026-04-22 18:48:52.947097904 +0000 UTC m=+134.725931368 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8jk7t" (UID: "bc71e7c0-1858-4f34-ba30-e0864e6770bf") : secret "samples-operator-tls" not found Apr 22 18:48:52.458130 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.458102 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8l99\" (UniqueName: \"kubernetes.io/projected/bc71e7c0-1858-4f34-ba30-e0864e6770bf-kube-api-access-w8l99\") pod \"cluster-samples-operator-6dc5bdb6b4-8jk7t\" (UID: \"bc71e7c0-1858-4f34-ba30-e0864e6770bf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t" Apr 22 18:48:52.464064 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.464042 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-v9ml5"] Apr 22 18:48:52.467016 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.466995 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v9ml5" Apr 22 18:48:52.467375 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.467354 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-zvgcc"] Apr 22 18:48:52.469092 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.469075 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-98dm9\"" Apr 22 18:48:52.469842 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.469827 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr"] Apr 22 18:48:52.470023 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.470004 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.472171 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.472151 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 18:48:52.472171 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.472169 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-qk8mh\"" Apr 22 18:48:52.472332 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.472182 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:48:52.472389 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.472342 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr" Apr 22 18:48:52.472541 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.472516 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:48:52.472832 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.472816 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 18:48:52.475447 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.475423 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 18:48:52.475447 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.475436 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:48:52.475627 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.475461 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-z9mjt\"" Apr 22 18:48:52.475997 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.475977 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 18:48:52.476125 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.476102 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 18:48:52.476897 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.476878 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-v9ml5"] Apr 22 18:48:52.482003 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.481983 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-operator-585dfdc468-zvgcc"] Apr 22 18:48:52.482872 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.482851 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr"] Apr 22 18:48:52.484803 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.484781 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 18:48:52.548373 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.548300 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2758dd4a-0d4e-4566-b935-f784873d69e9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lxclb\" (UID: \"2758dd4a-0d4e-4566-b935-f784873d69e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb" Apr 22 18:48:52.548373 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.548349 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wwhp\" (UniqueName: \"kubernetes.io/projected/4871a747-c041-430f-9c1a-3ab2ce6d0156-kube-api-access-5wwhp\") pod \"console-operator-9d4b6777b-4jzwx\" (UID: \"4871a747-c041-430f-9c1a-3ab2ce6d0156\") " pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:48:52.548532 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.548403 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-869c7\" (UniqueName: \"kubernetes.io/projected/2758dd4a-0d4e-4566-b935-f784873d69e9-kube-api-access-869c7\") pod \"service-ca-operator-d6fc45fc5-lxclb\" (UID: \"2758dd4a-0d4e-4566-b935-f784873d69e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb" Apr 22 18:48:52.548532 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.548474 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4871a747-c041-430f-9c1a-3ab2ce6d0156-config\") pod \"console-operator-9d4b6777b-4jzwx\" (UID: \"4871a747-c041-430f-9c1a-3ab2ce6d0156\") " pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:48:52.548532 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.548494 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4871a747-c041-430f-9c1a-3ab2ce6d0156-serving-cert\") pod \"console-operator-9d4b6777b-4jzwx\" (UID: \"4871a747-c041-430f-9c1a-3ab2ce6d0156\") " pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:48:52.548532 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.548512 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2758dd4a-0d4e-4566-b935-f784873d69e9-config\") pod \"service-ca-operator-d6fc45fc5-lxclb\" (UID: \"2758dd4a-0d4e-4566-b935-f784873d69e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb" Apr 22 18:48:52.548532 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.548527 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4871a747-c041-430f-9c1a-3ab2ce6d0156-trusted-ca\") pod \"console-operator-9d4b6777b-4jzwx\" (UID: \"4871a747-c041-430f-9c1a-3ab2ce6d0156\") " pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:48:52.649142 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.649101 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nbjm\" (UniqueName: \"kubernetes.io/projected/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-kube-api-access-4nbjm\") pod 
\"cluster-monitoring-operator-75587bd455-djlpr\" (UID: \"574f89c9-2eec-4ad6-a1ef-c9b8109907a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr" Apr 22 18:48:52.649231 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.649166 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4871a747-c041-430f-9c1a-3ab2ce6d0156-config\") pod \"console-operator-9d4b6777b-4jzwx\" (UID: \"4871a747-c041-430f-9c1a-3ab2ce6d0156\") " pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:48:52.649231 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.649207 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81a18479-876e-4704-9f10-9b3d459cce34-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.649292 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.649227 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swmc7\" (UniqueName: \"kubernetes.io/projected/4a6e19f9-6606-4820-8fc6-604159ab398c-kube-api-access-swmc7\") pod \"network-check-source-8894fc9bd-v9ml5\" (UID: \"4a6e19f9-6606-4820-8fc6-604159ab398c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v9ml5" Apr 22 18:48:52.649292 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.649250 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/81a18479-876e-4704-9f10-9b3d459cce34-tmp\") pod \"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.649292 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:48:52.649270 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lp96\" (UniqueName: \"kubernetes.io/projected/81a18479-876e-4704-9f10-9b3d459cce34-kube-api-access-8lp96\") pod \"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.649425 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.649319 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-djlpr\" (UID: \"574f89c9-2eec-4ad6-a1ef-c9b8109907a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr" Apr 22 18:48:52.649425 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.649339 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/81a18479-876e-4704-9f10-9b3d459cce34-snapshots\") pod \"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.649425 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.649358 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-869c7\" (UniqueName: \"kubernetes.io/projected/2758dd4a-0d4e-4566-b935-f784873d69e9-kube-api-access-869c7\") pod \"service-ca-operator-d6fc45fc5-lxclb\" (UID: \"2758dd4a-0d4e-4566-b935-f784873d69e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb" Apr 22 18:48:52.649425 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.649390 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/4871a747-c041-430f-9c1a-3ab2ce6d0156-serving-cert\") pod \"console-operator-9d4b6777b-4jzwx\" (UID: \"4871a747-c041-430f-9c1a-3ab2ce6d0156\") " pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:48:52.649575 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.649422 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2758dd4a-0d4e-4566-b935-f784873d69e9-config\") pod \"service-ca-operator-d6fc45fc5-lxclb\" (UID: \"2758dd4a-0d4e-4566-b935-f784873d69e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb" Apr 22 18:48:52.649575 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.649447 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4871a747-c041-430f-9c1a-3ab2ce6d0156-trusted-ca\") pod \"console-operator-9d4b6777b-4jzwx\" (UID: \"4871a747-c041-430f-9c1a-3ab2ce6d0156\") " pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:48:52.649575 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.649470 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81a18479-876e-4704-9f10-9b3d459cce34-service-ca-bundle\") pod \"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.649575 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.649500 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2758dd4a-0d4e-4566-b935-f784873d69e9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lxclb\" (UID: \"2758dd4a-0d4e-4566-b935-f784873d69e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb" Apr 22 
18:48:52.649575 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.649525 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-djlpr\" (UID: \"574f89c9-2eec-4ad6-a1ef-c9b8109907a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr" Apr 22 18:48:52.649820 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.649602 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81a18479-876e-4704-9f10-9b3d459cce34-serving-cert\") pod \"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.649820 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.649648 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5wwhp\" (UniqueName: \"kubernetes.io/projected/4871a747-c041-430f-9c1a-3ab2ce6d0156-kube-api-access-5wwhp\") pod \"console-operator-9d4b6777b-4jzwx\" (UID: \"4871a747-c041-430f-9c1a-3ab2ce6d0156\") " pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:48:52.649968 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.649924 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4871a747-c041-430f-9c1a-3ab2ce6d0156-config\") pod \"console-operator-9d4b6777b-4jzwx\" (UID: \"4871a747-c041-430f-9c1a-3ab2ce6d0156\") " pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:48:52.650090 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.650070 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2758dd4a-0d4e-4566-b935-f784873d69e9-config\") pod \"service-ca-operator-d6fc45fc5-lxclb\" (UID: \"2758dd4a-0d4e-4566-b935-f784873d69e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb" Apr 22 18:48:52.650349 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.650328 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4871a747-c041-430f-9c1a-3ab2ce6d0156-trusted-ca\") pod \"console-operator-9d4b6777b-4jzwx\" (UID: \"4871a747-c041-430f-9c1a-3ab2ce6d0156\") " pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:48:52.651730 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.651712 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2758dd4a-0d4e-4566-b935-f784873d69e9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lxclb\" (UID: \"2758dd4a-0d4e-4566-b935-f784873d69e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb" Apr 22 18:48:52.651811 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.651758 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4871a747-c041-430f-9c1a-3ab2ce6d0156-serving-cert\") pod \"console-operator-9d4b6777b-4jzwx\" (UID: \"4871a747-c041-430f-9c1a-3ab2ce6d0156\") " pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:48:52.656787 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.656765 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-869c7\" (UniqueName: \"kubernetes.io/projected/2758dd4a-0d4e-4566-b935-f784873d69e9-kube-api-access-869c7\") pod \"service-ca-operator-d6fc45fc5-lxclb\" (UID: \"2758dd4a-0d4e-4566-b935-f784873d69e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb" Apr 22 18:48:52.657131 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:48:52.657111 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wwhp\" (UniqueName: \"kubernetes.io/projected/4871a747-c041-430f-9c1a-3ab2ce6d0156-kube-api-access-5wwhp\") pod \"console-operator-9d4b6777b-4jzwx\" (UID: \"4871a747-c041-430f-9c1a-3ab2ce6d0156\") " pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:48:52.679000 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.678982 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb" Apr 22 18:48:52.683603 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.683582 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:48:52.751024 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.750995 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nbjm\" (UniqueName: \"kubernetes.io/projected/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-kube-api-access-4nbjm\") pod \"cluster-monitoring-operator-75587bd455-djlpr\" (UID: \"574f89c9-2eec-4ad6-a1ef-c9b8109907a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr" Apr 22 18:48:52.751151 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.751050 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81a18479-876e-4704-9f10-9b3d459cce34-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.751151 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.751069 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swmc7\" (UniqueName: 
\"kubernetes.io/projected/4a6e19f9-6606-4820-8fc6-604159ab398c-kube-api-access-swmc7\") pod \"network-check-source-8894fc9bd-v9ml5\" (UID: \"4a6e19f9-6606-4820-8fc6-604159ab398c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v9ml5" Apr 22 18:48:52.751151 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.751085 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/81a18479-876e-4704-9f10-9b3d459cce34-tmp\") pod \"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.751151 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.751100 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8lp96\" (UniqueName: \"kubernetes.io/projected/81a18479-876e-4704-9f10-9b3d459cce34-kube-api-access-8lp96\") pod \"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.751151 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.751121 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-djlpr\" (UID: \"574f89c9-2eec-4ad6-a1ef-c9b8109907a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr" Apr 22 18:48:52.751151 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.751139 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/81a18479-876e-4704-9f10-9b3d459cce34-snapshots\") pod \"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " 
pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.751332 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.751168 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81a18479-876e-4704-9f10-9b3d459cce34-service-ca-bundle\") pod \"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.751332 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.751191 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-djlpr\" (UID: \"574f89c9-2eec-4ad6-a1ef-c9b8109907a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr" Apr 22 18:48:52.751332 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.751221 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81a18479-876e-4704-9f10-9b3d459cce34-serving-cert\") pod \"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.751806 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:52.751715 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:52.751806 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:52.751792 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls podName:574f89c9-2eec-4ad6-a1ef-c9b8109907a5 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:48:53.251771034 +0000 UTC m=+135.030604508 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-djlpr" (UID: "574f89c9-2eec-4ad6-a1ef-c9b8109907a5") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:52.752337 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.752277 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/81a18479-876e-4704-9f10-9b3d459cce34-tmp\") pod \"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.752337 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.752283 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/81a18479-876e-4704-9f10-9b3d459cce34-snapshots\") pod \"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.752599 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.752344 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-djlpr\" (UID: \"574f89c9-2eec-4ad6-a1ef-c9b8109907a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr" Apr 22 18:48:52.752983 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.752911 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81a18479-876e-4704-9f10-9b3d459cce34-trusted-ca-bundle\") pod 
\"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.753107 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.753082 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81a18479-876e-4704-9f10-9b3d459cce34-service-ca-bundle\") pod \"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.754068 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.754027 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81a18479-876e-4704-9f10-9b3d459cce34-serving-cert\") pod \"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.762159 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.762133 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nbjm\" (UniqueName: \"kubernetes.io/projected/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-kube-api-access-4nbjm\") pod \"cluster-monitoring-operator-75587bd455-djlpr\" (UID: \"574f89c9-2eec-4ad6-a1ef-c9b8109907a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr" Apr 22 18:48:52.762765 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.762730 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swmc7\" (UniqueName: \"kubernetes.io/projected/4a6e19f9-6606-4820-8fc6-604159ab398c-kube-api-access-swmc7\") pod \"network-check-source-8894fc9bd-v9ml5\" (UID: \"4a6e19f9-6606-4820-8fc6-604159ab398c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v9ml5" Apr 22 18:48:52.763084 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.763067 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lp96\" (UniqueName: \"kubernetes.io/projected/81a18479-876e-4704-9f10-9b3d459cce34-kube-api-access-8lp96\") pod \"insights-operator-585dfdc468-zvgcc\" (UID: \"81a18479-876e-4704-9f10-9b3d459cce34\") " pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.777202 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.777173 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v9ml5" Apr 22 18:48:52.789885 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.789862 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-zvgcc" Apr 22 18:48:52.802376 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.802315 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb"] Apr 22 18:48:52.807385 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:48:52.807359 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2758dd4a_0d4e_4566_b935_f784873d69e9.slice/crio-50040319241afd3ed017b6bf668a830693bfcf17485bdfb85fd98dec0869a7fe WatchSource:0}: Error finding container 50040319241afd3ed017b6bf668a830693bfcf17485bdfb85fd98dec0869a7fe: Status 404 returned error can't find the container with id 50040319241afd3ed017b6bf668a830693bfcf17485bdfb85fd98dec0869a7fe Apr 22 18:48:52.818328 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.818196 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4jzwx"] Apr 22 18:48:52.821104 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:48:52.821076 2566 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4871a747_c041_430f_9c1a_3ab2ce6d0156.slice/crio-62d3d719b57459ea331d31ed46fcac6eefef695077b1546ae4a96de986941f75 WatchSource:0}: Error finding container 62d3d719b57459ea331d31ed46fcac6eefef695077b1546ae4a96de986941f75: Status 404 returned error can't find the container with id 62d3d719b57459ea331d31ed46fcac6eefef695077b1546ae4a96de986941f75 Apr 22 18:48:52.898797 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.898761 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-v9ml5"] Apr 22 18:48:52.901472 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:48:52.901442 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a6e19f9_6606_4820_8fc6_604159ab398c.slice/crio-4f706b9facb28210f7847ca99fa43f07e0d3fcc516114760eee57a57bf1aff11 WatchSource:0}: Error finding container 4f706b9facb28210f7847ca99fa43f07e0d3fcc516114760eee57a57bf1aff11: Status 404 returned error can't find the container with id 4f706b9facb28210f7847ca99fa43f07e0d3fcc516114760eee57a57bf1aff11 Apr 22 18:48:52.913759 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:52.913736 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-zvgcc"] Apr 22 18:48:52.916077 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:48:52.916054 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81a18479_876e_4704_9f10_9b3d459cce34.slice/crio-0da85b4a4d76953f616fa8b737fc0a1ca38081bc3fff60510896e139a647f887 WatchSource:0}: Error finding container 0da85b4a4d76953f616fa8b737fc0a1ca38081bc3fff60510896e139a647f887: Status 404 returned error can't find the container with id 0da85b4a4d76953f616fa8b737fc0a1ca38081bc3fff60510896e139a647f887 Apr 22 18:48:52.953393 ip-10-0-130-234 kubenswrapper[2566]: 
I0422 18:48:52.953373 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8jk7t\" (UID: \"bc71e7c0-1858-4f34-ba30-e0864e6770bf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t" Apr 22 18:48:52.953492 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:52.953485 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:48:52.953547 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:52.953528 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls podName:bc71e7c0-1858-4f34-ba30-e0864e6770bf nodeName:}" failed. No retries permitted until 2026-04-22 18:48:53.95351463 +0000 UTC m=+135.732348094 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8jk7t" (UID: "bc71e7c0-1858-4f34-ba30-e0864e6770bf") : secret "samples-operator-tls" not found Apr 22 18:48:53.173445 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:53.173404 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zvgcc" event={"ID":"81a18479-876e-4704-9f10-9b3d459cce34","Type":"ContainerStarted","Data":"0da85b4a4d76953f616fa8b737fc0a1ca38081bc3fff60510896e139a647f887"} Apr 22 18:48:53.174277 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:53.174252 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" event={"ID":"4871a747-c041-430f-9c1a-3ab2ce6d0156","Type":"ContainerStarted","Data":"62d3d719b57459ea331d31ed46fcac6eefef695077b1546ae4a96de986941f75"} Apr 22 18:48:53.175155 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:53.175134 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb" event={"ID":"2758dd4a-0d4e-4566-b935-f784873d69e9","Type":"ContainerStarted","Data":"50040319241afd3ed017b6bf668a830693bfcf17485bdfb85fd98dec0869a7fe"} Apr 22 18:48:53.176336 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:53.176310 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v9ml5" event={"ID":"4a6e19f9-6606-4820-8fc6-604159ab398c","Type":"ContainerStarted","Data":"1457cf4f6f44eb9753c2baac57a6ec74e7dbdeb4561c67b04270dbf33ce3c654"} Apr 22 18:48:53.176432 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:53.176338 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v9ml5" 
event={"ID":"4a6e19f9-6606-4820-8fc6-604159ab398c","Type":"ContainerStarted","Data":"4f706b9facb28210f7847ca99fa43f07e0d3fcc516114760eee57a57bf1aff11"} Apr 22 18:48:53.193393 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:53.193348 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v9ml5" podStartSLOduration=1.193333683 podStartE2EDuration="1.193333683s" podCreationTimestamp="2026-04-22 18:48:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:48:53.19301749 +0000 UTC m=+134.971850976" watchObservedRunningTime="2026-04-22 18:48:53.193333683 +0000 UTC m=+134.972167170" Apr 22 18:48:53.256982 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:53.256952 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-djlpr\" (UID: \"574f89c9-2eec-4ad6-a1ef-c9b8109907a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr" Apr 22 18:48:53.257351 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:53.257096 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:53.257351 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:53.257153 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls podName:574f89c9-2eec-4ad6-a1ef-c9b8109907a5 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:54.257135951 +0000 UTC m=+136.035969425 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-djlpr" (UID: "574f89c9-2eec-4ad6-a1ef-c9b8109907a5") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:53.964330 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:53.964290 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8jk7t\" (UID: \"bc71e7c0-1858-4f34-ba30-e0864e6770bf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t" Apr 22 18:48:53.964631 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:53.964538 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:48:53.964631 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:53.964607 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls podName:bc71e7c0-1858-4f34-ba30-e0864e6770bf nodeName:}" failed. No retries permitted until 2026-04-22 18:48:55.964587546 +0000 UTC m=+137.743421015 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8jk7t" (UID: "bc71e7c0-1858-4f34-ba30-e0864e6770bf") : secret "samples-operator-tls" not found Apr 22 18:48:54.268044 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:54.267958 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-djlpr\" (UID: \"574f89c9-2eec-4ad6-a1ef-c9b8109907a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr" Apr 22 18:48:54.268473 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:54.268211 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:54.268473 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:54.268333 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls podName:574f89c9-2eec-4ad6-a1ef-c9b8109907a5 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:56.268310934 +0000 UTC m=+138.047144410 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-djlpr" (UID: "574f89c9-2eec-4ad6-a1ef-c9b8109907a5") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:55.981664 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:55.981635 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8jk7t\" (UID: \"bc71e7c0-1858-4f34-ba30-e0864e6770bf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t" Apr 22 18:48:55.982009 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:55.981795 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:48:55.982009 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:55.981868 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls podName:bc71e7c0-1858-4f34-ba30-e0864e6770bf nodeName:}" failed. No retries permitted until 2026-04-22 18:48:59.981846597 +0000 UTC m=+141.760680068 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8jk7t" (UID: "bc71e7c0-1858-4f34-ba30-e0864e6770bf") : secret "samples-operator-tls" not found Apr 22 18:48:56.131590 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.131564 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-fkllm"] Apr 22 18:48:56.134585 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.134566 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fkllm" Apr 22 18:48:56.137059 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.137034 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 18:48:56.137152 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.137041 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 18:48:56.137152 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.137099 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-tm9g8\"" Apr 22 18:48:56.141069 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.141035 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-fkllm"] Apr 22 18:48:56.183738 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.183713 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq44h\" (UniqueName: \"kubernetes.io/projected/59a77f39-9d49-4aba-99e0-0e3df93fec7e-kube-api-access-jq44h\") pod 
\"migrator-74bb7799d9-fkllm\" (UID: \"59a77f39-9d49-4aba-99e0-0e3df93fec7e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fkllm" Apr 22 18:48:56.185044 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.185018 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zvgcc" event={"ID":"81a18479-876e-4704-9f10-9b3d459cce34","Type":"ContainerStarted","Data":"8ef155b65a11c6ca9d43197b668ceba330e678c45e78f703328ca3aaddabf75f"} Apr 22 18:48:56.186481 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.186465 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4jzwx_4871a747-c041-430f-9c1a-3ab2ce6d0156/console-operator/0.log" Apr 22 18:48:56.186568 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.186497 2566 generic.go:358] "Generic (PLEG): container finished" podID="4871a747-c041-430f-9c1a-3ab2ce6d0156" containerID="cdabc6eba5f47a0b5d76bae7796f5ff6ed7f7efe0ce1b2a37c689719edf308c0" exitCode=255 Apr 22 18:48:56.186568 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.186558 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" event={"ID":"4871a747-c041-430f-9c1a-3ab2ce6d0156","Type":"ContainerDied","Data":"cdabc6eba5f47a0b5d76bae7796f5ff6ed7f7efe0ce1b2a37c689719edf308c0"} Apr 22 18:48:56.186763 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.186733 2566 scope.go:117] "RemoveContainer" containerID="cdabc6eba5f47a0b5d76bae7796f5ff6ed7f7efe0ce1b2a37c689719edf308c0" Apr 22 18:48:56.187864 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.187840 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb" event={"ID":"2758dd4a-0d4e-4566-b935-f784873d69e9","Type":"ContainerStarted","Data":"9330e8007aaac2a7b23aab6afb682cc5f1e3333d65a5fd8e731195b172a75e3c"} Apr 22 18:48:56.199485 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:48:56.199449 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-zvgcc" podStartSLOduration=1.294243353 podStartE2EDuration="4.199440063s" podCreationTimestamp="2026-04-22 18:48:52 +0000 UTC" firstStartedPulling="2026-04-22 18:48:52.919897826 +0000 UTC m=+134.698731291" lastFinishedPulling="2026-04-22 18:48:55.825094534 +0000 UTC m=+137.603928001" observedRunningTime="2026-04-22 18:48:56.199118208 +0000 UTC m=+137.977951693" watchObservedRunningTime="2026-04-22 18:48:56.199440063 +0000 UTC m=+137.978273549" Apr 22 18:48:56.212840 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.212789 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb" podStartSLOduration=1.204949142 podStartE2EDuration="4.212774576s" podCreationTimestamp="2026-04-22 18:48:52 +0000 UTC" firstStartedPulling="2026-04-22 18:48:52.809427376 +0000 UTC m=+134.588260839" lastFinishedPulling="2026-04-22 18:48:55.817252795 +0000 UTC m=+137.596086273" observedRunningTime="2026-04-22 18:48:56.211679752 +0000 UTC m=+137.990513239" watchObservedRunningTime="2026-04-22 18:48:56.212774576 +0000 UTC m=+137.991608064" Apr 22 18:48:56.285392 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.284454 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq44h\" (UniqueName: \"kubernetes.io/projected/59a77f39-9d49-4aba-99e0-0e3df93fec7e-kube-api-access-jq44h\") pod \"migrator-74bb7799d9-fkllm\" (UID: \"59a77f39-9d49-4aba-99e0-0e3df93fec7e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fkllm" Apr 22 18:48:56.285392 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.284852 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-djlpr\" (UID: \"574f89c9-2eec-4ad6-a1ef-c9b8109907a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr" Apr 22 18:48:56.285392 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:56.285022 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:56.285392 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:56.285085 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls podName:574f89c9-2eec-4ad6-a1ef-c9b8109907a5 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:00.285065462 +0000 UTC m=+142.063898927 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-djlpr" (UID: "574f89c9-2eec-4ad6-a1ef-c9b8109907a5") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:56.291584 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.291556 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq44h\" (UniqueName: \"kubernetes.io/projected/59a77f39-9d49-4aba-99e0-0e3df93fec7e-kube-api-access-jq44h\") pod \"migrator-74bb7799d9-fkllm\" (UID: \"59a77f39-9d49-4aba-99e0-0e3df93fec7e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fkllm" Apr 22 18:48:56.444895 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.444811 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fkllm" Apr 22 18:48:56.570103 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:56.570076 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-fkllm"] Apr 22 18:48:56.573284 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:48:56.573253 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59a77f39_9d49_4aba_99e0_0e3df93fec7e.slice/crio-1e131713e05cd2c5a3b3516301ff5d9bb19dd7ca2fdaeb80ca777e8546660c02 WatchSource:0}: Error finding container 1e131713e05cd2c5a3b3516301ff5d9bb19dd7ca2fdaeb80ca777e8546660c02: Status 404 returned error can't find the container with id 1e131713e05cd2c5a3b3516301ff5d9bb19dd7ca2fdaeb80ca777e8546660c02 Apr 22 18:48:57.193721 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:57.193685 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4jzwx_4871a747-c041-430f-9c1a-3ab2ce6d0156/console-operator/1.log" Apr 22 18:48:57.194951 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:57.194910 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4jzwx_4871a747-c041-430f-9c1a-3ab2ce6d0156/console-operator/0.log" Apr 22 18:48:57.195087 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:57.194980 2566 generic.go:358] "Generic (PLEG): container finished" podID="4871a747-c041-430f-9c1a-3ab2ce6d0156" containerID="5b3dae5035589fa7c37363e73f3eedef08bc2357e62cd96840361a4dcffc12d1" exitCode=255 Apr 22 18:48:57.195087 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:57.195077 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" 
event={"ID":"4871a747-c041-430f-9c1a-3ab2ce6d0156","Type":"ContainerDied","Data":"5b3dae5035589fa7c37363e73f3eedef08bc2357e62cd96840361a4dcffc12d1"} Apr 22 18:48:57.195210 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:57.195113 2566 scope.go:117] "RemoveContainer" containerID="cdabc6eba5f47a0b5d76bae7796f5ff6ed7f7efe0ce1b2a37c689719edf308c0" Apr 22 18:48:57.195417 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:57.195392 2566 scope.go:117] "RemoveContainer" containerID="5b3dae5035589fa7c37363e73f3eedef08bc2357e62cd96840361a4dcffc12d1" Apr 22 18:48:57.195635 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:57.195612 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4jzwx_openshift-console-operator(4871a747-c041-430f-9c1a-3ab2ce6d0156)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" podUID="4871a747-c041-430f-9c1a-3ab2ce6d0156" Apr 22 18:48:57.196908 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:57.196558 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fkllm" event={"ID":"59a77f39-9d49-4aba-99e0-0e3df93fec7e","Type":"ContainerStarted","Data":"1e131713e05cd2c5a3b3516301ff5d9bb19dd7ca2fdaeb80ca777e8546660c02"} Apr 22 18:48:58.201028 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:58.200994 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fkllm" event={"ID":"59a77f39-9d49-4aba-99e0-0e3df93fec7e","Type":"ContainerStarted","Data":"d210980a27d40b4cf9f4a89cde26c51bea9db9950cdba6727de192df5cb6ef5a"} Apr 22 18:48:58.201028 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:58.201033 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fkllm" 
event={"ID":"59a77f39-9d49-4aba-99e0-0e3df93fec7e","Type":"ContainerStarted","Data":"4f081360c6fdf5603d86dbab317e8ce2d3d35c102b66c7a9ebc04cead57e5e26"} Apr 22 18:48:58.202361 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:58.202338 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4jzwx_4871a747-c041-430f-9c1a-3ab2ce6d0156/console-operator/1.log" Apr 22 18:48:58.202618 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:58.202604 2566 scope.go:117] "RemoveContainer" containerID="5b3dae5035589fa7c37363e73f3eedef08bc2357e62cd96840361a4dcffc12d1" Apr 22 18:48:58.202756 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:48:58.202739 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4jzwx_openshift-console-operator(4871a747-c041-430f-9c1a-3ab2ce6d0156)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" podUID="4871a747-c041-430f-9c1a-3ab2ce6d0156" Apr 22 18:48:58.215148 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:58.215100 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fkllm" podStartSLOduration=1.263263397 podStartE2EDuration="2.215084531s" podCreationTimestamp="2026-04-22 18:48:56 +0000 UTC" firstStartedPulling="2026-04-22 18:48:56.575468289 +0000 UTC m=+138.354301754" lastFinishedPulling="2026-04-22 18:48:57.52728941 +0000 UTC m=+139.306122888" observedRunningTime="2026-04-22 18:48:58.213476544 +0000 UTC m=+139.992310037" watchObservedRunningTime="2026-04-22 18:48:58.215084531 +0000 UTC m=+139.993918020" Apr 22 18:48:59.112328 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.112298 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-6xjch_e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f/dns-node-resolver/0.log" Apr 22 18:48:59.204967 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.204539 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-dcg7g"] Apr 22 18:48:59.207882 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.207863 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-dcg7g" Apr 22 18:48:59.210137 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.210119 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 18:48:59.211019 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.211001 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-hzx2r\"" Apr 22 18:48:59.211122 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.211017 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 18:48:59.211122 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.211066 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 18:48:59.211122 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.211073 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 18:48:59.215914 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.215893 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-dcg7g"] Apr 22 18:48:59.306428 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.306397 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4xlp\" (UniqueName: 
\"kubernetes.io/projected/491a221f-17ba-45f2-801d-dbc3275189e4-kube-api-access-h4xlp\") pod \"service-ca-865cb79987-dcg7g\" (UID: \"491a221f-17ba-45f2-801d-dbc3275189e4\") " pod="openshift-service-ca/service-ca-865cb79987-dcg7g" Apr 22 18:48:59.306549 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.306453 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/491a221f-17ba-45f2-801d-dbc3275189e4-signing-cabundle\") pod \"service-ca-865cb79987-dcg7g\" (UID: \"491a221f-17ba-45f2-801d-dbc3275189e4\") " pod="openshift-service-ca/service-ca-865cb79987-dcg7g" Apr 22 18:48:59.306549 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.306485 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/491a221f-17ba-45f2-801d-dbc3275189e4-signing-key\") pod \"service-ca-865cb79987-dcg7g\" (UID: \"491a221f-17ba-45f2-801d-dbc3275189e4\") " pod="openshift-service-ca/service-ca-865cb79987-dcg7g" Apr 22 18:48:59.407757 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.407678 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4xlp\" (UniqueName: \"kubernetes.io/projected/491a221f-17ba-45f2-801d-dbc3275189e4-kube-api-access-h4xlp\") pod \"service-ca-865cb79987-dcg7g\" (UID: \"491a221f-17ba-45f2-801d-dbc3275189e4\") " pod="openshift-service-ca/service-ca-865cb79987-dcg7g" Apr 22 18:48:59.407757 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.407735 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/491a221f-17ba-45f2-801d-dbc3275189e4-signing-cabundle\") pod \"service-ca-865cb79987-dcg7g\" (UID: \"491a221f-17ba-45f2-801d-dbc3275189e4\") " pod="openshift-service-ca/service-ca-865cb79987-dcg7g" Apr 22 18:48:59.407757 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:48:59.407760 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/491a221f-17ba-45f2-801d-dbc3275189e4-signing-key\") pod \"service-ca-865cb79987-dcg7g\" (UID: \"491a221f-17ba-45f2-801d-dbc3275189e4\") " pod="openshift-service-ca/service-ca-865cb79987-dcg7g" Apr 22 18:48:59.408405 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.408380 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/491a221f-17ba-45f2-801d-dbc3275189e4-signing-cabundle\") pod \"service-ca-865cb79987-dcg7g\" (UID: \"491a221f-17ba-45f2-801d-dbc3275189e4\") " pod="openshift-service-ca/service-ca-865cb79987-dcg7g" Apr 22 18:48:59.409996 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.409977 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/491a221f-17ba-45f2-801d-dbc3275189e4-signing-key\") pod \"service-ca-865cb79987-dcg7g\" (UID: \"491a221f-17ba-45f2-801d-dbc3275189e4\") " pod="openshift-service-ca/service-ca-865cb79987-dcg7g" Apr 22 18:48:59.415500 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.415481 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4xlp\" (UniqueName: \"kubernetes.io/projected/491a221f-17ba-45f2-801d-dbc3275189e4-kube-api-access-h4xlp\") pod \"service-ca-865cb79987-dcg7g\" (UID: \"491a221f-17ba-45f2-801d-dbc3275189e4\") " pod="openshift-service-ca/service-ca-865cb79987-dcg7g" Apr 22 18:48:59.516684 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.516652 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-dcg7g" Apr 22 18:48:59.646190 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:48:59.646156 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-dcg7g"] Apr 22 18:48:59.649328 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:48:59.649303 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491a221f_17ba_45f2_801d_dbc3275189e4.slice/crio-7f693cb3eddf95e1d532d5209d721ced139d88944f9c6f4fb9948e3116f5743b WatchSource:0}: Error finding container 7f693cb3eddf95e1d532d5209d721ced139d88944f9c6f4fb9948e3116f5743b: Status 404 returned error can't find the container with id 7f693cb3eddf95e1d532d5209d721ced139d88944f9c6f4fb9948e3116f5743b Apr 22 18:49:00.012974 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:00.012881 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8jk7t\" (UID: \"bc71e7c0-1858-4f34-ba30-e0864e6770bf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t" Apr 22 18:49:00.013114 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:49:00.013042 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:49:00.013157 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:49:00.013114 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls podName:bc71e7c0-1858-4f34-ba30-e0864e6770bf nodeName:}" failed. No retries permitted until 2026-04-22 18:49:08.013095553 +0000 UTC m=+149.791929026 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8jk7t" (UID: "bc71e7c0-1858-4f34-ba30-e0864e6770bf") : secret "samples-operator-tls" not found Apr 22 18:49:00.210982 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:00.210931 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-dcg7g" event={"ID":"491a221f-17ba-45f2-801d-dbc3275189e4","Type":"ContainerStarted","Data":"e5a13aee4fd62560d6afcc08a2b190582fc2272ed7747aa8a76afbb5cf8161f6"} Apr 22 18:49:00.210982 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:00.210987 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-dcg7g" event={"ID":"491a221f-17ba-45f2-801d-dbc3275189e4","Type":"ContainerStarted","Data":"7f693cb3eddf95e1d532d5209d721ced139d88944f9c6f4fb9948e3116f5743b"} Apr 22 18:49:00.229328 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:00.229126 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-dcg7g" podStartSLOduration=1.229109899 podStartE2EDuration="1.229109899s" podCreationTimestamp="2026-04-22 18:48:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:49:00.228789901 +0000 UTC m=+142.007623386" watchObservedRunningTime="2026-04-22 18:49:00.229109899 +0000 UTC m=+142.007943388" Apr 22 18:49:00.315401 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:00.315320 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-djlpr\" (UID: \"574f89c9-2eec-4ad6-a1ef-c9b8109907a5\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr" Apr 22 18:49:00.315555 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:49:00.315456 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:49:00.315555 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:49:00.315535 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls podName:574f89c9-2eec-4ad6-a1ef-c9b8109907a5 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:08.315512388 +0000 UTC m=+150.094345856 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-djlpr" (UID: "574f89c9-2eec-4ad6-a1ef-c9b8109907a5") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:49:00.508973 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:00.508924 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wj7nl_462b5fca-7ee5-444f-af12-38942635aa2f/node-ca/0.log" Apr 22 18:49:01.509289 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:01.509259 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-fkllm_59a77f39-9d49-4aba-99e0-0e3df93fec7e/migrator/0.log" Apr 22 18:49:01.710432 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:01.710402 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-fkllm_59a77f39-9d49-4aba-99e0-0e3df93fec7e/graceful-termination/0.log" Apr 22 18:49:02.684506 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:02.684467 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:49:02.684506 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:02.684516 2566 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:49:02.684906 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:02.684860 2566 scope.go:117] "RemoveContainer" containerID="5b3dae5035589fa7c37363e73f3eedef08bc2357e62cd96840361a4dcffc12d1" Apr 22 18:49:02.685072 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:49:02.685052 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4jzwx_openshift-console-operator(4871a747-c041-430f-9c1a-3ab2ce6d0156)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" podUID="4871a747-c041-430f-9c1a-3ab2ce6d0156" Apr 22 18:49:08.081604 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:08.081574 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8jk7t\" (UID: \"bc71e7c0-1858-4f34-ba30-e0864e6770bf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t" Apr 22 18:49:08.083985 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:08.083959 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc71e7c0-1858-4f34-ba30-e0864e6770bf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8jk7t\" (UID: \"bc71e7c0-1858-4f34-ba30-e0864e6770bf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t" Apr 22 18:49:08.163689 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:08.163661 
2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t" Apr 22 18:49:08.284444 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:08.284411 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t"] Apr 22 18:49:08.384597 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:08.384573 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-djlpr\" (UID: \"574f89c9-2eec-4ad6-a1ef-c9b8109907a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr" Apr 22 18:49:08.384744 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:49:08.384710 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:49:08.384804 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:49:08.384766 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls podName:574f89c9-2eec-4ad6-a1ef-c9b8109907a5 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:24.384747689 +0000 UTC m=+166.163581152 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-djlpr" (UID: "574f89c9-2eec-4ad6-a1ef-c9b8109907a5") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:49:09.232827 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:09.232791 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t" event={"ID":"bc71e7c0-1858-4f34-ba30-e0864e6770bf","Type":"ContainerStarted","Data":"d43ffecc87bdf6ec16adeb72f4f2450614417ca8bbadebd1ec2131ef95ace8e2"} Apr 22 18:49:10.236292 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:10.236200 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t" event={"ID":"bc71e7c0-1858-4f34-ba30-e0864e6770bf","Type":"ContainerStarted","Data":"156f7e897675cd3ddb1bd44523553a9f9a4a9649e0f15718faa3c5123c6c437a"} Apr 22 18:49:10.236292 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:10.236246 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t" event={"ID":"bc71e7c0-1858-4f34-ba30-e0864e6770bf","Type":"ContainerStarted","Data":"cef02d3e70f1b4f627eea08ea495d8dee3b89d88a045a84a41127d298e00376d"} Apr 22 18:49:10.254611 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:10.254565 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8jk7t" podStartSLOduration=16.712225268 podStartE2EDuration="18.254551692s" podCreationTimestamp="2026-04-22 18:48:52 +0000 UTC" firstStartedPulling="2026-04-22 18:49:08.330437513 +0000 UTC m=+150.109270981" lastFinishedPulling="2026-04-22 18:49:09.872763941 +0000 UTC m=+151.651597405" observedRunningTime="2026-04-22 
18:49:10.253035745 +0000 UTC m=+152.031869232" watchObservedRunningTime="2026-04-22 18:49:10.254551692 +0000 UTC m=+152.033385177" Apr 22 18:49:13.622233 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:49:13.622183 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-77bfc79668-44qjh" podUID="5ceaf6dd-7ffb-4393-9076-3f02c3b84983" Apr 22 18:49:13.630011 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:49:13.629982 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6czt4" podUID="8d6d09bf-c484-4e3a-8f6f-43ddb670c369" Apr 22 18:49:13.654253 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:49:13.654229 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zkddm" podUID="9a00536f-bef4-4ca5-bc76-1956a1edb1d1" Apr 22 18:49:14.246135 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:14.246106 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6czt4" Apr 22 18:49:14.825358 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:14.825327 2566 scope.go:117] "RemoveContainer" containerID="5b3dae5035589fa7c37363e73f3eedef08bc2357e62cd96840361a4dcffc12d1" Apr 22 18:49:14.849096 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:49:14.849072 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-7zj27" podUID="66e2806a-82a6-444a-a19a-13641a426bed" Apr 22 18:49:15.249742 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:15.249712 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4jzwx_4871a747-c041-430f-9c1a-3ab2ce6d0156/console-operator/1.log" Apr 22 18:49:15.249926 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:15.249803 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" event={"ID":"4871a747-c041-430f-9c1a-3ab2ce6d0156","Type":"ContainerStarted","Data":"a8b85843330200e24f0609e7348fa62c10cc26b5214a367a2f9363e2c5658016"} Apr 22 18:49:15.250094 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:15.250072 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:49:15.782791 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:15.782720 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" Apr 22 18:49:15.800363 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:15.800307 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-4jzwx" podStartSLOduration=20.80291068 podStartE2EDuration="23.800291682s" 
podCreationTimestamp="2026-04-22 18:48:52 +0000 UTC" firstStartedPulling="2026-04-22 18:48:52.823454007 +0000 UTC m=+134.602287483" lastFinishedPulling="2026-04-22 18:48:55.820835021 +0000 UTC m=+137.599668485" observedRunningTime="2026-04-22 18:49:15.263377096 +0000 UTC m=+157.042210581" watchObservedRunningTime="2026-04-22 18:49:15.800291682 +0000 UTC m=+157.579125167" Apr 22 18:49:17.486559 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.486524 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rpswj"] Apr 22 18:49:17.488970 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.488947 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rpswj" Apr 22 18:49:17.491411 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.491392 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:49:17.492312 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.492291 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:49:17.492312 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.492308 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zj8q7\"" Apr 22 18:49:17.501082 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.501060 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rpswj"] Apr 22 18:49:17.553394 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.553361 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/de824459-5c69-45bc-b7b1-9320eba8e257-data-volume\") pod \"insights-runtime-extractor-rpswj\" (UID: 
\"de824459-5c69-45bc-b7b1-9320eba8e257\") " pod="openshift-insights/insights-runtime-extractor-rpswj"
Apr 22 18:49:17.553394 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.553396 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/de824459-5c69-45bc-b7b1-9320eba8e257-crio-socket\") pod \"insights-runtime-extractor-rpswj\" (UID: \"de824459-5c69-45bc-b7b1-9320eba8e257\") " pod="openshift-insights/insights-runtime-extractor-rpswj"
Apr 22 18:49:17.553583 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.553416 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/de824459-5c69-45bc-b7b1-9320eba8e257-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpswj\" (UID: \"de824459-5c69-45bc-b7b1-9320eba8e257\") " pod="openshift-insights/insights-runtime-extractor-rpswj"
Apr 22 18:49:17.553583 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.553537 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/de824459-5c69-45bc-b7b1-9320eba8e257-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rpswj\" (UID: \"de824459-5c69-45bc-b7b1-9320eba8e257\") " pod="openshift-insights/insights-runtime-extractor-rpswj"
Apr 22 18:49:17.553583 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.553573 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25699\" (UniqueName: \"kubernetes.io/projected/de824459-5c69-45bc-b7b1-9320eba8e257-kube-api-access-25699\") pod \"insights-runtime-extractor-rpswj\" (UID: \"de824459-5c69-45bc-b7b1-9320eba8e257\") " pod="openshift-insights/insights-runtime-extractor-rpswj"
Apr 22 18:49:17.654378 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.654342 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/de824459-5c69-45bc-b7b1-9320eba8e257-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rpswj\" (UID: \"de824459-5c69-45bc-b7b1-9320eba8e257\") " pod="openshift-insights/insights-runtime-extractor-rpswj"
Apr 22 18:49:17.654560 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.654382 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25699\" (UniqueName: \"kubernetes.io/projected/de824459-5c69-45bc-b7b1-9320eba8e257-kube-api-access-25699\") pod \"insights-runtime-extractor-rpswj\" (UID: \"de824459-5c69-45bc-b7b1-9320eba8e257\") " pod="openshift-insights/insights-runtime-extractor-rpswj"
Apr 22 18:49:17.654560 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.654447 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/de824459-5c69-45bc-b7b1-9320eba8e257-data-volume\") pod \"insights-runtime-extractor-rpswj\" (UID: \"de824459-5c69-45bc-b7b1-9320eba8e257\") " pod="openshift-insights/insights-runtime-extractor-rpswj"
Apr 22 18:49:17.654560 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.654464 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/de824459-5c69-45bc-b7b1-9320eba8e257-crio-socket\") pod \"insights-runtime-extractor-rpswj\" (UID: \"de824459-5c69-45bc-b7b1-9320eba8e257\") " pod="openshift-insights/insights-runtime-extractor-rpswj"
Apr 22 18:49:17.654560 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.654481 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/de824459-5c69-45bc-b7b1-9320eba8e257-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpswj\" (UID: \"de824459-5c69-45bc-b7b1-9320eba8e257\") " pod="openshift-insights/insights-runtime-extractor-rpswj"
Apr 22 18:49:17.654768 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.654565 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/de824459-5c69-45bc-b7b1-9320eba8e257-crio-socket\") pod \"insights-runtime-extractor-rpswj\" (UID: \"de824459-5c69-45bc-b7b1-9320eba8e257\") " pod="openshift-insights/insights-runtime-extractor-rpswj"
Apr 22 18:49:17.654823 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.654783 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/de824459-5c69-45bc-b7b1-9320eba8e257-data-volume\") pod \"insights-runtime-extractor-rpswj\" (UID: \"de824459-5c69-45bc-b7b1-9320eba8e257\") " pod="openshift-insights/insights-runtime-extractor-rpswj"
Apr 22 18:49:17.654874 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.654858 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/de824459-5c69-45bc-b7b1-9320eba8e257-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rpswj\" (UID: \"de824459-5c69-45bc-b7b1-9320eba8e257\") " pod="openshift-insights/insights-runtime-extractor-rpswj"
Apr 22 18:49:17.656742 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.656725 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/de824459-5c69-45bc-b7b1-9320eba8e257-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpswj\" (UID: \"de824459-5c69-45bc-b7b1-9320eba8e257\") " pod="openshift-insights/insights-runtime-extractor-rpswj"
Apr 22 18:49:17.663596 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.663567 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25699\" (UniqueName: \"kubernetes.io/projected/de824459-5c69-45bc-b7b1-9320eba8e257-kube-api-access-25699\") pod \"insights-runtime-extractor-rpswj\" (UID: \"de824459-5c69-45bc-b7b1-9320eba8e257\") " pod="openshift-insights/insights-runtime-extractor-rpswj"
Apr 22 18:49:17.798004 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.797910 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rpswj"
Apr 22 18:49:17.910634 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:17.910601 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rpswj"]
Apr 22 18:49:17.913720 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:49:17.913697 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde824459_5c69_45bc_b7b1_9320eba8e257.slice/crio-8ab702033ff968b6cc2d4793ad9e2aae1efd378af9d8ff366e89cee589fbe025 WatchSource:0}: Error finding container 8ab702033ff968b6cc2d4793ad9e2aae1efd378af9d8ff366e89cee589fbe025: Status 404 returned error can't find the container with id 8ab702033ff968b6cc2d4793ad9e2aae1efd378af9d8ff366e89cee589fbe025
Apr 22 18:49:18.258024 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:18.257987 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rpswj" event={"ID":"de824459-5c69-45bc-b7b1-9320eba8e257","Type":"ContainerStarted","Data":"1af5e8d34f4ef59efd28ee5aa90246467e602d83986d6224f13d96fd00643cd2"}
Apr 22 18:49:18.258024 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:18.258025 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rpswj" event={"ID":"de824459-5c69-45bc-b7b1-9320eba8e257","Type":"ContainerStarted","Data":"8ab702033ff968b6cc2d4793ad9e2aae1efd378af9d8ff366e89cee589fbe025"}
Apr 22 18:49:18.563244 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:18.563150 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4"
Apr 22 18:49:18.563244 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:18.563199 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:49:18.563832 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:18.563254 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert\") pod \"ingress-canary-zkddm\" (UID: \"9a00536f-bef4-4ca5-bc76-1956a1edb1d1\") " pod="openshift-ingress-canary/ingress-canary-zkddm"
Apr 22 18:49:18.565853 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:18.565785 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a00536f-bef4-4ca5-bc76-1956a1edb1d1-cert\") pod \"ingress-canary-zkddm\" (UID: \"9a00536f-bef4-4ca5-bc76-1956a1edb1d1\") " pod="openshift-ingress-canary/ingress-canary-zkddm"
Apr 22 18:49:18.566010 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:18.565984 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d6d09bf-c484-4e3a-8f6f-43ddb670c369-metrics-tls\") pod \"dns-default-6czt4\" (UID: \"8d6d09bf-c484-4e3a-8f6f-43ddb670c369\") " pod="openshift-dns/dns-default-6czt4"
Apr 22 18:49:18.566090 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:18.566073 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls\") pod \"image-registry-77bfc79668-44qjh\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:49:18.748826 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:18.748798 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zg5rc\""
Apr 22 18:49:18.757513 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:18.757497 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6czt4"
Apr 22 18:49:18.871088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:18.870785 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6czt4"]
Apr 22 18:49:18.873192 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:49:18.873168 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d6d09bf_c484_4e3a_8f6f_43ddb670c369.slice/crio-a25ccdc0eef31b182e0db0689d17df11008b4d08163db4026aafa57fa619db70 WatchSource:0}: Error finding container a25ccdc0eef31b182e0db0689d17df11008b4d08163db4026aafa57fa619db70: Status 404 returned error can't find the container with id a25ccdc0eef31b182e0db0689d17df11008b4d08163db4026aafa57fa619db70
Apr 22 18:49:19.262579 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:19.262534 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rpswj" event={"ID":"de824459-5c69-45bc-b7b1-9320eba8e257","Type":"ContainerStarted","Data":"3b4b01725a762031101f03c5302728265b388285949b218aae679db4619ff6c3"}
Apr 22 18:49:19.264373 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:19.264178 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6czt4" event={"ID":"8d6d09bf-c484-4e3a-8f6f-43ddb670c369","Type":"ContainerStarted","Data":"a25ccdc0eef31b182e0db0689d17df11008b4d08163db4026aafa57fa619db70"}
Apr 22 18:49:21.271199 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:21.271163 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rpswj" event={"ID":"de824459-5c69-45bc-b7b1-9320eba8e257","Type":"ContainerStarted","Data":"33f87fe5da11952fd5621146e2e60070c00691d4ce1346188b7519bac6d6e154"}
Apr 22 18:49:21.272767 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:21.272742 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6czt4" event={"ID":"8d6d09bf-c484-4e3a-8f6f-43ddb670c369","Type":"ContainerStarted","Data":"d941135d51a09fc22cc51acd35e3e5ac53a08717c6aef3ca71e0b4e74003b264"}
Apr 22 18:49:21.272767 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:21.272770 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6czt4" event={"ID":"8d6d09bf-c484-4e3a-8f6f-43ddb670c369","Type":"ContainerStarted","Data":"491c73f617b202374a872d0c974062ad714f39bc393521a626bc01d224788bb6"}
Apr 22 18:49:21.272977 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:21.272847 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6czt4"
Apr 22 18:49:21.286516 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:21.286469 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rpswj" podStartSLOduration=1.829479493 podStartE2EDuration="4.286448919s" podCreationTimestamp="2026-04-22 18:49:17 +0000 UTC" firstStartedPulling="2026-04-22 18:49:17.96448652 +0000 UTC m=+159.743319983" lastFinishedPulling="2026-04-22 18:49:20.421455936 +0000 UTC m=+162.200289409" observedRunningTime="2026-04-22 18:49:21.286071878 +0000 UTC m=+163.064905364" watchObservedRunningTime="2026-04-22 18:49:21.286448919 +0000 UTC m=+163.065282405"
Apr 22 18:49:21.302305 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:21.302265 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6czt4" podStartSLOduration=129.753815599 podStartE2EDuration="2m11.302253354s" podCreationTimestamp="2026-04-22 18:47:10 +0000 UTC" firstStartedPulling="2026-04-22 18:49:18.875084885 +0000 UTC m=+160.653918349" lastFinishedPulling="2026-04-22 18:49:20.423522632 +0000 UTC m=+162.202356104" observedRunningTime="2026-04-22 18:49:21.301638078 +0000 UTC m=+163.080471564" watchObservedRunningTime="2026-04-22 18:49:21.302253354 +0000 UTC m=+163.081086839"
Apr 22 18:49:24.409856 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:24.409822 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-djlpr\" (UID: \"574f89c9-2eec-4ad6-a1ef-c9b8109907a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr"
Apr 22 18:49:24.412072 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:24.412054 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/574f89c9-2eec-4ad6-a1ef-c9b8109907a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-djlpr\" (UID: \"574f89c9-2eec-4ad6-a1ef-c9b8109907a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr"
Apr 22 18:49:24.594294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:24.594264 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr"
Apr 22 18:49:24.704709 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:24.704633 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr"]
Apr 22 18:49:24.707025 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:49:24.706997 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod574f89c9_2eec_4ad6_a1ef_c9b8109907a5.slice/crio-4d782d6682332cdbc4bdfbbfc8ea4b7135fecc957dd1c3c8abbd24a14320eb05 WatchSource:0}: Error finding container 4d782d6682332cdbc4bdfbbfc8ea4b7135fecc957dd1c3c8abbd24a14320eb05: Status 404 returned error can't find the container with id 4d782d6682332cdbc4bdfbbfc8ea4b7135fecc957dd1c3c8abbd24a14320eb05
Apr 22 18:49:24.825309 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:24.825278 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:49:24.827631 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:24.827610 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-c9m64\""
Apr 22 18:49:24.836398 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:24.836379 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:49:24.953413 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:24.953343 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-77bfc79668-44qjh"]
Apr 22 18:49:24.956810 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:49:24.956741 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ceaf6dd_7ffb_4393_9076_3f02c3b84983.slice/crio-880fa51011a722552a36e200047e2a5c81b5848d5bc6b16c15ff7319e534290a WatchSource:0}: Error finding container 880fa51011a722552a36e200047e2a5c81b5848d5bc6b16c15ff7319e534290a: Status 404 returned error can't find the container with id 880fa51011a722552a36e200047e2a5c81b5848d5bc6b16c15ff7319e534290a
Apr 22 18:49:25.286502 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:25.286406 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77bfc79668-44qjh" event={"ID":"5ceaf6dd-7ffb-4393-9076-3f02c3b84983","Type":"ContainerStarted","Data":"f2b43609f8fc69868b3d2568688e0b1a76d8d2d8707b39d0b00ebb6f2ffd2362"}
Apr 22 18:49:25.286502 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:25.286461 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-77bfc79668-44qjh"
Apr 22 18:49:25.286502 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:25.286478 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77bfc79668-44qjh" event={"ID":"5ceaf6dd-7ffb-4393-9076-3f02c3b84983","Type":"ContainerStarted","Data":"880fa51011a722552a36e200047e2a5c81b5848d5bc6b16c15ff7319e534290a"}
Apr 22 18:49:25.287497 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:25.287469 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr" event={"ID":"574f89c9-2eec-4ad6-a1ef-c9b8109907a5","Type":"ContainerStarted","Data":"4d782d6682332cdbc4bdfbbfc8ea4b7135fecc957dd1c3c8abbd24a14320eb05"}
Apr 22 18:49:25.304861 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:25.304811 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-77bfc79668-44qjh" podStartSLOduration=166.304798761 podStartE2EDuration="2m46.304798761s" podCreationTimestamp="2026-04-22 18:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:49:25.304139338 +0000 UTC m=+167.082972824" watchObservedRunningTime="2026-04-22 18:49:25.304798761 +0000 UTC m=+167.083632247"
Apr 22 18:49:26.291776 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.291734 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr" event={"ID":"574f89c9-2eec-4ad6-a1ef-c9b8109907a5","Type":"ContainerStarted","Data":"60bba4d2cae25767d7bfc37f9be0440ea85e92d497fee5353e8855e8a74ea933"}
Apr 22 18:49:26.312723 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.312659 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-djlpr" podStartSLOduration=32.844051555 podStartE2EDuration="34.312640836s" podCreationTimestamp="2026-04-22 18:48:52 +0000 UTC" firstStartedPulling="2026-04-22 18:49:24.708812565 +0000 UTC m=+166.487646041" lastFinishedPulling="2026-04-22 18:49:26.177401852 +0000 UTC m=+167.956235322" observedRunningTime="2026-04-22 18:49:26.310316286 +0000 UTC m=+168.089149772" watchObservedRunningTime="2026-04-22 18:49:26.312640836 +0000 UTC m=+168.091474323"
Apr 22 18:49:26.683773 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.683740 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d8dcd44f7-tw2f4"]
Apr 22 18:49:26.686041 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.686019 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.688411 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.688388 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 22 18:49:26.688544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.688419 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-q6hpf\""
Apr 22 18:49:26.688625 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.688607 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 22 18:49:26.688692 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.688648 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 22 18:49:26.689221 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.689201 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 22 18:49:26.689221 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.689218 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 22 18:49:26.689373 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.689262 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 22 18:49:26.689598 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.689581 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 22 18:49:26.696780 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.696756 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d8dcd44f7-tw2f4"]
Apr 22 18:49:26.825610 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.825574 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zkddm"
Apr 22 18:49:26.827817 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.827796 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rxjsb\""
Apr 22 18:49:26.830723 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.830701 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/995cb635-9661-411f-8a71-b0de66906803-console-serving-cert\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.830840 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.830730 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-service-ca\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.830840 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.830753 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-oauth-serving-cert\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.830840 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.830804 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln7bv\" (UniqueName: \"kubernetes.io/projected/995cb635-9661-411f-8a71-b0de66906803-kube-api-access-ln7bv\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.830947 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.830856 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/995cb635-9661-411f-8a71-b0de66906803-console-oauth-config\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.830947 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.830885 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-console-config\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.836658 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.836640 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zkddm"
Apr 22 18:49:26.931556 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.931521 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/995cb635-9661-411f-8a71-b0de66906803-console-serving-cert\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.931715 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.931580 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-service-ca\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.931715 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.931642 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-oauth-serving-cert\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.931715 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.931668 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ln7bv\" (UniqueName: \"kubernetes.io/projected/995cb635-9661-411f-8a71-b0de66906803-kube-api-access-ln7bv\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.931890 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.931730 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/995cb635-9661-411f-8a71-b0de66906803-console-oauth-config\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.931890 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.931756 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-console-config\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.933036 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.932607 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-console-config\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.933036 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.932688 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-service-ca\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.933036 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.933016 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-oauth-serving-cert\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.934712 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.934644 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/995cb635-9661-411f-8a71-b0de66906803-console-serving-cert\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.935125 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.935105 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/995cb635-9661-411f-8a71-b0de66906803-console-oauth-config\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.940171 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.940149 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln7bv\" (UniqueName: \"kubernetes.io/projected/995cb635-9661-411f-8a71-b0de66906803-kube-api-access-ln7bv\") pod \"console-5d8dcd44f7-tw2f4\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:26.955325 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.955305 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zkddm"]
Apr 22 18:49:26.958486 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:49:26.958457 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a00536f_bef4_4ca5_bc76_1956a1edb1d1.slice/crio-ac2b4393ed8a0af67b66d9b23e76efd93fb7a02da1128bf5fa46a81884938646 WatchSource:0}: Error finding container ac2b4393ed8a0af67b66d9b23e76efd93fb7a02da1128bf5fa46a81884938646: Status 404 returned error can't find the container with id ac2b4393ed8a0af67b66d9b23e76efd93fb7a02da1128bf5fa46a81884938646
Apr 22 18:49:26.995602 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:26.995573 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:27.111427 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:27.111394 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d8dcd44f7-tw2f4"]
Apr 22 18:49:27.114215 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:49:27.114190 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod995cb635_9661_411f_8a71_b0de66906803.slice/crio-ab90fdf730ff9d56a9e05feed2ca5466f82ef77d4c7e4d47a244f1c3a3042e7c WatchSource:0}: Error finding container ab90fdf730ff9d56a9e05feed2ca5466f82ef77d4c7e4d47a244f1c3a3042e7c: Status 404 returned error can't find the container with id ab90fdf730ff9d56a9e05feed2ca5466f82ef77d4c7e4d47a244f1c3a3042e7c
Apr 22 18:49:27.296023 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:27.295898 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d8dcd44f7-tw2f4" event={"ID":"995cb635-9661-411f-8a71-b0de66906803","Type":"ContainerStarted","Data":"ab90fdf730ff9d56a9e05feed2ca5466f82ef77d4c7e4d47a244f1c3a3042e7c"}
Apr 22 18:49:27.297064 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:27.297025 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zkddm" event={"ID":"9a00536f-bef4-4ca5-bc76-1956a1edb1d1","Type":"ContainerStarted","Data":"ac2b4393ed8a0af67b66d9b23e76efd93fb7a02da1128bf5fa46a81884938646"}
Apr 22 18:49:29.304116 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:29.304083 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zkddm" event={"ID":"9a00536f-bef4-4ca5-bc76-1956a1edb1d1","Type":"ContainerStarted","Data":"ba9b413496a3def03b458fb72240469b75b3cea26a50a9fc46ee731481e12d99"}
Apr 22 18:49:29.317744 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:29.317686 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zkddm" podStartSLOduration=137.794830051 podStartE2EDuration="2m19.317669782s" podCreationTimestamp="2026-04-22 18:47:10 +0000 UTC" firstStartedPulling="2026-04-22 18:49:26.960304779 +0000 UTC m=+168.739138243" lastFinishedPulling="2026-04-22 18:49:28.4831445 +0000 UTC m=+170.261977974" observedRunningTime="2026-04-22 18:49:29.317507644 +0000 UTC m=+171.096341129" watchObservedRunningTime="2026-04-22 18:49:29.317669782 +0000 UTC m=+171.096503269"
Apr 22 18:49:29.825496 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:29.825460 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27"
Apr 22 18:49:30.307770 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.307730 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d8dcd44f7-tw2f4" event={"ID":"995cb635-9661-411f-8a71-b0de66906803","Type":"ContainerStarted","Data":"6fa680d98332e49264fe9d7ff137778cdde1aa2f1b6211a73d37875dc4e72c80"}
Apr 22 18:49:30.323492 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.323450 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d8dcd44f7-tw2f4" podStartSLOduration=1.487231221 podStartE2EDuration="4.323437325s" podCreationTimestamp="2026-04-22 18:49:26 +0000 UTC" firstStartedPulling="2026-04-22 18:49:27.1160491 +0000 UTC m=+168.894882564" lastFinishedPulling="2026-04-22 18:49:29.952255203 +0000 UTC m=+171.731088668" observedRunningTime="2026-04-22 18:49:30.32237637 +0000 UTC m=+172.101209856" watchObservedRunningTime="2026-04-22 18:49:30.323437325 +0000 UTC m=+172.102270811"
Apr 22 18:49:30.717578 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.717545 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-htlw7"]
Apr 22 18:49:30.720660 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.720643 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7"
Apr 22 18:49:30.723033 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.723007 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 22 18:49:30.723156 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.723093 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-wlsnk\""
Apr 22 18:49:30.723156 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.723093 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 22 18:49:30.723156 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.723147 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 18:49:30.728583 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.728561 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-htlw7"]
Apr 22 18:49:30.864647 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.864619 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2dbd4d57-a75b-405a-bc03-338b3a2a721c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-htlw7\" (UID: \"2dbd4d57-a75b-405a-bc03-338b3a2a721c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7"
Apr 22 18:49:30.864819 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.864657 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2dbd4d57-a75b-405a-bc03-338b3a2a721c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-htlw7\" (UID: \"2dbd4d57-a75b-405a-bc03-338b3a2a721c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7"
Apr 22 18:49:30.864819 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.864703 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2dbd4d57-a75b-405a-bc03-338b3a2a721c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-htlw7\" (UID: \"2dbd4d57-a75b-405a-bc03-338b3a2a721c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7"
Apr 22 18:49:30.864819 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.864731 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgt9h\" (UniqueName: \"kubernetes.io/projected/2dbd4d57-a75b-405a-bc03-338b3a2a721c-kube-api-access-jgt9h\") pod \"prometheus-operator-5676c8c784-htlw7\" (UID: \"2dbd4d57-a75b-405a-bc03-338b3a2a721c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7"
Apr 22 18:49:30.965326 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.965286 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2dbd4d57-a75b-405a-bc03-338b3a2a721c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-htlw7\" (UID: \"2dbd4d57-a75b-405a-bc03-338b3a2a721c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7"
Apr 22 18:49:30.965524 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.965348 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2dbd4d57-a75b-405a-bc03-338b3a2a721c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-htlw7\" (UID:
\"2dbd4d57-a75b-405a-bc03-338b3a2a721c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7" Apr 22 18:49:30.965524 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.965381 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgt9h\" (UniqueName: \"kubernetes.io/projected/2dbd4d57-a75b-405a-bc03-338b3a2a721c-kube-api-access-jgt9h\") pod \"prometheus-operator-5676c8c784-htlw7\" (UID: \"2dbd4d57-a75b-405a-bc03-338b3a2a721c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7" Apr 22 18:49:30.965524 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:49:30.965444 2566 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 22 18:49:30.965524 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.965486 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2dbd4d57-a75b-405a-bc03-338b3a2a721c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-htlw7\" (UID: \"2dbd4d57-a75b-405a-bc03-338b3a2a721c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7" Apr 22 18:49:30.965524 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:49:30.965517 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dbd4d57-a75b-405a-bc03-338b3a2a721c-prometheus-operator-tls podName:2dbd4d57-a75b-405a-bc03-338b3a2a721c nodeName:}" failed. No retries permitted until 2026-04-22 18:49:31.465498707 +0000 UTC m=+173.244332175 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/2dbd4d57-a75b-405a-bc03-338b3a2a721c-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-htlw7" (UID: "2dbd4d57-a75b-405a-bc03-338b3a2a721c") : secret "prometheus-operator-tls" not found Apr 22 18:49:30.966160 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.966139 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2dbd4d57-a75b-405a-bc03-338b3a2a721c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-htlw7\" (UID: \"2dbd4d57-a75b-405a-bc03-338b3a2a721c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7" Apr 22 18:49:30.967817 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.967764 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2dbd4d57-a75b-405a-bc03-338b3a2a721c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-htlw7\" (UID: \"2dbd4d57-a75b-405a-bc03-338b3a2a721c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7" Apr 22 18:49:30.974400 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:30.974377 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgt9h\" (UniqueName: \"kubernetes.io/projected/2dbd4d57-a75b-405a-bc03-338b3a2a721c-kube-api-access-jgt9h\") pod \"prometheus-operator-5676c8c784-htlw7\" (UID: \"2dbd4d57-a75b-405a-bc03-338b3a2a721c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7" Apr 22 18:49:31.277765 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:31.277675 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6czt4" Apr 22 18:49:31.470657 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:31.470618 2566 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2dbd4d57-a75b-405a-bc03-338b3a2a721c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-htlw7\" (UID: \"2dbd4d57-a75b-405a-bc03-338b3a2a721c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7" Apr 22 18:49:31.472987 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:31.472968 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2dbd4d57-a75b-405a-bc03-338b3a2a721c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-htlw7\" (UID: \"2dbd4d57-a75b-405a-bc03-338b3a2a721c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7" Apr 22 18:49:31.629592 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:31.629561 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7" Apr 22 18:49:31.744925 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:31.744895 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-htlw7"] Apr 22 18:49:31.747881 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:49:31.747855 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dbd4d57_a75b_405a_bc03_338b3a2a721c.slice/crio-8a327616af343dc8b5c680a8caf8a171a5d716c4aabf6c5f0a86108186d885e9 WatchSource:0}: Error finding container 8a327616af343dc8b5c680a8caf8a171a5d716c4aabf6c5f0a86108186d885e9: Status 404 returned error can't find the container with id 8a327616af343dc8b5c680a8caf8a171a5d716c4aabf6c5f0a86108186d885e9 Apr 22 18:49:32.314122 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:32.314086 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7" 
event={"ID":"2dbd4d57-a75b-405a-bc03-338b3a2a721c","Type":"ContainerStarted","Data":"8a327616af343dc8b5c680a8caf8a171a5d716c4aabf6c5f0a86108186d885e9"} Apr 22 18:49:33.321986 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:33.321874 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7" event={"ID":"2dbd4d57-a75b-405a-bc03-338b3a2a721c","Type":"ContainerStarted","Data":"5d6553d521143ee13e127f379ee60fb8de6357eed53aa1ec89d83971d4a80386"} Apr 22 18:49:33.321986 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:33.321914 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7" event={"ID":"2dbd4d57-a75b-405a-bc03-338b3a2a721c","Type":"ContainerStarted","Data":"9414c57916a32972c41cd0c612c91d2f47963d0e4694fb4bfd637ee02394f007"} Apr 22 18:49:33.337001 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:33.336931 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-htlw7" podStartSLOduration=2.147268404 podStartE2EDuration="3.336917563s" podCreationTimestamp="2026-04-22 18:49:30 +0000 UTC" firstStartedPulling="2026-04-22 18:49:31.749693052 +0000 UTC m=+173.528526521" lastFinishedPulling="2026-04-22 18:49:32.939342213 +0000 UTC m=+174.718175680" observedRunningTime="2026-04-22 18:49:33.336018139 +0000 UTC m=+175.114851624" watchObservedRunningTime="2026-04-22 18:49:33.336917563 +0000 UTC m=+175.115751048" Apr 22 18:49:35.031082 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.031035 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc"] Apr 22 18:49:35.034054 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.034026 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc" Apr 22 18:49:35.036341 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.036323 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 18:49:35.036512 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.036327 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 18:49:35.036666 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.036344 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-8wxdv\"" Apr 22 18:49:35.049633 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.049612 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc"] Apr 22 18:49:35.071152 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.071128 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zlrvw"] Apr 22 18:49:35.073735 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.073717 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.076083 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.076063 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:49:35.076194 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.076132 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:49:35.076265 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.076083 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-sw8gt\"" Apr 22 18:49:35.076526 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.076508 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:49:35.099489 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.099464 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5721f580-f8b3-4921-9cec-388fc89208c3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-g2fbc\" (UID: \"5721f580-f8b3-4921-9cec-388fc89208c3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc" Apr 22 18:49:35.099597 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.099506 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5721f580-f8b3-4921-9cec-388fc89208c3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-g2fbc\" (UID: \"5721f580-f8b3-4921-9cec-388fc89208c3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc" Apr 22 18:49:35.099597 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.099547 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb8lw\" (UniqueName: \"kubernetes.io/projected/5721f580-f8b3-4921-9cec-388fc89208c3-kube-api-access-fb8lw\") pod \"openshift-state-metrics-9d44df66c-g2fbc\" (UID: \"5721f580-f8b3-4921-9cec-388fc89208c3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc" Apr 22 18:49:35.099692 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.099627 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5721f580-f8b3-4921-9cec-388fc89208c3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-g2fbc\" (UID: \"5721f580-f8b3-4921-9cec-388fc89208c3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc" Apr 22 18:49:35.200327 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.200294 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e91496df-a9d7-48a5-a22a-be7cd298ca98-sys\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.200494 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.200357 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5721f580-f8b3-4921-9cec-388fc89208c3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-g2fbc\" (UID: \"5721f580-f8b3-4921-9cec-388fc89208c3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc" Apr 22 18:49:35.200494 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.200429 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" 
(UniqueName: \"kubernetes.io/host-path/e91496df-a9d7-48a5-a22a-be7cd298ca98-root\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.200612 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.200500 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-textfile\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.200612 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.200531 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-tls\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.200612 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.200560 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.200612 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.200600 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e91496df-a9d7-48a5-a22a-be7cd298ca98-metrics-client-ca\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.200809 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:49:35.200643 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75jd5\" (UniqueName: \"kubernetes.io/projected/e91496df-a9d7-48a5-a22a-be7cd298ca98-kube-api-access-75jd5\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.200809 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.200695 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-wtmp\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.200809 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.200738 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5721f580-f8b3-4921-9cec-388fc89208c3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-g2fbc\" (UID: \"5721f580-f8b3-4921-9cec-388fc89208c3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc" Apr 22 18:49:35.200809 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.200767 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5721f580-f8b3-4921-9cec-388fc89208c3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-g2fbc\" (UID: \"5721f580-f8b3-4921-9cec-388fc89208c3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc" Apr 22 18:49:35.200809 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.200797 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-accelerators-collector-config\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.201090 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:49:35.200844 2566 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 22 18:49:35.201090 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:49:35.200907 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5721f580-f8b3-4921-9cec-388fc89208c3-openshift-state-metrics-tls podName:5721f580-f8b3-4921-9cec-388fc89208c3 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:35.700885651 +0000 UTC m=+177.479719117 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/5721f580-f8b3-4921-9cec-388fc89208c3-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-g2fbc" (UID: "5721f580-f8b3-4921-9cec-388fc89208c3") : secret "openshift-state-metrics-tls" not found Apr 22 18:49:35.201090 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.200846 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fb8lw\" (UniqueName: \"kubernetes.io/projected/5721f580-f8b3-4921-9cec-388fc89208c3-kube-api-access-fb8lw\") pod \"openshift-state-metrics-9d44df66c-g2fbc\" (UID: \"5721f580-f8b3-4921-9cec-388fc89208c3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc" Apr 22 18:49:35.201735 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.201716 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5721f580-f8b3-4921-9cec-388fc89208c3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-g2fbc\" (UID: 
\"5721f580-f8b3-4921-9cec-388fc89208c3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc" Apr 22 18:49:35.203093 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.203075 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5721f580-f8b3-4921-9cec-388fc89208c3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-g2fbc\" (UID: \"5721f580-f8b3-4921-9cec-388fc89208c3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc" Apr 22 18:49:35.209979 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.209926 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb8lw\" (UniqueName: \"kubernetes.io/projected/5721f580-f8b3-4921-9cec-388fc89208c3-kube-api-access-fb8lw\") pod \"openshift-state-metrics-9d44df66c-g2fbc\" (UID: \"5721f580-f8b3-4921-9cec-388fc89208c3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc" Apr 22 18:49:35.302491 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.302414 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e91496df-a9d7-48a5-a22a-be7cd298ca98-metrics-client-ca\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.302491 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.302450 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75jd5\" (UniqueName: \"kubernetes.io/projected/e91496df-a9d7-48a5-a22a-be7cd298ca98-kube-api-access-75jd5\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.302491 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.302474 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-wtmp\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.302778 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.302751 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-accelerators-collector-config\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.302901 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.302838 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e91496df-a9d7-48a5-a22a-be7cd298ca98-sys\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.302901 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.302854 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-wtmp\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.302901 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.302886 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e91496df-a9d7-48a5-a22a-be7cd298ca98-root\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.303051 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.302905 
2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e91496df-a9d7-48a5-a22a-be7cd298ca98-sys\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.303051 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.302926 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-textfile\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.303051 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.302957 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e91496df-a9d7-48a5-a22a-be7cd298ca98-root\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.303051 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.302971 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-tls\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.303051 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.303000 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.303284 ip-10-0-130-234 kubenswrapper[2566]: E0422 
18:49:35.303143 2566 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 18:49:35.303284 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.303189 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e91496df-a9d7-48a5-a22a-be7cd298ca98-metrics-client-ca\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.303284 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:49:35.303220 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-tls podName:e91496df-a9d7-48a5-a22a-be7cd298ca98 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:35.803201514 +0000 UTC m=+177.582035002 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-tls") pod "node-exporter-zlrvw" (UID: "e91496df-a9d7-48a5-a22a-be7cd298ca98") : secret "node-exporter-tls" not found Apr 22 18:49:35.303284 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.303270 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-textfile\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw" Apr 22 18:49:35.303559 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.303538 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-accelerators-collector-config\") pod \"node-exporter-zlrvw\" (UID: 
\"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw"
Apr 22 18:49:35.305138 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.305119 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw"
Apr 22 18:49:35.309126 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.309102 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75jd5\" (UniqueName: \"kubernetes.io/projected/e91496df-a9d7-48a5-a22a-be7cd298ca98-kube-api-access-75jd5\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw"
Apr 22 18:49:35.456421 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.456387 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f4b7497f6-zfdtd"]
Apr 22 18:49:35.459680 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.459656 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.467441 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.467122 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 22 18:49:35.473397 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.473372 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f4b7497f6-zfdtd"]
Apr 22 18:49:35.605297 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.605209 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-service-ca\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.605297 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.605260 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca94f287-c422-413d-9356-adf2fb7ef856-console-oauth-config\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.605508 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.605308 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-console-config\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.605508 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.605344 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kvbh\" (UniqueName: \"kubernetes.io/projected/ca94f287-c422-413d-9356-adf2fb7ef856-kube-api-access-8kvbh\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.605508 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.605420 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-trusted-ca-bundle\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.605508 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.605456 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-oauth-serving-cert\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.605663 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.605514 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca94f287-c422-413d-9356-adf2fb7ef856-console-serving-cert\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.706389 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.706354 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5721f580-f8b3-4921-9cec-388fc89208c3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-g2fbc\" (UID: \"5721f580-f8b3-4921-9cec-388fc89208c3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc"
Apr 22 18:49:35.706389 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.706391 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-service-ca\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.706585 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.706418 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca94f287-c422-413d-9356-adf2fb7ef856-console-oauth-config\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.706585 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.706436 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-console-config\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.706585 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.706565 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kvbh\" (UniqueName: \"kubernetes.io/projected/ca94f287-c422-413d-9356-adf2fb7ef856-kube-api-access-8kvbh\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.706703 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.706619 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-trusted-ca-bundle\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.706703 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.706665 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-oauth-serving-cert\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.706796 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.706713 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca94f287-c422-413d-9356-adf2fb7ef856-console-serving-cert\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.707180 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.707151 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-console-config\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.707180 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.707175 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-service-ca\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.707489 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.707468 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-trusted-ca-bundle\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.707574 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.707550 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-oauth-serving-cert\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.709098 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.709079 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca94f287-c422-413d-9356-adf2fb7ef856-console-serving-cert\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.709328 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.709308 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca94f287-c422-413d-9356-adf2fb7ef856-console-oauth-config\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.709382 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.709367 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5721f580-f8b3-4921-9cec-388fc89208c3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-g2fbc\" (UID: \"5721f580-f8b3-4921-9cec-388fc89208c3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc"
Apr 22 18:49:35.714125 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.714102 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kvbh\" (UniqueName: \"kubernetes.io/projected/ca94f287-c422-413d-9356-adf2fb7ef856-kube-api-access-8kvbh\") pod \"console-6f4b7497f6-zfdtd\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.770101 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.770073 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f4b7497f6-zfdtd"
Apr 22 18:49:35.807660 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.807628 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-tls\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw"
Apr 22 18:49:35.809723 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.809702 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e91496df-a9d7-48a5-a22a-be7cd298ca98-node-exporter-tls\") pod \"node-exporter-zlrvw\" (UID: \"e91496df-a9d7-48a5-a22a-be7cd298ca98\") " pod="openshift-monitoring/node-exporter-zlrvw"
Apr 22 18:49:35.890885 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.890854 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f4b7497f6-zfdtd"]
Apr 22 18:49:35.893656 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:49:35.893629 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca94f287_c422_413d_9356_adf2fb7ef856.slice/crio-92dcc458d3727fc5d73c407740b2a9544964baaeca02c3b6437ee69ccabf7493 WatchSource:0}: Error finding container 92dcc458d3727fc5d73c407740b2a9544964baaeca02c3b6437ee69ccabf7493: Status 404 returned error can't find the container with id 92dcc458d3727fc5d73c407740b2a9544964baaeca02c3b6437ee69ccabf7493
Apr 22 18:49:35.946562 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.946543 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc"
Apr 22 18:49:35.985105 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:35.984767 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zlrvw"
Apr 22 18:49:35.997526 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:49:35.994260 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode91496df_a9d7_48a5_a22a_be7cd298ca98.slice/crio-1237f60f8c2bbcf619b498d122d328afff111047cb7dbbb9b23205a368ebda02 WatchSource:0}: Error finding container 1237f60f8c2bbcf619b498d122d328afff111047cb7dbbb9b23205a368ebda02: Status 404 returned error can't find the container with id 1237f60f8c2bbcf619b498d122d328afff111047cb7dbbb9b23205a368ebda02
Apr 22 18:49:36.085810 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.085778 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc"]
Apr 22 18:49:36.090596 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:49:36.090572 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5721f580_f8b3_4921_9cec_388fc89208c3.slice/crio-9167d150efee30a69de28c565309237fcfa21cad7b9583c48bf0026774d08019 WatchSource:0}: Error finding container 9167d150efee30a69de28c565309237fcfa21cad7b9583c48bf0026774d08019: Status 404 returned error can't find the container with id 9167d150efee30a69de28c565309237fcfa21cad7b9583c48bf0026774d08019
Apr 22 18:49:36.134585 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.134560 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:49:36.137283 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.137188 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.139374 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.139354 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 18:49:36.140058 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.139677 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 18:49:36.140058 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.139802 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 18:49:36.140058 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.139932 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 18:49:36.140261 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.140119 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 18:49:36.141421 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.140867 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 18:49:36.141421 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.140927 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 18:49:36.141421 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.141022 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 18:49:36.141421 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.141205 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-9wsbq\""
Apr 22 18:49:36.141421 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.141213 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 18:49:36.159731 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.159105 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:49:36.210688 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.210133 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8bc4df07-ae0b-4182-b241-870e0ed1be3d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.210688 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.210178 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bc4df07-ae0b-4182-b241-870e0ed1be3d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.210688 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.210210 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.210688 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.210248 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-config-volume\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.210688 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.210279 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8bc4df07-ae0b-4182-b241-870e0ed1be3d-config-out\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.210688 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.210302 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-web-config\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.210688 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.210332 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8bc4df07-ae0b-4182-b241-870e0ed1be3d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.210688 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.210366 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.210688 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.210405 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cgb4\" (UniqueName: \"kubernetes.io/projected/8bc4df07-ae0b-4182-b241-870e0ed1be3d-kube-api-access-4cgb4\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.210688 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.210442 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.210688 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.210470 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8bc4df07-ae0b-4182-b241-870e0ed1be3d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.210688 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.210491 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.210688 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.210536 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.311325 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.311286 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.311474 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.311353 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8bc4df07-ae0b-4182-b241-870e0ed1be3d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.311474 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.311379 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bc4df07-ae0b-4182-b241-870e0ed1be3d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.311474 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.311408 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.311474 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.311445 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-config-volume\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.311686 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.311477 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8bc4df07-ae0b-4182-b241-870e0ed1be3d-config-out\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.311686 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.311501 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-web-config\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.311686 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.311535 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8bc4df07-ae0b-4182-b241-870e0ed1be3d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.311686 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.311572 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.311686 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.311614 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cgb4\" (UniqueName: \"kubernetes.io/projected/8bc4df07-ae0b-4182-b241-870e0ed1be3d-kube-api-access-4cgb4\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.311686 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.311656 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.311686 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.311686 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8bc4df07-ae0b-4182-b241-870e0ed1be3d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.312075 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.311709 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.312075 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.311848 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8bc4df07-ae0b-4182-b241-870e0ed1be3d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.312920 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.312595 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8bc4df07-ae0b-4182-b241-870e0ed1be3d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.313059 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.312917 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bc4df07-ae0b-4182-b241-870e0ed1be3d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.314459 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.314431 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.314722 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.314688 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.316144 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.316103 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.316232 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.316203 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8bc4df07-ae0b-4182-b241-870e0ed1be3d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.316394 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.316369 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.316608 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.316589 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.316672 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.316603 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-web-config\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.316956 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.316916 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-config-volume\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.317671 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.317650 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8bc4df07-ae0b-4182-b241-870e0ed1be3d-config-out\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.319775 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.319754 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cgb4\" (UniqueName: \"kubernetes.io/projected/8bc4df07-ae0b-4182-b241-870e0ed1be3d-kube-api-access-4cgb4\") pod \"alertmanager-main-0\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.333600 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.333565 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc" event={"ID":"5721f580-f8b3-4921-9cec-388fc89208c3","Type":"ContainerStarted","Data":"4d3cc5a1eaf2bb5a41de280064005004144df0593db4750a322b782132edd6e4"}
Apr 22 18:49:36.333600 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.333600 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc" event={"ID":"5721f580-f8b3-4921-9cec-388fc89208c3","Type":"ContainerStarted","Data":"b5f53d678d3732c5594b9890aaaac9b94973027feaa332a922bc9326e6b85476"}
Apr 22 18:49:36.333740 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.333610 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc" event={"ID":"5721f580-f8b3-4921-9cec-388fc89208c3","Type":"ContainerStarted","Data":"9167d150efee30a69de28c565309237fcfa21cad7b9583c48bf0026774d08019"}
Apr 22 18:49:36.334550 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.334524 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zlrvw" event={"ID":"e91496df-a9d7-48a5-a22a-be7cd298ca98","Type":"ContainerStarted","Data":"1237f60f8c2bbcf619b498d122d328afff111047cb7dbbb9b23205a368ebda02"}
Apr 22 18:49:36.335715 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.335695 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f4b7497f6-zfdtd" event={"ID":"ca94f287-c422-413d-9356-adf2fb7ef856","Type":"ContainerStarted","Data":"8bec8fddd4600ac148dbb3cfe5507a836dfe6b1f904a051b52405d2afcd8787d"}
Apr 22 18:49:36.335787 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.335722 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f4b7497f6-zfdtd" event={"ID":"ca94f287-c422-413d-9356-adf2fb7ef856","Type":"ContainerStarted","Data":"92dcc458d3727fc5d73c407740b2a9544964baaeca02c3b6437ee69ccabf7493"}
Apr 22 18:49:36.350238 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.350202 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f4b7497f6-zfdtd" podStartSLOduration=1.350190438 podStartE2EDuration="1.350190438s" podCreationTimestamp="2026-04-22 18:49:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:49:36.349177123 +0000 UTC m=+178.128010610" watchObservedRunningTime="2026-04-22 18:49:36.350190438 +0000 UTC m=+178.129023924"
Apr 22 18:49:36.463515 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.463431 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:49:36.607698 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.607642 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:49:36.612692 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:49:36.612656 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bc4df07_ae0b_4182_b241_870e0ed1be3d.slice/crio-66f7d9bbb9ef33a505a383980d2366b41696088f20608617369d2dffd3ae6aac WatchSource:0}: Error finding container 66f7d9bbb9ef33a505a383980d2366b41696088f20608617369d2dffd3ae6aac: Status 404 returned error can't find the container with id 66f7d9bbb9ef33a505a383980d2366b41696088f20608617369d2dffd3ae6aac
Apr 22 18:49:36.996592 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.996545 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:36.996761 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:36.996602 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:37.002695 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:37.002605 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:37.341310 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:37.341230 2566 generic.go:358] "Generic (PLEG): container finished" podID="e91496df-a9d7-48a5-a22a-be7cd298ca98" containerID="1708ff54e2f653a5ce5d11c049170d730e94b3e56ac1376d10f4642d6800152e" exitCode=0
Apr 22 18:49:37.341720 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:37.341326 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zlrvw" event={"ID":"e91496df-a9d7-48a5-a22a-be7cd298ca98","Type":"ContainerDied","Data":"1708ff54e2f653a5ce5d11c049170d730e94b3e56ac1376d10f4642d6800152e"}
Apr 22 18:49:37.342635 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:37.342593 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bc4df07-ae0b-4182-b241-870e0ed1be3d","Type":"ContainerStarted","Data":"66f7d9bbb9ef33a505a383980d2366b41696088f20608617369d2dffd3ae6aac"}
Apr 22 18:49:37.344537 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:37.344511 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc" event={"ID":"5721f580-f8b3-4921-9cec-388fc89208c3","Type":"ContainerStarted","Data":"c8ccfb288617e30d7481fe1614e2daa758cd6d13181ced720a6bbaf33547c94b"}
Apr 22 18:49:37.349693 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:37.349591 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d8dcd44f7-tw2f4"
Apr 22 18:49:37.400252 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:37.400200 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-g2fbc" podStartSLOduration=1.556138362 podStartE2EDuration="2.400182566s" podCreationTimestamp="2026-04-22 18:49:35 +0000 UTC" firstStartedPulling="2026-04-22 18:49:36.227542763 +0000 UTC m=+178.006376228" lastFinishedPulling="2026-04-22 18:49:37.071586957 +0000 UTC m=+178.850420432" observedRunningTime="2026-04-22 18:49:37.398769047 +0000 UTC m=+179.177602536" watchObservedRunningTime="2026-04-22 18:49:37.400182566 +0000 UTC m=+179.179016053"
Apr 22 18:49:38.349369 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:38.349333 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zlrvw"
event={"ID":"e91496df-a9d7-48a5-a22a-be7cd298ca98","Type":"ContainerStarted","Data":"f4cad18c45f4a0663b03efdfe760685532542f03436bf5c75c01586dc417f50a"} Apr 22 18:49:38.349369 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:38.349369 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zlrvw" event={"ID":"e91496df-a9d7-48a5-a22a-be7cd298ca98","Type":"ContainerStarted","Data":"d38f67ee95255d83ecc32f8e0339fd6e3a0ff2b261b3dbd4457a7bfc8259330d"} Apr 22 18:49:38.350661 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:38.350634 2566 generic.go:358] "Generic (PLEG): container finished" podID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerID="184dc1a7f73ea9bb3e15c8c8582ee84074bb9a5263546bc73698ff4696d68544" exitCode=0 Apr 22 18:49:38.350763 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:38.350722 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bc4df07-ae0b-4182-b241-870e0ed1be3d","Type":"ContainerDied","Data":"184dc1a7f73ea9bb3e15c8c8582ee84074bb9a5263546bc73698ff4696d68544"} Apr 22 18:49:38.370863 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:38.370819 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zlrvw" podStartSLOduration=2.324035652 podStartE2EDuration="3.370809256s" podCreationTimestamp="2026-04-22 18:49:35 +0000 UTC" firstStartedPulling="2026-04-22 18:49:35.997114897 +0000 UTC m=+177.775948366" lastFinishedPulling="2026-04-22 18:49:37.043888503 +0000 UTC m=+178.822721970" observedRunningTime="2026-04-22 18:49:38.370460028 +0000 UTC m=+180.149293513" watchObservedRunningTime="2026-04-22 18:49:38.370809256 +0000 UTC m=+180.149642742" Apr 22 18:49:39.632515 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:39.632475 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-77bfc79668-44qjh"] Apr 22 18:49:39.828174 ip-10-0-130-234 kubenswrapper[2566]: 
I0422 18:49:39.828151 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-vmtgp"] Apr 22 18:49:39.832258 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:39.832240 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmtgp" Apr 22 18:49:39.834316 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:39.834302 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 18:49:39.834398 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:39.834306 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-4x9zf\"" Apr 22 18:49:39.837560 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:39.837245 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-vmtgp"] Apr 22 18:49:39.945322 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:39.945284 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f01825ba-3046-4f60-9037-77b19a52845f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-vmtgp\" (UID: \"f01825ba-3046-4f60-9037-77b19a52845f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmtgp" Apr 22 18:49:40.046011 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:40.045989 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f01825ba-3046-4f60-9037-77b19a52845f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-vmtgp\" (UID: \"f01825ba-3046-4f60-9037-77b19a52845f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmtgp" Apr 22 18:49:40.048368 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:40.048342 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f01825ba-3046-4f60-9037-77b19a52845f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-vmtgp\" (UID: \"f01825ba-3046-4f60-9037-77b19a52845f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmtgp" Apr 22 18:49:40.162693 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:40.162604 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmtgp" Apr 22 18:49:40.291790 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:40.291740 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-vmtgp"] Apr 22 18:49:40.293986 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:49:40.293961 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf01825ba_3046_4f60_9037_77b19a52845f.slice/crio-6dff0343fab3abda020cab466d17c5bfcf2b7cdfaf269c10375d4bd78ad2892a WatchSource:0}: Error finding container 6dff0343fab3abda020cab466d17c5bfcf2b7cdfaf269c10375d4bd78ad2892a: Status 404 returned error can't find the container with id 6dff0343fab3abda020cab466d17c5bfcf2b7cdfaf269c10375d4bd78ad2892a Apr 22 18:49:40.359743 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:40.359715 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bc4df07-ae0b-4182-b241-870e0ed1be3d","Type":"ContainerStarted","Data":"255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf"} Apr 22 18:49:40.359838 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:40.359751 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bc4df07-ae0b-4182-b241-870e0ed1be3d","Type":"ContainerStarted","Data":"faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e"} 
Apr 22 18:49:40.359838 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:40.359763 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bc4df07-ae0b-4182-b241-870e0ed1be3d","Type":"ContainerStarted","Data":"6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232"} Apr 22 18:49:40.359838 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:40.359773 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bc4df07-ae0b-4182-b241-870e0ed1be3d","Type":"ContainerStarted","Data":"f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762"} Apr 22 18:49:40.359838 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:40.359785 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bc4df07-ae0b-4182-b241-870e0ed1be3d","Type":"ContainerStarted","Data":"f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1"} Apr 22 18:49:40.360783 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:40.360760 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmtgp" event={"ID":"f01825ba-3046-4f60-9037-77b19a52845f","Type":"ContainerStarted","Data":"6dff0343fab3abda020cab466d17c5bfcf2b7cdfaf269c10375d4bd78ad2892a"} Apr 22 18:49:41.371082 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:41.371042 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bc4df07-ae0b-4182-b241-870e0ed1be3d","Type":"ContainerStarted","Data":"cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df"} Apr 22 18:49:41.395968 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:41.395906 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.285542955 podStartE2EDuration="5.39588761s" podCreationTimestamp="2026-04-22 
18:49:36 +0000 UTC" firstStartedPulling="2026-04-22 18:49:36.615548817 +0000 UTC m=+178.394382284" lastFinishedPulling="2026-04-22 18:49:40.725893472 +0000 UTC m=+182.504726939" observedRunningTime="2026-04-22 18:49:41.393579191 +0000 UTC m=+183.172412692" watchObservedRunningTime="2026-04-22 18:49:41.39588761 +0000 UTC m=+183.174721097" Apr 22 18:49:42.375641 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.375609 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmtgp" event={"ID":"f01825ba-3046-4f60-9037-77b19a52845f","Type":"ContainerStarted","Data":"da414d709c3032233f393b4f938a9c44f269b74673eb341f1249791121c32361"} Apr 22 18:49:42.376062 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.375739 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmtgp" Apr 22 18:49:42.380576 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.380556 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmtgp" Apr 22 18:49:42.389802 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.389768 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmtgp" podStartSLOduration=1.861365481 podStartE2EDuration="3.389757018s" podCreationTimestamp="2026-04-22 18:49:39 +0000 UTC" firstStartedPulling="2026-04-22 18:49:40.296753337 +0000 UTC m=+182.075586801" lastFinishedPulling="2026-04-22 18:49:41.82514487 +0000 UTC m=+183.603978338" observedRunningTime="2026-04-22 18:49:42.388720345 +0000 UTC m=+184.167553822" watchObservedRunningTime="2026-04-22 18:49:42.389757018 +0000 UTC m=+184.168590503" Apr 22 18:49:42.473627 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.473600 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f4b7497f6-zfdtd"] Apr 22 18:49:42.498613 
ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.498580 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69cd7898c7-tcsv8"] Apr 22 18:49:42.501145 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.501122 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.511497 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.511476 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69cd7898c7-tcsv8"] Apr 22 18:49:42.670844 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.670755 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-service-ca\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.670844 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.670796 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxwh5\" (UniqueName: \"kubernetes.io/projected/9a608b0d-a014-4e5f-9e96-4331e842034d-kube-api-access-hxwh5\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.670844 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.670819 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-oauth-serving-cert\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.671157 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.670964 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a608b0d-a014-4e5f-9e96-4331e842034d-console-oauth-config\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.671157 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.671052 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a608b0d-a014-4e5f-9e96-4331e842034d-console-serving-cert\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.671157 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.671093 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-trusted-ca-bundle\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.671157 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.671124 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-console-config\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.772189 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.772159 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a608b0d-a014-4e5f-9e96-4331e842034d-console-oauth-config\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " 
pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.772354 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.772207 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a608b0d-a014-4e5f-9e96-4331e842034d-console-serving-cert\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.772354 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.772230 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-trusted-ca-bundle\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.772354 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.772246 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-console-config\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.772354 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.772285 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-service-ca\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.772354 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.772305 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxwh5\" (UniqueName: \"kubernetes.io/projected/9a608b0d-a014-4e5f-9e96-4331e842034d-kube-api-access-hxwh5\") pod 
\"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.772354 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.772323 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-oauth-serving-cert\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.773060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.773034 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-service-ca\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.773204 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.773180 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-oauth-serving-cert\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.773204 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.773194 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-trusted-ca-bundle\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.773325 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.773297 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-console-config\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.775245 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.775221 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a608b0d-a014-4e5f-9e96-4331e842034d-console-serving-cert\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.775353 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.775334 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a608b0d-a014-4e5f-9e96-4331e842034d-console-oauth-config\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.779531 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.779509 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxwh5\" (UniqueName: \"kubernetes.io/projected/9a608b0d-a014-4e5f-9e96-4331e842034d-kube-api-access-hxwh5\") pod \"console-69cd7898c7-tcsv8\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.810412 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.810385 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:42.924739 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:42.924717 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69cd7898c7-tcsv8"] Apr 22 18:49:42.927013 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:49:42.926989 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a608b0d_a014_4e5f_9e96_4331e842034d.slice/crio-a24e17f6f9e7b092b4287aa39a402657aace7e604f218608daa371218876ea5a WatchSource:0}: Error finding container a24e17f6f9e7b092b4287aa39a402657aace7e604f218608daa371218876ea5a: Status 404 returned error can't find the container with id a24e17f6f9e7b092b4287aa39a402657aace7e604f218608daa371218876ea5a Apr 22 18:49:43.379950 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:43.379908 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69cd7898c7-tcsv8" event={"ID":"9a608b0d-a014-4e5f-9e96-4331e842034d","Type":"ContainerStarted","Data":"6f1cddff386ec0ac2bc325aa8f336bbc4cd8eb506334a1e7753e658ced61d409"} Apr 22 18:49:43.380335 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:43.379959 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69cd7898c7-tcsv8" event={"ID":"9a608b0d-a014-4e5f-9e96-4331e842034d","Type":"ContainerStarted","Data":"a24e17f6f9e7b092b4287aa39a402657aace7e604f218608daa371218876ea5a"} Apr 22 18:49:43.395925 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:43.395881 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69cd7898c7-tcsv8" podStartSLOduration=1.3958688750000001 podStartE2EDuration="1.395868875s" podCreationTimestamp="2026-04-22 18:49:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:49:43.394761736 +0000 UTC 
m=+185.173595222" watchObservedRunningTime="2026-04-22 18:49:43.395868875 +0000 UTC m=+185.174702360" Apr 22 18:49:45.771142 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:45.771090 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6f4b7497f6-zfdtd" Apr 22 18:49:47.337650 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.337610 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69cd7898c7-tcsv8"] Apr 22 18:49:47.365741 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.365712 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-84f97f657c-wsmcp"] Apr 22 18:49:47.368713 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.368696 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.378462 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.378433 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84f97f657c-wsmcp"] Apr 22 18:49:47.516158 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.516115 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-oauth-serving-cert\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.516158 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.516156 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-serving-cert\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.516377 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:49:47.516239 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-service-ca\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.516377 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.516283 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-trusted-ca-bundle\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.516377 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.516335 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-config\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.516377 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.516356 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-oauth-config\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.516537 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.516385 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh7qj\" (UniqueName: \"kubernetes.io/projected/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-kube-api-access-lh7qj\") pod 
\"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.617040 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.617006 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-oauth-serving-cert\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.617243 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.617050 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-serving-cert\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.617243 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.617086 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-service-ca\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.617243 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.617108 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-trusted-ca-bundle\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.617243 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.617156 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-config\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.617243 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.617187 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-oauth-config\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.617243 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.617222 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lh7qj\" (UniqueName: \"kubernetes.io/projected/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-kube-api-access-lh7qj\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.618002 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.617974 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-service-ca\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.618002 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.617990 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-config\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.618178 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.618069 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-trusted-ca-bundle\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.618389 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.618365 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-oauth-serving-cert\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.619708 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.619676 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-oauth-config\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.619811 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.619795 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-serving-cert\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.624195 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.624175 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh7qj\" (UniqueName: \"kubernetes.io/projected/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-kube-api-access-lh7qj\") pod \"console-84f97f657c-wsmcp\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.678440 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.678402 2566 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:47.794088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:47.794063 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84f97f657c-wsmcp"] Apr 22 18:49:47.796843 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:49:47.796812 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2d9a9c5_0b0b_4cfa_a570_c855f684acfb.slice/crio-67341fe1c076a810782ee5300be862f9ad432742808642dda2ea4d6f95620a9b WatchSource:0}: Error finding container 67341fe1c076a810782ee5300be862f9ad432742808642dda2ea4d6f95620a9b: Status 404 returned error can't find the container with id 67341fe1c076a810782ee5300be862f9ad432742808642dda2ea4d6f95620a9b Apr 22 18:49:48.396600 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:48.396562 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84f97f657c-wsmcp" event={"ID":"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb","Type":"ContainerStarted","Data":"521fef34ecd1518206a611f9e60272cba8c1cef6325854f278c5e9a0c4e3a40d"} Apr 22 18:49:48.396968 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:48.396608 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84f97f657c-wsmcp" event={"ID":"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb","Type":"ContainerStarted","Data":"67341fe1c076a810782ee5300be862f9ad432742808642dda2ea4d6f95620a9b"} Apr 22 18:49:48.419268 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:48.419219 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84f97f657c-wsmcp" podStartSLOduration=1.419205321 podStartE2EDuration="1.419205321s" podCreationTimestamp="2026-04-22 18:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 
18:49:48.417968434 +0000 UTC m=+190.196801921" watchObservedRunningTime="2026-04-22 18:49:48.419205321 +0000 UTC m=+190.198038808" Apr 22 18:49:49.637986 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:49.637958 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-77bfc79668-44qjh" Apr 22 18:49:52.810958 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:52.810913 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:49:57.679116 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:57.679069 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:57.679116 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:57.679122 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:57.683557 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:57.683529 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:58.430441 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:58.430418 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:49:58.473340 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:49:58.473305 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d8dcd44f7-tw2f4"] Apr 22 18:50:04.652797 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.652736 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-77bfc79668-44qjh" podUID="5ceaf6dd-7ffb-4393-9076-3f02c3b84983" containerName="registry" containerID="cri-o://f2b43609f8fc69868b3d2568688e0b1a76d8d2d8707b39d0b00ebb6f2ffd2362" gracePeriod=30 Apr 22 18:50:04.909311 
ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.909251 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77bfc79668-44qjh" Apr 22 18:50:04.965638 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.965612 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-certificates\") pod \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " Apr 22 18:50:04.965805 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.965666 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-bound-sa-token\") pod \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " Apr 22 18:50:04.965805 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.965703 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls\") pod \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " Apr 22 18:50:04.965805 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.965761 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-installation-pull-secrets\") pod \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " Apr 22 18:50:04.965805 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.965787 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-trusted-ca\") pod 
\"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " Apr 22 18:50:04.966037 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.965989 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-ca-trust-extracted\") pod \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " Apr 22 18:50:04.966092 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.966045 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4glwh\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-kube-api-access-4glwh\") pod \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " Apr 22 18:50:04.966092 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.966079 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-image-registry-private-configuration\") pod \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\" (UID: \"5ceaf6dd-7ffb-4393-9076-3f02c3b84983\") " Apr 22 18:50:04.966254 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.966223 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5ceaf6dd-7ffb-4393-9076-3f02c3b84983" (UID: "5ceaf6dd-7ffb-4393-9076-3f02c3b84983"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:04.966316 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.966300 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5ceaf6dd-7ffb-4393-9076-3f02c3b84983" (UID: "5ceaf6dd-7ffb-4393-9076-3f02c3b84983"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:04.966466 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.966445 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-trusted-ca\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:04.966532 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.966473 2566 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-certificates\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:04.968415 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.968361 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5ceaf6dd-7ffb-4393-9076-3f02c3b84983" (UID: "5ceaf6dd-7ffb-4393-9076-3f02c3b84983"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:04.968548 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.968499 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-kube-api-access-4glwh" (OuterVolumeSpecName: "kube-api-access-4glwh") pod "5ceaf6dd-7ffb-4393-9076-3f02c3b84983" (UID: "5ceaf6dd-7ffb-4393-9076-3f02c3b84983"). InnerVolumeSpecName "kube-api-access-4glwh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:50:04.968840 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.968810 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5ceaf6dd-7ffb-4393-9076-3f02c3b84983" (UID: "5ceaf6dd-7ffb-4393-9076-3f02c3b84983"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:50:04.968970 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.968859 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5ceaf6dd-7ffb-4393-9076-3f02c3b84983" (UID: "5ceaf6dd-7ffb-4393-9076-3f02c3b84983"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:50:04.969080 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.969062 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5ceaf6dd-7ffb-4393-9076-3f02c3b84983" (UID: "5ceaf6dd-7ffb-4393-9076-3f02c3b84983"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:04.975208 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:04.975181 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5ceaf6dd-7ffb-4393-9076-3f02c3b84983" (UID: "5ceaf6dd-7ffb-4393-9076-3f02c3b84983"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:50:05.067593 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:05.067557 2566 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-installation-pull-secrets\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:05.067593 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:05.067590 2566 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-ca-trust-extracted\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:05.067593 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:05.067599 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4glwh\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-kube-api-access-4glwh\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:05.067796 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:05.067609 2566 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-image-registry-private-configuration\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:05.067796 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:05.067619 2566 reconciler_common.go:299] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-bound-sa-token\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:05.067796 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:05.067629 2566 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ceaf6dd-7ffb-4393-9076-3f02c3b84983-registry-tls\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:05.446951 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:05.446900 2566 generic.go:358] "Generic (PLEG): container finished" podID="5ceaf6dd-7ffb-4393-9076-3f02c3b84983" containerID="f2b43609f8fc69868b3d2568688e0b1a76d8d2d8707b39d0b00ebb6f2ffd2362" exitCode=0 Apr 22 18:50:05.447110 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:05.446987 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77bfc79668-44qjh" event={"ID":"5ceaf6dd-7ffb-4393-9076-3f02c3b84983","Type":"ContainerDied","Data":"f2b43609f8fc69868b3d2568688e0b1a76d8d2d8707b39d0b00ebb6f2ffd2362"} Apr 22 18:50:05.447110 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:05.447010 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-77bfc79668-44qjh" Apr 22 18:50:05.447110 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:05.447026 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77bfc79668-44qjh" event={"ID":"5ceaf6dd-7ffb-4393-9076-3f02c3b84983","Type":"ContainerDied","Data":"880fa51011a722552a36e200047e2a5c81b5848d5bc6b16c15ff7319e534290a"} Apr 22 18:50:05.447110 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:05.447042 2566 scope.go:117] "RemoveContainer" containerID="f2b43609f8fc69868b3d2568688e0b1a76d8d2d8707b39d0b00ebb6f2ffd2362" Apr 22 18:50:05.455515 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:05.455496 2566 scope.go:117] "RemoveContainer" containerID="f2b43609f8fc69868b3d2568688e0b1a76d8d2d8707b39d0b00ebb6f2ffd2362" Apr 22 18:50:05.455735 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:50:05.455717 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b43609f8fc69868b3d2568688e0b1a76d8d2d8707b39d0b00ebb6f2ffd2362\": container with ID starting with f2b43609f8fc69868b3d2568688e0b1a76d8d2d8707b39d0b00ebb6f2ffd2362 not found: ID does not exist" containerID="f2b43609f8fc69868b3d2568688e0b1a76d8d2d8707b39d0b00ebb6f2ffd2362" Apr 22 18:50:05.455797 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:05.455740 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b43609f8fc69868b3d2568688e0b1a76d8d2d8707b39d0b00ebb6f2ffd2362"} err="failed to get container status \"f2b43609f8fc69868b3d2568688e0b1a76d8d2d8707b39d0b00ebb6f2ffd2362\": rpc error: code = NotFound desc = could not find container \"f2b43609f8fc69868b3d2568688e0b1a76d8d2d8707b39d0b00ebb6f2ffd2362\": container with ID starting with f2b43609f8fc69868b3d2568688e0b1a76d8d2d8707b39d0b00ebb6f2ffd2362 not found: ID does not exist" Apr 22 18:50:05.467076 ip-10-0-130-234 kubenswrapper[2566]: I0422 
18:50:05.467052 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-77bfc79668-44qjh"] Apr 22 18:50:05.469835 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:05.469816 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-77bfc79668-44qjh"] Apr 22 18:50:06.451276 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:06.451203 2566 generic.go:358] "Generic (PLEG): container finished" podID="81a18479-876e-4704-9f10-9b3d459cce34" containerID="8ef155b65a11c6ca9d43197b668ceba330e678c45e78f703328ca3aaddabf75f" exitCode=0 Apr 22 18:50:06.451659 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:06.451282 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zvgcc" event={"ID":"81a18479-876e-4704-9f10-9b3d459cce34","Type":"ContainerDied","Data":"8ef155b65a11c6ca9d43197b668ceba330e678c45e78f703328ca3aaddabf75f"} Apr 22 18:50:06.451659 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:06.451648 2566 scope.go:117] "RemoveContainer" containerID="8ef155b65a11c6ca9d43197b668ceba330e678c45e78f703328ca3aaddabf75f" Apr 22 18:50:06.830974 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:06.830880 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ceaf6dd-7ffb-4393-9076-3f02c3b84983" path="/var/lib/kubelet/pods/5ceaf6dd-7ffb-4393-9076-3f02c3b84983/volumes" Apr 22 18:50:07.456314 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.456280 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zvgcc" event={"ID":"81a18479-876e-4704-9f10-9b3d459cce34","Type":"ContainerStarted","Data":"ec2c7f6a93bfd5dc1aa8123bcf739c69d826e9f5f94ab2f91525435819dc1d42"} Apr 22 18:50:07.492019 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.491985 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6f4b7497f6-zfdtd" 
podUID="ca94f287-c422-413d-9356-adf2fb7ef856" containerName="console" containerID="cri-o://8bec8fddd4600ac148dbb3cfe5507a836dfe6b1f904a051b52405d2afcd8787d" gracePeriod=15 Apr 22 18:50:07.741131 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.741111 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f4b7497f6-zfdtd_ca94f287-c422-413d-9356-adf2fb7ef856/console/0.log" Apr 22 18:50:07.741241 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.741167 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f4b7497f6-zfdtd" Apr 22 18:50:07.795511 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.795476 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-oauth-serving-cert\") pod \"ca94f287-c422-413d-9356-adf2fb7ef856\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " Apr 22 18:50:07.795511 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.795515 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-service-ca\") pod \"ca94f287-c422-413d-9356-adf2fb7ef856\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " Apr 22 18:50:07.795727 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.795542 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca94f287-c422-413d-9356-adf2fb7ef856-console-oauth-config\") pod \"ca94f287-c422-413d-9356-adf2fb7ef856\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " Apr 22 18:50:07.795727 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.795563 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-trusted-ca-bundle\") pod \"ca94f287-c422-413d-9356-adf2fb7ef856\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " Apr 22 18:50:07.795727 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.795597 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-console-config\") pod \"ca94f287-c422-413d-9356-adf2fb7ef856\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " Apr 22 18:50:07.795727 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.795620 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca94f287-c422-413d-9356-adf2fb7ef856-console-serving-cert\") pod \"ca94f287-c422-413d-9356-adf2fb7ef856\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " Apr 22 18:50:07.795727 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.795654 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kvbh\" (UniqueName: \"kubernetes.io/projected/ca94f287-c422-413d-9356-adf2fb7ef856-kube-api-access-8kvbh\") pod \"ca94f287-c422-413d-9356-adf2fb7ef856\" (UID: \"ca94f287-c422-413d-9356-adf2fb7ef856\") " Apr 22 18:50:07.796094 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.796030 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ca94f287-c422-413d-9356-adf2fb7ef856" (UID: "ca94f287-c422-413d-9356-adf2fb7ef856"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:07.796233 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.796144 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-console-config" (OuterVolumeSpecName: "console-config") pod "ca94f287-c422-413d-9356-adf2fb7ef856" (UID: "ca94f287-c422-413d-9356-adf2fb7ef856"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:07.796289 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.796222 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-service-ca" (OuterVolumeSpecName: "service-ca") pod "ca94f287-c422-413d-9356-adf2fb7ef856" (UID: "ca94f287-c422-413d-9356-adf2fb7ef856"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:07.796289 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.796256 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ca94f287-c422-413d-9356-adf2fb7ef856" (UID: "ca94f287-c422-413d-9356-adf2fb7ef856"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:07.798108 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.798083 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca94f287-c422-413d-9356-adf2fb7ef856-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ca94f287-c422-413d-9356-adf2fb7ef856" (UID: "ca94f287-c422-413d-9356-adf2fb7ef856"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:07.798211 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.798132 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca94f287-c422-413d-9356-adf2fb7ef856-kube-api-access-8kvbh" (OuterVolumeSpecName: "kube-api-access-8kvbh") pod "ca94f287-c422-413d-9356-adf2fb7ef856" (UID: "ca94f287-c422-413d-9356-adf2fb7ef856"). InnerVolumeSpecName "kube-api-access-8kvbh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:50:07.798211 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.798146 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca94f287-c422-413d-9356-adf2fb7ef856-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ca94f287-c422-413d-9356-adf2fb7ef856" (UID: "ca94f287-c422-413d-9356-adf2fb7ef856"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:07.896462 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.896434 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-service-ca\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:07.896462 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.896456 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca94f287-c422-413d-9356-adf2fb7ef856-console-oauth-config\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:07.896462 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.896466 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-trusted-ca-bundle\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:07.896665 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:50:07.896475 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-console-config\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:07.896665 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.896492 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca94f287-c422-413d-9356-adf2fb7ef856-console-serving-cert\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:07.896665 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.896501 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8kvbh\" (UniqueName: \"kubernetes.io/projected/ca94f287-c422-413d-9356-adf2fb7ef856-kube-api-access-8kvbh\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:07.896665 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:07.896511 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca94f287-c422-413d-9356-adf2fb7ef856-oauth-serving-cert\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:08.460207 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:08.460181 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f4b7497f6-zfdtd_ca94f287-c422-413d-9356-adf2fb7ef856/console/0.log" Apr 22 18:50:08.460671 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:08.460219 2566 generic.go:358] "Generic (PLEG): container finished" podID="ca94f287-c422-413d-9356-adf2fb7ef856" containerID="8bec8fddd4600ac148dbb3cfe5507a836dfe6b1f904a051b52405d2afcd8787d" exitCode=2 Apr 22 18:50:08.460671 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:08.460251 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f4b7497f6-zfdtd" 
event={"ID":"ca94f287-c422-413d-9356-adf2fb7ef856","Type":"ContainerDied","Data":"8bec8fddd4600ac148dbb3cfe5507a836dfe6b1f904a051b52405d2afcd8787d"} Apr 22 18:50:08.460671 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:08.460291 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f4b7497f6-zfdtd" event={"ID":"ca94f287-c422-413d-9356-adf2fb7ef856","Type":"ContainerDied","Data":"92dcc458d3727fc5d73c407740b2a9544964baaeca02c3b6437ee69ccabf7493"} Apr 22 18:50:08.460671 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:08.460294 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f4b7497f6-zfdtd" Apr 22 18:50:08.460671 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:08.460306 2566 scope.go:117] "RemoveContainer" containerID="8bec8fddd4600ac148dbb3cfe5507a836dfe6b1f904a051b52405d2afcd8787d" Apr 22 18:50:08.469800 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:08.469695 2566 scope.go:117] "RemoveContainer" containerID="8bec8fddd4600ac148dbb3cfe5507a836dfe6b1f904a051b52405d2afcd8787d" Apr 22 18:50:08.472541 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:50:08.472514 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bec8fddd4600ac148dbb3cfe5507a836dfe6b1f904a051b52405d2afcd8787d\": container with ID starting with 8bec8fddd4600ac148dbb3cfe5507a836dfe6b1f904a051b52405d2afcd8787d not found: ID does not exist" containerID="8bec8fddd4600ac148dbb3cfe5507a836dfe6b1f904a051b52405d2afcd8787d" Apr 22 18:50:08.472652 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:08.472548 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bec8fddd4600ac148dbb3cfe5507a836dfe6b1f904a051b52405d2afcd8787d"} err="failed to get container status \"8bec8fddd4600ac148dbb3cfe5507a836dfe6b1f904a051b52405d2afcd8787d\": rpc error: code = NotFound desc = could not find container 
\"8bec8fddd4600ac148dbb3cfe5507a836dfe6b1f904a051b52405d2afcd8787d\": container with ID starting with 8bec8fddd4600ac148dbb3cfe5507a836dfe6b1f904a051b52405d2afcd8787d not found: ID does not exist" Apr 22 18:50:08.481511 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:08.481489 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f4b7497f6-zfdtd"] Apr 22 18:50:08.484638 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:08.484619 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6f4b7497f6-zfdtd"] Apr 22 18:50:08.828893 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:08.828818 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca94f287-c422-413d-9356-adf2fb7ef856" path="/var/lib/kubelet/pods/ca94f287-c422-413d-9356-adf2fb7ef856/volumes" Apr 22 18:50:12.361083 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.361048 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-69cd7898c7-tcsv8" podUID="9a608b0d-a014-4e5f-9e96-4331e842034d" containerName="console" containerID="cri-o://6f1cddff386ec0ac2bc325aa8f336bbc4cd8eb506334a1e7753e658ced61d409" gracePeriod=15 Apr 22 18:50:12.614012 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.613957 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69cd7898c7-tcsv8_9a608b0d-a014-4e5f-9e96-4331e842034d/console/0.log" Apr 22 18:50:12.614118 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.614017 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:50:12.736834 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.736802 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a608b0d-a014-4e5f-9e96-4331e842034d-console-oauth-config\") pod \"9a608b0d-a014-4e5f-9e96-4331e842034d\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " Apr 22 18:50:12.736991 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.736849 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxwh5\" (UniqueName: \"kubernetes.io/projected/9a608b0d-a014-4e5f-9e96-4331e842034d-kube-api-access-hxwh5\") pod \"9a608b0d-a014-4e5f-9e96-4331e842034d\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " Apr 22 18:50:12.736991 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.736872 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a608b0d-a014-4e5f-9e96-4331e842034d-console-serving-cert\") pod \"9a608b0d-a014-4e5f-9e96-4331e842034d\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " Apr 22 18:50:12.737085 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.736985 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-service-ca\") pod \"9a608b0d-a014-4e5f-9e96-4331e842034d\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " Apr 22 18:50:12.737085 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.737037 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-oauth-serving-cert\") pod \"9a608b0d-a014-4e5f-9e96-4331e842034d\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " Apr 22 18:50:12.737085 
ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.737062 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-trusted-ca-bundle\") pod \"9a608b0d-a014-4e5f-9e96-4331e842034d\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " Apr 22 18:50:12.737232 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.737118 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-console-config\") pod \"9a608b0d-a014-4e5f-9e96-4331e842034d\" (UID: \"9a608b0d-a014-4e5f-9e96-4331e842034d\") " Apr 22 18:50:12.737445 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.737407 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-service-ca" (OuterVolumeSpecName: "service-ca") pod "9a608b0d-a014-4e5f-9e96-4331e842034d" (UID: "9a608b0d-a014-4e5f-9e96-4331e842034d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:12.737567 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.737472 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9a608b0d-a014-4e5f-9e96-4331e842034d" (UID: "9a608b0d-a014-4e5f-9e96-4331e842034d"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:12.737567 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.737524 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9a608b0d-a014-4e5f-9e96-4331e842034d" (UID: "9a608b0d-a014-4e5f-9e96-4331e842034d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:12.737733 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.737713 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-console-config" (OuterVolumeSpecName: "console-config") pod "9a608b0d-a014-4e5f-9e96-4331e842034d" (UID: "9a608b0d-a014-4e5f-9e96-4331e842034d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:12.738997 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.738975 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a608b0d-a014-4e5f-9e96-4331e842034d-kube-api-access-hxwh5" (OuterVolumeSpecName: "kube-api-access-hxwh5") pod "9a608b0d-a014-4e5f-9e96-4331e842034d" (UID: "9a608b0d-a014-4e5f-9e96-4331e842034d"). InnerVolumeSpecName "kube-api-access-hxwh5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:50:12.739082 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.738996 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a608b0d-a014-4e5f-9e96-4331e842034d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9a608b0d-a014-4e5f-9e96-4331e842034d" (UID: "9a608b0d-a014-4e5f-9e96-4331e842034d"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:12.739082 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.739014 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a608b0d-a014-4e5f-9e96-4331e842034d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9a608b0d-a014-4e5f-9e96-4331e842034d" (UID: "9a608b0d-a014-4e5f-9e96-4331e842034d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:12.838421 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.838399 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a608b0d-a014-4e5f-9e96-4331e842034d-console-oauth-config\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:12.838421 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.838422 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hxwh5\" (UniqueName: \"kubernetes.io/projected/9a608b0d-a014-4e5f-9e96-4331e842034d-kube-api-access-hxwh5\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:12.838553 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.838432 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a608b0d-a014-4e5f-9e96-4331e842034d-console-serving-cert\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:12.838553 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.838441 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-service-ca\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:12.838553 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.838450 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-oauth-serving-cert\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:12.838553 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.838459 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-trusted-ca-bundle\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:12.838553 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:12.838467 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a608b0d-a014-4e5f-9e96-4331e842034d-console-config\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:13.477206 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:13.477177 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69cd7898c7-tcsv8_9a608b0d-a014-4e5f-9e96-4331e842034d/console/0.log" Apr 22 18:50:13.477588 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:13.477215 2566 generic.go:358] "Generic (PLEG): container finished" podID="9a608b0d-a014-4e5f-9e96-4331e842034d" containerID="6f1cddff386ec0ac2bc325aa8f336bbc4cd8eb506334a1e7753e658ced61d409" exitCode=2 Apr 22 18:50:13.477588 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:13.477287 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69cd7898c7-tcsv8" event={"ID":"9a608b0d-a014-4e5f-9e96-4331e842034d","Type":"ContainerDied","Data":"6f1cddff386ec0ac2bc325aa8f336bbc4cd8eb506334a1e7753e658ced61d409"} Apr 22 18:50:13.477588 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:13.477304 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69cd7898c7-tcsv8" Apr 22 18:50:13.477588 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:13.477314 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69cd7898c7-tcsv8" event={"ID":"9a608b0d-a014-4e5f-9e96-4331e842034d","Type":"ContainerDied","Data":"a24e17f6f9e7b092b4287aa39a402657aace7e604f218608daa371218876ea5a"} Apr 22 18:50:13.477588 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:13.477329 2566 scope.go:117] "RemoveContainer" containerID="6f1cddff386ec0ac2bc325aa8f336bbc4cd8eb506334a1e7753e658ced61d409" Apr 22 18:50:13.485032 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:13.485014 2566 scope.go:117] "RemoveContainer" containerID="6f1cddff386ec0ac2bc325aa8f336bbc4cd8eb506334a1e7753e658ced61d409" Apr 22 18:50:13.485284 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:50:13.485260 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f1cddff386ec0ac2bc325aa8f336bbc4cd8eb506334a1e7753e658ced61d409\": container with ID starting with 6f1cddff386ec0ac2bc325aa8f336bbc4cd8eb506334a1e7753e658ced61d409 not found: ID does not exist" containerID="6f1cddff386ec0ac2bc325aa8f336bbc4cd8eb506334a1e7753e658ced61d409" Apr 22 18:50:13.485390 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:13.485294 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f1cddff386ec0ac2bc325aa8f336bbc4cd8eb506334a1e7753e658ced61d409"} err="failed to get container status \"6f1cddff386ec0ac2bc325aa8f336bbc4cd8eb506334a1e7753e658ced61d409\": rpc error: code = NotFound desc = could not find container \"6f1cddff386ec0ac2bc325aa8f336bbc4cd8eb506334a1e7753e658ced61d409\": container with ID starting with 6f1cddff386ec0ac2bc325aa8f336bbc4cd8eb506334a1e7753e658ced61d409 not found: ID does not exist" Apr 22 18:50:13.493996 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:13.493975 2566 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69cd7898c7-tcsv8"] Apr 22 18:50:13.499158 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:13.499137 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-69cd7898c7-tcsv8"] Apr 22 18:50:14.828980 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:14.828929 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a608b0d-a014-4e5f-9e96-4331e842034d" path="/var/lib/kubelet/pods/9a608b0d-a014-4e5f-9e96-4331e842034d/volumes" Apr 22 18:50:21.504236 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:21.504142 2566 generic.go:358] "Generic (PLEG): container finished" podID="2758dd4a-0d4e-4566-b935-f784873d69e9" containerID="9330e8007aaac2a7b23aab6afb682cc5f1e3333d65a5fd8e731195b172a75e3c" exitCode=0 Apr 22 18:50:21.504236 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:21.504218 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb" event={"ID":"2758dd4a-0d4e-4566-b935-f784873d69e9","Type":"ContainerDied","Data":"9330e8007aaac2a7b23aab6afb682cc5f1e3333d65a5fd8e731195b172a75e3c"} Apr 22 18:50:21.504721 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:21.504614 2566 scope.go:117] "RemoveContainer" containerID="9330e8007aaac2a7b23aab6afb682cc5f1e3333d65a5fd8e731195b172a75e3c" Apr 22 18:50:22.509676 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:22.509642 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lxclb" event={"ID":"2758dd4a-0d4e-4566-b935-f784873d69e9","Type":"ContainerStarted","Data":"704727e347b987072d6c7cbc9b14c4549a1982aaa3ccb8ace4146dd95598ee50"} Apr 22 18:50:23.496747 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.496690 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5d8dcd44f7-tw2f4" podUID="995cb635-9661-411f-8a71-b0de66906803" 
containerName="console" containerID="cri-o://6fa680d98332e49264fe9d7ff137778cdde1aa2f1b6211a73d37875dc4e72c80" gracePeriod=15 Apr 22 18:50:23.737240 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.737219 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d8dcd44f7-tw2f4_995cb635-9661-411f-8a71-b0de66906803/console/0.log" Apr 22 18:50:23.737527 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.737273 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d8dcd44f7-tw2f4" Apr 22 18:50:23.830062 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.829993 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-service-ca\") pod \"995cb635-9661-411f-8a71-b0de66906803\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " Apr 22 18:50:23.830062 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.830039 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/995cb635-9661-411f-8a71-b0de66906803-console-oauth-config\") pod \"995cb635-9661-411f-8a71-b0de66906803\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " Apr 22 18:50:23.830242 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.830097 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/995cb635-9661-411f-8a71-b0de66906803-console-serving-cert\") pod \"995cb635-9661-411f-8a71-b0de66906803\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " Apr 22 18:50:23.830242 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.830135 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-oauth-serving-cert\") pod 
\"995cb635-9661-411f-8a71-b0de66906803\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " Apr 22 18:50:23.830242 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.830175 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-console-config\") pod \"995cb635-9661-411f-8a71-b0de66906803\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " Apr 22 18:50:23.830242 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.830212 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln7bv\" (UniqueName: \"kubernetes.io/projected/995cb635-9661-411f-8a71-b0de66906803-kube-api-access-ln7bv\") pod \"995cb635-9661-411f-8a71-b0de66906803\" (UID: \"995cb635-9661-411f-8a71-b0de66906803\") " Apr 22 18:50:23.830433 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.830414 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-service-ca" (OuterVolumeSpecName: "service-ca") pod "995cb635-9661-411f-8a71-b0de66906803" (UID: "995cb635-9661-411f-8a71-b0de66906803"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:23.830551 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.830529 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "995cb635-9661-411f-8a71-b0de66906803" (UID: "995cb635-9661-411f-8a71-b0de66906803"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:23.830602 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.830545 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-console-config" (OuterVolumeSpecName: "console-config") pod "995cb635-9661-411f-8a71-b0de66906803" (UID: "995cb635-9661-411f-8a71-b0de66906803"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:23.832361 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.832341 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/995cb635-9661-411f-8a71-b0de66906803-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "995cb635-9661-411f-8a71-b0de66906803" (UID: "995cb635-9661-411f-8a71-b0de66906803"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:23.832426 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.832363 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/995cb635-9661-411f-8a71-b0de66906803-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "995cb635-9661-411f-8a71-b0de66906803" (UID: "995cb635-9661-411f-8a71-b0de66906803"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:23.832426 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.832375 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/995cb635-9661-411f-8a71-b0de66906803-kube-api-access-ln7bv" (OuterVolumeSpecName: "kube-api-access-ln7bv") pod "995cb635-9661-411f-8a71-b0de66906803" (UID: "995cb635-9661-411f-8a71-b0de66906803"). InnerVolumeSpecName "kube-api-access-ln7bv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:50:23.930837 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.930806 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/995cb635-9661-411f-8a71-b0de66906803-console-serving-cert\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:23.930837 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.930832 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-oauth-serving-cert\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:23.930837 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.930842 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-console-config\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:23.931136 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.930851 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ln7bv\" (UniqueName: \"kubernetes.io/projected/995cb635-9661-411f-8a71-b0de66906803-kube-api-access-ln7bv\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:23.931136 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.930866 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/995cb635-9661-411f-8a71-b0de66906803-service-ca\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:23.931136 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:23.930880 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/995cb635-9661-411f-8a71-b0de66906803-console-oauth-config\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:50:24.515891 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:50:24.515866 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d8dcd44f7-tw2f4_995cb635-9661-411f-8a71-b0de66906803/console/0.log" Apr 22 18:50:24.516075 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:24.515902 2566 generic.go:358] "Generic (PLEG): container finished" podID="995cb635-9661-411f-8a71-b0de66906803" containerID="6fa680d98332e49264fe9d7ff137778cdde1aa2f1b6211a73d37875dc4e72c80" exitCode=2 Apr 22 18:50:24.516075 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:24.515970 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d8dcd44f7-tw2f4" event={"ID":"995cb635-9661-411f-8a71-b0de66906803","Type":"ContainerDied","Data":"6fa680d98332e49264fe9d7ff137778cdde1aa2f1b6211a73d37875dc4e72c80"} Apr 22 18:50:24.516075 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:24.515999 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d8dcd44f7-tw2f4" event={"ID":"995cb635-9661-411f-8a71-b0de66906803","Type":"ContainerDied","Data":"ab90fdf730ff9d56a9e05feed2ca5466f82ef77d4c7e4d47a244f1c3a3042e7c"} Apr 22 18:50:24.516075 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:24.516005 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d8dcd44f7-tw2f4" Apr 22 18:50:24.516075 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:24.516013 2566 scope.go:117] "RemoveContainer" containerID="6fa680d98332e49264fe9d7ff137778cdde1aa2f1b6211a73d37875dc4e72c80" Apr 22 18:50:24.526644 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:24.526621 2566 scope.go:117] "RemoveContainer" containerID="6fa680d98332e49264fe9d7ff137778cdde1aa2f1b6211a73d37875dc4e72c80" Apr 22 18:50:24.527020 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:50:24.527002 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fa680d98332e49264fe9d7ff137778cdde1aa2f1b6211a73d37875dc4e72c80\": container with ID starting with 6fa680d98332e49264fe9d7ff137778cdde1aa2f1b6211a73d37875dc4e72c80 not found: ID does not exist" containerID="6fa680d98332e49264fe9d7ff137778cdde1aa2f1b6211a73d37875dc4e72c80" Apr 22 18:50:24.527082 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:24.527038 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa680d98332e49264fe9d7ff137778cdde1aa2f1b6211a73d37875dc4e72c80"} err="failed to get container status \"6fa680d98332e49264fe9d7ff137778cdde1aa2f1b6211a73d37875dc4e72c80\": rpc error: code = NotFound desc = could not find container \"6fa680d98332e49264fe9d7ff137778cdde1aa2f1b6211a73d37875dc4e72c80\": container with ID starting with 6fa680d98332e49264fe9d7ff137778cdde1aa2f1b6211a73d37875dc4e72c80 not found: ID does not exist" Apr 22 18:50:24.535642 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:24.535622 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d8dcd44f7-tw2f4"] Apr 22 18:50:24.538778 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:24.538760 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5d8dcd44f7-tw2f4"] Apr 22 18:50:24.829226 ip-10-0-130-234 kubenswrapper[2566]: 
I0422 18:50:24.829154 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="995cb635-9661-411f-8a71-b0de66906803" path="/var/lib/kubelet/pods/995cb635-9661-411f-8a71-b0de66906803/volumes" Apr 22 18:50:50.548244 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:50.548208 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs\") pod \"network-metrics-daemon-7zj27\" (UID: \"66e2806a-82a6-444a-a19a-13641a426bed\") " pod="openshift-multus/network-metrics-daemon-7zj27" Apr 22 18:50:50.550575 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:50.550544 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66e2806a-82a6-444a-a19a-13641a426bed-metrics-certs\") pod \"network-metrics-daemon-7zj27\" (UID: \"66e2806a-82a6-444a-a19a-13641a426bed\") " pod="openshift-multus/network-metrics-daemon-7zj27" Apr 22 18:50:50.829143 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:50.829062 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-m94d4\"" Apr 22 18:50:50.836802 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:50.836776 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zj27" Apr 22 18:50:50.950175 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:50.950110 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7zj27"] Apr 22 18:50:50.952922 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:50:50.952896 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66e2806a_82a6_444a_a19a_13641a426bed.slice/crio-1853012203f0be8a02ff9b4385f90046830facca598126fae8896000a0c8b5c6 WatchSource:0}: Error finding container 1853012203f0be8a02ff9b4385f90046830facca598126fae8896000a0c8b5c6: Status 404 returned error can't find the container with id 1853012203f0be8a02ff9b4385f90046830facca598126fae8896000a0c8b5c6 Apr 22 18:50:51.591376 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:51.591337 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7zj27" event={"ID":"66e2806a-82a6-444a-a19a-13641a426bed","Type":"ContainerStarted","Data":"1853012203f0be8a02ff9b4385f90046830facca598126fae8896000a0c8b5c6"} Apr 22 18:50:52.596030 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:52.595994 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7zj27" event={"ID":"66e2806a-82a6-444a-a19a-13641a426bed","Type":"ContainerStarted","Data":"b6d052a5afe402d3c98ef71eee9d7c559a109287c2083fa59345165efccdb8d7"} Apr 22 18:50:52.596030 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:52.596033 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7zj27" event={"ID":"66e2806a-82a6-444a-a19a-13641a426bed","Type":"ContainerStarted","Data":"a707abe8987d6fcc173f6873730b5bb838f150f0b37b7ff7a7bc1324af67fca6"} Apr 22 18:50:52.611413 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:52.611367 2566 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-7zj27" podStartSLOduration=253.707446696 podStartE2EDuration="4m14.611352918s" podCreationTimestamp="2026-04-22 18:46:38 +0000 UTC" firstStartedPulling="2026-04-22 18:50:50.954715598 +0000 UTC m=+252.733549066" lastFinishedPulling="2026-04-22 18:50:51.858621807 +0000 UTC m=+253.637455288" observedRunningTime="2026-04-22 18:50:52.609690578 +0000 UTC m=+254.388524063" watchObservedRunningTime="2026-04-22 18:50:52.611352918 +0000 UTC m=+254.390186405" Apr 22 18:50:55.361563 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:55.361526 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:50:55.605304 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:55.605261 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="alertmanager" containerID="cri-o://f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1" gracePeriod=120 Apr 22 18:50:55.605498 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:55.605315 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="kube-rbac-proxy-metric" containerID="cri-o://255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf" gracePeriod=120 Apr 22 18:50:55.605498 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:55.605342 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="kube-rbac-proxy-web" containerID="cri-o://6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232" gracePeriod=120 Apr 22 18:50:55.605498 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:55.605377 2566 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="prom-label-proxy" containerID="cri-o://cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df" gracePeriod=120
Apr 22 18:50:55.605498 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:55.605400 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="config-reloader" containerID="cri-o://f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762" gracePeriod=120
Apr 22 18:50:55.605498 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:55.605378 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="kube-rbac-proxy" containerID="cri-o://faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e" gracePeriod=120
Apr 22 18:50:56.611328 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.611293 2566 generic.go:358] "Generic (PLEG): container finished" podID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerID="cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df" exitCode=0
Apr 22 18:50:56.611328 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.611320 2566 generic.go:358] "Generic (PLEG): container finished" podID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerID="faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e" exitCode=0
Apr 22 18:50:56.611328 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.611329 2566 generic.go:358] "Generic (PLEG): container finished" podID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerID="f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762" exitCode=0
Apr 22 18:50:56.611328 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.611336 2566 generic.go:358] "Generic (PLEG): container finished" podID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerID="f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1" exitCode=0
Apr 22 18:50:56.611785 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.611370 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bc4df07-ae0b-4182-b241-870e0ed1be3d","Type":"ContainerDied","Data":"cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df"}
Apr 22 18:50:56.611785 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.611403 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bc4df07-ae0b-4182-b241-870e0ed1be3d","Type":"ContainerDied","Data":"faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e"}
Apr 22 18:50:56.611785 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.611413 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bc4df07-ae0b-4182-b241-870e0ed1be3d","Type":"ContainerDied","Data":"f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762"}
Apr 22 18:50:56.611785 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.611424 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bc4df07-ae0b-4182-b241-870e0ed1be3d","Type":"ContainerDied","Data":"f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1"}
Apr 22 18:50:56.835950 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.835915 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:50:56.900821 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.900741 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-web-config\") pod \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") "
Apr 22 18:50:56.900821 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.900775 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8bc4df07-ae0b-4182-b241-870e0ed1be3d-metrics-client-ca\") pod \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") "
Apr 22 18:50:56.900821 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.900812 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cgb4\" (UniqueName: \"kubernetes.io/projected/8bc4df07-ae0b-4182-b241-870e0ed1be3d-kube-api-access-4cgb4\") pod \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") "
Apr 22 18:50:56.901117 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.900864 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy-web\") pod \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") "
Apr 22 18:50:56.901117 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.900898 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-cluster-tls-config\") pod \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") "
Apr 22 18:50:56.901117 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.900958 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") "
Apr 22 18:50:56.901117 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.900983 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8bc4df07-ae0b-4182-b241-870e0ed1be3d-alertmanager-main-db\") pod \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") "
Apr 22 18:50:56.901117 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.901000 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-main-tls\") pod \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") "
Apr 22 18:50:56.901117 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.901014 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8bc4df07-ae0b-4182-b241-870e0ed1be3d-config-out\") pod \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") "
Apr 22 18:50:56.901117 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.901039 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy\") pod \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") "
Apr 22 18:50:56.901117 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.901067 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-config-volume\") pod \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") "
Apr 22 18:50:56.901117 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.901119 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bc4df07-ae0b-4182-b241-870e0ed1be3d-alertmanager-trusted-ca-bundle\") pod \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") "
Apr 22 18:50:56.901529 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.901152 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8bc4df07-ae0b-4182-b241-870e0ed1be3d-tls-assets\") pod \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\" (UID: \"8bc4df07-ae0b-4182-b241-870e0ed1be3d\") "
Apr 22 18:50:56.901529 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.901308 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bc4df07-ae0b-4182-b241-870e0ed1be3d-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "8bc4df07-ae0b-4182-b241-870e0ed1be3d" (UID: "8bc4df07-ae0b-4182-b241-870e0ed1be3d"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:50:56.901529 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.901419 2566 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8bc4df07-ae0b-4182-b241-870e0ed1be3d-alertmanager-main-db\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 18:50:56.902026 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.901716 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bc4df07-ae0b-4182-b241-870e0ed1be3d-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "8bc4df07-ae0b-4182-b241-870e0ed1be3d" (UID: "8bc4df07-ae0b-4182-b241-870e0ed1be3d"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:50:56.903844 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.903669 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc4df07-ae0b-4182-b241-870e0ed1be3d-kube-api-access-4cgb4" (OuterVolumeSpecName: "kube-api-access-4cgb4") pod "8bc4df07-ae0b-4182-b241-870e0ed1be3d" (UID: "8bc4df07-ae0b-4182-b241-870e0ed1be3d"). InnerVolumeSpecName "kube-api-access-4cgb4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:50:56.903983 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.903884 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bc4df07-ae0b-4182-b241-870e0ed1be3d-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "8bc4df07-ae0b-4182-b241-870e0ed1be3d" (UID: "8bc4df07-ae0b-4182-b241-870e0ed1be3d"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:50:56.904221 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.904196 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "8bc4df07-ae0b-4182-b241-870e0ed1be3d" (UID: "8bc4df07-ae0b-4182-b241-870e0ed1be3d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:50:56.904304 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.904219 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc4df07-ae0b-4182-b241-870e0ed1be3d-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8bc4df07-ae0b-4182-b241-870e0ed1be3d" (UID: "8bc4df07-ae0b-4182-b241-870e0ed1be3d"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:50:56.904377 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.904357 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "8bc4df07-ae0b-4182-b241-870e0ed1be3d" (UID: "8bc4df07-ae0b-4182-b241-870e0ed1be3d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:50:56.905087 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.905052 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bc4df07-ae0b-4182-b241-870e0ed1be3d-config-out" (OuterVolumeSpecName: "config-out") pod "8bc4df07-ae0b-4182-b241-870e0ed1be3d" (UID: "8bc4df07-ae0b-4182-b241-870e0ed1be3d"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:50:56.905226 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.905203 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-config-volume" (OuterVolumeSpecName: "config-volume") pod "8bc4df07-ae0b-4182-b241-870e0ed1be3d" (UID: "8bc4df07-ae0b-4182-b241-870e0ed1be3d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:50:56.905812 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.905793 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "8bc4df07-ae0b-4182-b241-870e0ed1be3d" (UID: "8bc4df07-ae0b-4182-b241-870e0ed1be3d"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:50:56.906314 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.906290 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "8bc4df07-ae0b-4182-b241-870e0ed1be3d" (UID: "8bc4df07-ae0b-4182-b241-870e0ed1be3d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:50:56.908975 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.908908 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "8bc4df07-ae0b-4182-b241-870e0ed1be3d" (UID: "8bc4df07-ae0b-4182-b241-870e0ed1be3d"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:50:56.915651 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:56.915630 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-web-config" (OuterVolumeSpecName: "web-config") pod "8bc4df07-ae0b-4182-b241-870e0ed1be3d" (UID: "8bc4df07-ae0b-4182-b241-870e0ed1be3d"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:50:57.002157 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.002123 2566 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 18:50:57.002157 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.002159 2566 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-config-volume\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 18:50:57.002320 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.002176 2566 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bc4df07-ae0b-4182-b241-870e0ed1be3d-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 18:50:57.002320 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.002189 2566 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8bc4df07-ae0b-4182-b241-870e0ed1be3d-tls-assets\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 18:50:57.002320 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.002201 2566 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-web-config\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 18:50:57.002320 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.002213 2566 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8bc4df07-ae0b-4182-b241-870e0ed1be3d-metrics-client-ca\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 18:50:57.002320 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.002225 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4cgb4\" (UniqueName: \"kubernetes.io/projected/8bc4df07-ae0b-4182-b241-870e0ed1be3d-kube-api-access-4cgb4\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 18:50:57.002320 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.002240 2566 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 18:50:57.002320 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.002252 2566 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-cluster-tls-config\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 18:50:57.002320 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.002266 2566 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 18:50:57.002320 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.002281 2566 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8bc4df07-ae0b-4182-b241-870e0ed1be3d-secret-alertmanager-main-tls\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 18:50:57.002320 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.002295 2566 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8bc4df07-ae0b-4182-b241-870e0ed1be3d-config-out\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 18:50:57.617408 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.617369 2566 generic.go:358] "Generic (PLEG): container finished" podID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerID="255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf" exitCode=0
Apr 22 18:50:57.617408 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.617406 2566 generic.go:358] "Generic (PLEG): container finished" podID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerID="6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232" exitCode=0
Apr 22 18:50:57.617825 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.617412 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bc4df07-ae0b-4182-b241-870e0ed1be3d","Type":"ContainerDied","Data":"255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf"}
Apr 22 18:50:57.617825 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.617458 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bc4df07-ae0b-4182-b241-870e0ed1be3d","Type":"ContainerDied","Data":"6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232"}
Apr 22 18:50:57.617825 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.617474 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8bc4df07-ae0b-4182-b241-870e0ed1be3d","Type":"ContainerDied","Data":"66f7d9bbb9ef33a505a383980d2366b41696088f20608617369d2dffd3ae6aac"}
Apr 22 18:50:57.617825 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.617492 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:50:57.617825 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.617493 2566 scope.go:117] "RemoveContainer" containerID="cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df"
Apr 22 18:50:57.625484 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.625469 2566 scope.go:117] "RemoveContainer" containerID="255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf"
Apr 22 18:50:57.632135 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.632115 2566 scope.go:117] "RemoveContainer" containerID="faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e"
Apr 22 18:50:57.638712 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.638694 2566 scope.go:117] "RemoveContainer" containerID="6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232"
Apr 22 18:50:57.639493 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.639471 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:50:57.643464 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.643438 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:50:57.645400 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.645386 2566 scope.go:117] "RemoveContainer" containerID="f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762"
Apr 22 18:50:57.651601 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.651585 2566 scope.go:117] "RemoveContainer" containerID="f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1"
Apr 22 18:50:57.663607 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.663591 2566 scope.go:117] "RemoveContainer" containerID="184dc1a7f73ea9bb3e15c8c8582ee84074bb9a5263546bc73698ff4696d68544"
Apr 22 18:50:57.665918 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.665898 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:50:57.666249 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666234 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="prom-label-proxy"
Apr 22 18:50:57.666321 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666252 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="prom-label-proxy"
Apr 22 18:50:57.666321 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666267 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="init-config-reloader"
Apr 22 18:50:57.666321 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666277 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="init-config-reloader"
Apr 22 18:50:57.666321 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666292 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="kube-rbac-proxy"
Apr 22 18:50:57.666321 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666300 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="kube-rbac-proxy"
Apr 22 18:50:57.666321 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666311 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca94f287-c422-413d-9356-adf2fb7ef856" containerName="console"
Apr 22 18:50:57.666321 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666319 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca94f287-c422-413d-9356-adf2fb7ef856" containerName="console"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666332 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a608b0d-a014-4e5f-9e96-4331e842034d" containerName="console"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666340 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a608b0d-a014-4e5f-9e96-4331e842034d" containerName="console"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666351 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ceaf6dd-7ffb-4393-9076-3f02c3b84983" containerName="registry"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666359 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ceaf6dd-7ffb-4393-9076-3f02c3b84983" containerName="registry"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666368 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="kube-rbac-proxy-metric"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666376 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="kube-rbac-proxy-metric"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666387 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="995cb635-9661-411f-8a71-b0de66906803" containerName="console"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666395 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="995cb635-9661-411f-8a71-b0de66906803" containerName="console"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666409 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="alertmanager"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666417 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="alertmanager"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666430 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="config-reloader"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666438 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="config-reloader"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666449 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="kube-rbac-proxy-web"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666456 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="kube-rbac-proxy-web"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666531 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="kube-rbac-proxy-web"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666544 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="config-reloader"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666555 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ceaf6dd-7ffb-4393-9076-3f02c3b84983" containerName="registry"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666565 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="995cb635-9661-411f-8a71-b0de66906803" containerName="console"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666576 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="prom-label-proxy"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666584 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a608b0d-a014-4e5f-9e96-4331e842034d" containerName="console"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666595 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="kube-rbac-proxy"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666605 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="alertmanager"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666615 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" containerName="kube-rbac-proxy-metric"
Apr 22 18:50:57.666637 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.666627 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca94f287-c422-413d-9356-adf2fb7ef856" containerName="console"
Apr 22 18:50:57.671030 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.670807 2566 scope.go:117] "RemoveContainer" containerID="cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df"
Apr 22 18:50:57.671091 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:50:57.671069 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df\": container with ID starting with cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df not found: ID does not exist" containerID="cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df"
Apr 22 18:50:57.671129 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.671099 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df"} err="failed to get container status \"cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df\": rpc error: code = NotFound desc = could not find container \"cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df\": container with ID starting with cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df not found: ID does not exist"
Apr 22 18:50:57.671129 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.671118 2566 scope.go:117] "RemoveContainer" containerID="255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf"
Apr 22 18:50:57.671228 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.671212 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:50:57.671330 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:50:57.671313 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf\": container with ID starting with 255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf not found: ID does not exist" containerID="255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf"
Apr 22 18:50:57.671376 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.671336 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf"} err="failed to get container status \"255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf\": rpc error: code = NotFound desc = could not find container \"255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf\": container with ID starting with 255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf not found: ID does not exist"
Apr 22 18:50:57.671376 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.671350 2566 scope.go:117] "RemoveContainer" containerID="faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e"
Apr 22 18:50:57.671736 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:50:57.671708 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e\": container with ID starting with faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e not found: ID does not exist" containerID="faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e"
Apr 22 18:50:57.671814 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.671745 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e"} err="failed to get container status \"faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e\": rpc error: code = NotFound desc = could not find container \"faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e\": container with ID starting with faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e not found: ID does not exist"
Apr 22 18:50:57.671814 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.671768 2566 scope.go:117] "RemoveContainer" containerID="6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232"
Apr 22 18:50:57.672018 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:50:57.671998 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232\": container with ID starting with 6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232 not found: ID does not exist" containerID="6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232"
Apr 22 18:50:57.672113 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.672022 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232"} err="failed to get container status \"6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232\": rpc error: code = NotFound desc = could not find container \"6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232\": container with ID starting with 6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232 not found: ID does not exist"
Apr 22 18:50:57.672113 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.672040 2566 scope.go:117] "RemoveContainer" containerID="f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762"
Apr 22 18:50:57.672310 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:50:57.672290 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762\": container with ID starting with f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762 not found: ID does not exist" containerID="f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762"
Apr 22 18:50:57.672395 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.672313 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762"} err="failed to get container status \"f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762\": rpc error: code = NotFound desc = could not find container \"f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762\": container with ID starting with f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762 not found: ID does not exist"
Apr 22 18:50:57.672395 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.672325 2566 scope.go:117] "RemoveContainer" containerID="f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1"
Apr 22 18:50:57.672592 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:50:57.672571 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1\": container with ID starting with f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1 not found: ID does not exist" containerID="f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1"
Apr 22 18:50:57.672706 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.672595 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1"} err="failed to get container status \"f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1\": rpc error: code = NotFound desc = could not find container \"f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1\": container with ID starting with f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1 not found: ID does not exist"
Apr 22 18:50:57.672706 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.672612 2566 scope.go:117] "RemoveContainer" containerID="184dc1a7f73ea9bb3e15c8c8582ee84074bb9a5263546bc73698ff4696d68544"
Apr 22 18:50:57.672908 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:50:57.672893 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"184dc1a7f73ea9bb3e15c8c8582ee84074bb9a5263546bc73698ff4696d68544\": container with ID starting with 184dc1a7f73ea9bb3e15c8c8582ee84074bb9a5263546bc73698ff4696d68544 not found: ID does not exist" containerID="184dc1a7f73ea9bb3e15c8c8582ee84074bb9a5263546bc73698ff4696d68544"
Apr 22 18:50:57.672991 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.672910 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"184dc1a7f73ea9bb3e15c8c8582ee84074bb9a5263546bc73698ff4696d68544"} err="failed to get container status \"184dc1a7f73ea9bb3e15c8c8582ee84074bb9a5263546bc73698ff4696d68544\": rpc error: code = NotFound desc = could not find container \"184dc1a7f73ea9bb3e15c8c8582ee84074bb9a5263546bc73698ff4696d68544\": container with ID starting with 184dc1a7f73ea9bb3e15c8c8582ee84074bb9a5263546bc73698ff4696d68544 not found: ID does not exist"
Apr 22 18:50:57.672991 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.672924 2566 scope.go:117] "RemoveContainer" containerID="cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df"
Apr 22 18:50:57.673258 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.673237 2566 pod_container_deletor.go:53] "DeleteContainer returned
error" containerID={"Type":"cri-o","ID":"cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df"} err="failed to get container status \"cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df\": rpc error: code = NotFound desc = could not find container \"cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df\": container with ID starting with cde9450bf0b5e750e4d44067d5f7333ccfd7a20c9132c9b4028ca972443fe4df not found: ID does not exist" Apr 22 18:50:57.673317 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.673258 2566 scope.go:117] "RemoveContainer" containerID="255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf" Apr 22 18:50:57.673506 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.673483 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf"} err="failed to get container status \"255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf\": rpc error: code = NotFound desc = could not find container \"255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf\": container with ID starting with 255ed65319e57d3fc70b1a6c1630fa93fb20bdb9588e36b25498f6be4d632bbf not found: ID does not exist" Apr 22 18:50:57.673556 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.673509 2566 scope.go:117] "RemoveContainer" containerID="faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e" Apr 22 18:50:57.673600 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.673558 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 18:50:57.673600 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.673588 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 18:50:57.673681 ip-10-0-130-234 kubenswrapper[2566]: I0422 
18:50:57.673623 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 18:50:57.673681 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.673624 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 18:50:57.673783 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.673734 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e"} err="failed to get container status \"faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e\": rpc error: code = NotFound desc = could not find container \"faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e\": container with ID starting with faa95981a43f743f994353db59508e18a800abf9e8239b8e1c48fc4f987ec66e not found: ID does not exist" Apr 22 18:50:57.673783 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.673753 2566 scope.go:117] "RemoveContainer" containerID="6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232" Apr 22 18:50:57.673996 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.673970 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232"} err="failed to get container status \"6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232\": rpc error: code = NotFound desc = could not find container \"6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232\": container with ID starting with 6f503c660d95e4c7d5eebc032206ad2fd88eac2f0ab0f367d7a4e522a9191232 not found: ID does not exist" Apr 22 18:50:57.673996 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.673990 2566 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 18:50:57.673996 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.673997 2566 scope.go:117] "RemoveContainer" containerID="f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762" Apr 22 18:50:57.674194 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.673992 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 18:50:57.674194 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.674016 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 18:50:57.674194 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.674032 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-9wsbq\"" Apr 22 18:50:57.674398 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.674374 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762"} err="failed to get container status \"f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762\": rpc error: code = NotFound desc = could not find container \"f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762\": container with ID starting with f98d2ac67a3e64994429aa72939f5f7be356799a710ecb175dec8c704d4c7762 not found: ID does not exist" Apr 22 18:50:57.674438 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.674403 2566 scope.go:117] "RemoveContainer" containerID="f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1" Apr 22 18:50:57.674550 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.674531 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 
18:50:57.674684 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.674666 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1"} err="failed to get container status \"f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1\": rpc error: code = NotFound desc = could not find container \"f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1\": container with ID starting with f9e33368851aaff9f758e256b86700acae3c0beee99dce2eba325498f977c1e1 not found: ID does not exist" Apr 22 18:50:57.674684 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.674684 2566 scope.go:117] "RemoveContainer" containerID="184dc1a7f73ea9bb3e15c8c8582ee84074bb9a5263546bc73698ff4696d68544" Apr 22 18:50:57.675009 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.674987 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"184dc1a7f73ea9bb3e15c8c8582ee84074bb9a5263546bc73698ff4696d68544"} err="failed to get container status \"184dc1a7f73ea9bb3e15c8c8582ee84074bb9a5263546bc73698ff4696d68544\": rpc error: code = NotFound desc = could not find container \"184dc1a7f73ea9bb3e15c8c8582ee84074bb9a5263546bc73698ff4696d68544\": container with ID starting with 184dc1a7f73ea9bb3e15c8c8582ee84074bb9a5263546bc73698ff4696d68544 not found: ID does not exist" Apr 22 18:50:57.678807 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.678786 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 18:50:57.681508 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.681486 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:50:57.707675 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.707478 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-ftr4s\" (UniqueName: \"kubernetes.io/projected/5833426b-676a-4d5e-8e8b-ead0f3e719f9-kube-api-access-ftr4s\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.707675 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.707594 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-web-config\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.707675 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.707627 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5833426b-676a-4d5e-8e8b-ead0f3e719f9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.707876 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.707685 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.707876 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.707713 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.707876 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.707766 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5833426b-676a-4d5e-8e8b-ead0f3e719f9-config-out\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.707876 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.707796 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5833426b-676a-4d5e-8e8b-ead0f3e719f9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.707876 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.707846 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.708120 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.707887 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5833426b-676a-4d5e-8e8b-ead0f3e719f9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.708120 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.707966 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/5833426b-676a-4d5e-8e8b-ead0f3e719f9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.708120 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.708008 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.708120 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.708066 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-config-volume\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.708340 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.708128 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.808694 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.808663 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5833426b-676a-4d5e-8e8b-ead0f3e719f9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.808694 ip-10-0-130-234 kubenswrapper[2566]: I0422 
18:50:57.808699 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.808919 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.808722 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5833426b-676a-4d5e-8e8b-ead0f3e719f9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.808919 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.808741 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5833426b-676a-4d5e-8e8b-ead0f3e719f9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.808919 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.808764 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.808919 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.808785 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-config-volume\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.808919 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.808808 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.808919 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.808827 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftr4s\" (UniqueName: \"kubernetes.io/projected/5833426b-676a-4d5e-8e8b-ead0f3e719f9-kube-api-access-ftr4s\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.808919 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.808878 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-web-config\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.809280 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.808921 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5833426b-676a-4d5e-8e8b-ead0f3e719f9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.809280 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.808979 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.809280 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.809006 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.809280 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.809044 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5833426b-676a-4d5e-8e8b-ead0f3e719f9-config-out\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.810751 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.809533 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5833426b-676a-4d5e-8e8b-ead0f3e719f9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.810751 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.809604 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5833426b-676a-4d5e-8e8b-ead0f3e719f9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.810751 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.810338 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5833426b-676a-4d5e-8e8b-ead0f3e719f9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.811610 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.811585 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5833426b-676a-4d5e-8e8b-ead0f3e719f9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.812063 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.812038 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.812210 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.812187 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5833426b-676a-4d5e-8e8b-ead0f3e719f9-config-out\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.812291 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.812191 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-web-config\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.812413 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.812393 2566 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.812995 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.812969 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-config-volume\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.813238 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.813211 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.813512 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.813495 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.814237 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.814213 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5833426b-676a-4d5e-8e8b-ead0f3e719f9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.816165 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:50:57.816148 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftr4s\" (UniqueName: \"kubernetes.io/projected/5833426b-676a-4d5e-8e8b-ead0f3e719f9-kube-api-access-ftr4s\") pod \"alertmanager-main-0\" (UID: \"5833426b-676a-4d5e-8e8b-ead0f3e719f9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:57.982389 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:57.982304 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:50:58.108574 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:58.108394 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:50:58.110681 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:50:58.110650 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5833426b_676a_4d5e_8e8b_ead0f3e719f9.slice/crio-9263d08e81d0dd88051322362508862be7e22c85c2bdc631960d0ff5958028ea WatchSource:0}: Error finding container 9263d08e81d0dd88051322362508862be7e22c85c2bdc631960d0ff5958028ea: Status 404 returned error can't find the container with id 9263d08e81d0dd88051322362508862be7e22c85c2bdc631960d0ff5958028ea Apr 22 18:50:58.622193 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:58.622159 2566 generic.go:358] "Generic (PLEG): container finished" podID="5833426b-676a-4d5e-8e8b-ead0f3e719f9" containerID="5a620936409c3755c750697415c939a9e059d1ed070bff0e296cd6d0aae6c8d4" exitCode=0 Apr 22 18:50:58.622556 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:58.622198 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5833426b-676a-4d5e-8e8b-ead0f3e719f9","Type":"ContainerDied","Data":"5a620936409c3755c750697415c939a9e059d1ed070bff0e296cd6d0aae6c8d4"} Apr 22 18:50:58.622556 ip-10-0-130-234 kubenswrapper[2566]: I0422 
18:50:58.622217 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5833426b-676a-4d5e-8e8b-ead0f3e719f9","Type":"ContainerStarted","Data":"9263d08e81d0dd88051322362508862be7e22c85c2bdc631960d0ff5958028ea"} Apr 22 18:50:58.829166 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:58.829143 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bc4df07-ae0b-4182-b241-870e0ed1be3d" path="/var/lib/kubelet/pods/8bc4df07-ae0b-4182-b241-870e0ed1be3d/volumes" Apr 22 18:50:58.943866 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:58.943842 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76f65c8c6b-ckz2w"] Apr 22 18:50:58.947080 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:58.947057 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:58.958383 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:58.958343 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76f65c8c6b-ckz2w"] Apr 22 18:50:59.018095 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.018066 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-console-config\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.018229 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.018107 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-service-ca\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.018229 
ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.018137 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jnmv\" (UniqueName: \"kubernetes.io/projected/5ea88efb-2da5-4288-bb86-47f3b5848684-kube-api-access-7jnmv\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.018229 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.018186 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-oauth-serving-cert\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.018340 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.018233 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ea88efb-2da5-4288-bb86-47f3b5848684-console-serving-cert\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.018340 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.018269 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-trusted-ca-bundle\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.018340 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.018292 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/5ea88efb-2da5-4288-bb86-47f3b5848684-console-oauth-config\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.119633 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.119601 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ea88efb-2da5-4288-bb86-47f3b5848684-console-oauth-config\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.119785 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.119647 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-console-config\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.119785 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.119767 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-service-ca\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.119905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.119826 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jnmv\" (UniqueName: \"kubernetes.io/projected/5ea88efb-2da5-4288-bb86-47f3b5848684-kube-api-access-7jnmv\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.119905 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.119874 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-oauth-serving-cert\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.120030 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.119926 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ea88efb-2da5-4288-bb86-47f3b5848684-console-serving-cert\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.120030 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.120005 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-trusted-ca-bundle\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.120404 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.120377 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-console-config\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.120529 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.120423 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-service-ca\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.120590 ip-10-0-130-234 kubenswrapper[2566]: I0422 
18:50:59.120551 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-oauth-serving-cert\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.120843 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.120811 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-trusted-ca-bundle\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.122323 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.122304 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ea88efb-2da5-4288-bb86-47f3b5848684-console-oauth-config\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.122411 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.122393 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ea88efb-2da5-4288-bb86-47f3b5848684-console-serving-cert\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.126563 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.126501 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jnmv\" (UniqueName: \"kubernetes.io/projected/5ea88efb-2da5-4288-bb86-47f3b5848684-kube-api-access-7jnmv\") pod \"console-76f65c8c6b-ckz2w\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 
18:50:59.273711 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.273674 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:50:59.395513 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.395490 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76f65c8c6b-ckz2w"] Apr 22 18:50:59.397298 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:50:59.397274 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ea88efb_2da5_4288_bb86_47f3b5848684.slice/crio-d5c1abcad849b05f1dd48c0142883330a0468ec88fe5f68553b424532524ff3c WatchSource:0}: Error finding container d5c1abcad849b05f1dd48c0142883330a0468ec88fe5f68553b424532524ff3c: Status 404 returned error can't find the container with id d5c1abcad849b05f1dd48c0142883330a0468ec88fe5f68553b424532524ff3c Apr 22 18:50:59.629003 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.628968 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5833426b-676a-4d5e-8e8b-ead0f3e719f9","Type":"ContainerStarted","Data":"c26862f1702aa7732f9611e51b2a5986ee0a35e2596269b5a3fb9a68fed7fa8c"} Apr 22 18:50:59.629437 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.629010 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5833426b-676a-4d5e-8e8b-ead0f3e719f9","Type":"ContainerStarted","Data":"f824ecfb3c624ab57fc1a3173e9b9a631ddbca413a7c60840f55d8e3d58ee2f2"} Apr 22 18:50:59.629437 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.629024 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5833426b-676a-4d5e-8e8b-ead0f3e719f9","Type":"ContainerStarted","Data":"be1820cb295e12d08f726e859c989082f8c9546cb2d5815ebc0fb3fede8f1bb7"} Apr 22 18:50:59.629437 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:50:59.629036 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5833426b-676a-4d5e-8e8b-ead0f3e719f9","Type":"ContainerStarted","Data":"8681b41e735dda389a0a309196935792956cbd8579937cb9e2e3f12d459d3dcb"} Apr 22 18:50:59.629437 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.629049 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5833426b-676a-4d5e-8e8b-ead0f3e719f9","Type":"ContainerStarted","Data":"34111540be0218e6dfe7df28fec2d77ccbb9376ca5a3ec7388842b512a799876"} Apr 22 18:50:59.629437 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.629061 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5833426b-676a-4d5e-8e8b-ead0f3e719f9","Type":"ContainerStarted","Data":"39dc01acd3020d376a24e8a8187dc866538e30675dae131a1388900b1d354edc"} Apr 22 18:50:59.630489 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.630459 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76f65c8c6b-ckz2w" event={"ID":"5ea88efb-2da5-4288-bb86-47f3b5848684","Type":"ContainerStarted","Data":"358d6ceb4062c4700a0b3f20282b9d9b909d1c69eaee104f2ffd9923c037a271"} Apr 22 18:50:59.630616 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.630493 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76f65c8c6b-ckz2w" event={"ID":"5ea88efb-2da5-4288-bb86-47f3b5848684","Type":"ContainerStarted","Data":"d5c1abcad849b05f1dd48c0142883330a0468ec88fe5f68553b424532524ff3c"} Apr 22 18:50:59.659236 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.659125 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.65910682 podStartE2EDuration="2.65910682s" podCreationTimestamp="2026-04-22 18:50:57 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:50:59.656248032 +0000 UTC m=+261.435081531" watchObservedRunningTime="2026-04-22 18:50:59.65910682 +0000 UTC m=+261.437940307" Apr 22 18:50:59.679781 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:50:59.679729 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76f65c8c6b-ckz2w" podStartSLOduration=1.679711741 podStartE2EDuration="1.679711741s" podCreationTimestamp="2026-04-22 18:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:50:59.677814745 +0000 UTC m=+261.456648225" watchObservedRunningTime="2026-04-22 18:50:59.679711741 +0000 UTC m=+261.458545227" Apr 22 18:51:09.274811 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:09.274775 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:51:09.275297 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:09.274853 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:51:09.279426 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:09.279406 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:51:09.666620 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:09.666594 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:51:09.706556 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:09.706521 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84f97f657c-wsmcp"] Apr 22 18:51:34.730111 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:34.730073 2566 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-console/console-84f97f657c-wsmcp" podUID="d2d9a9c5-0b0b-4cfa-a570-c855f684acfb" containerName="console" containerID="cri-o://521fef34ecd1518206a611f9e60272cba8c1cef6325854f278c5e9a0c4e3a40d" gracePeriod=15 Apr 22 18:51:34.965162 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:34.965141 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84f97f657c-wsmcp_d2d9a9c5-0b0b-4cfa-a570-c855f684acfb/console/0.log" Apr 22 18:51:34.965267 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:34.965198 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:51:35.016600 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.016535 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-service-ca\") pod \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " Apr 22 18:51:35.016600 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.016575 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-oauth-serving-cert\") pod \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " Apr 22 18:51:35.016793 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.016612 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-config\") pod \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " Apr 22 18:51:35.016793 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.016638 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-trusted-ca-bundle\") pod \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " Apr 22 18:51:35.016793 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.016689 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-oauth-config\") pod \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " Apr 22 18:51:35.016793 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.016741 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-serving-cert\") pod \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " Apr 22 18:51:35.016793 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.016766 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh7qj\" (UniqueName: \"kubernetes.io/projected/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-kube-api-access-lh7qj\") pod \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\" (UID: \"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb\") " Apr 22 18:51:35.017082 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.017019 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d2d9a9c5-0b0b-4cfa-a570-c855f684acfb" (UID: "d2d9a9c5-0b0b-4cfa-a570-c855f684acfb"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:35.017082 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.016933 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-service-ca" (OuterVolumeSpecName: "service-ca") pod "d2d9a9c5-0b0b-4cfa-a570-c855f684acfb" (UID: "d2d9a9c5-0b0b-4cfa-a570-c855f684acfb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:35.017217 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.017186 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-config" (OuterVolumeSpecName: "console-config") pod "d2d9a9c5-0b0b-4cfa-a570-c855f684acfb" (UID: "d2d9a9c5-0b0b-4cfa-a570-c855f684acfb"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:35.017434 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.017405 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d2d9a9c5-0b0b-4cfa-a570-c855f684acfb" (UID: "d2d9a9c5-0b0b-4cfa-a570-c855f684acfb"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:35.018873 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.018842 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d2d9a9c5-0b0b-4cfa-a570-c855f684acfb" (UID: "d2d9a9c5-0b0b-4cfa-a570-c855f684acfb"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:35.019018 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.018910 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-kube-api-access-lh7qj" (OuterVolumeSpecName: "kube-api-access-lh7qj") pod "d2d9a9c5-0b0b-4cfa-a570-c855f684acfb" (UID: "d2d9a9c5-0b0b-4cfa-a570-c855f684acfb"). InnerVolumeSpecName "kube-api-access-lh7qj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:51:35.019088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.019019 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d2d9a9c5-0b0b-4cfa-a570-c855f684acfb" (UID: "d2d9a9c5-0b0b-4cfa-a570-c855f684acfb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:35.117501 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.117481 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-service-ca\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:51:35.117501 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.117502 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-oauth-serving-cert\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:51:35.117643 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.117514 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-config\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:51:35.117643 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:51:35.117523 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-trusted-ca-bundle\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:51:35.117643 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.117531 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-oauth-config\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:51:35.117643 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.117539 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-console-serving-cert\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:51:35.117643 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.117548 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lh7qj\" (UniqueName: \"kubernetes.io/projected/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb-kube-api-access-lh7qj\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:51:35.735738 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.735710 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84f97f657c-wsmcp_d2d9a9c5-0b0b-4cfa-a570-c855f684acfb/console/0.log" Apr 22 18:51:35.736140 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.735748 2566 generic.go:358] "Generic (PLEG): container finished" podID="d2d9a9c5-0b0b-4cfa-a570-c855f684acfb" containerID="521fef34ecd1518206a611f9e60272cba8c1cef6325854f278c5e9a0c4e3a40d" exitCode=2 Apr 22 18:51:35.736140 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.735838 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84f97f657c-wsmcp" Apr 22 18:51:35.736140 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.735846 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84f97f657c-wsmcp" event={"ID":"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb","Type":"ContainerDied","Data":"521fef34ecd1518206a611f9e60272cba8c1cef6325854f278c5e9a0c4e3a40d"} Apr 22 18:51:35.736140 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.735882 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84f97f657c-wsmcp" event={"ID":"d2d9a9c5-0b0b-4cfa-a570-c855f684acfb","Type":"ContainerDied","Data":"67341fe1c076a810782ee5300be862f9ad432742808642dda2ea4d6f95620a9b"} Apr 22 18:51:35.736140 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.735899 2566 scope.go:117] "RemoveContainer" containerID="521fef34ecd1518206a611f9e60272cba8c1cef6325854f278c5e9a0c4e3a40d" Apr 22 18:51:35.744314 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.744295 2566 scope.go:117] "RemoveContainer" containerID="521fef34ecd1518206a611f9e60272cba8c1cef6325854f278c5e9a0c4e3a40d" Apr 22 18:51:35.744581 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:51:35.744563 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"521fef34ecd1518206a611f9e60272cba8c1cef6325854f278c5e9a0c4e3a40d\": container with ID starting with 521fef34ecd1518206a611f9e60272cba8c1cef6325854f278c5e9a0c4e3a40d not found: ID does not exist" containerID="521fef34ecd1518206a611f9e60272cba8c1cef6325854f278c5e9a0c4e3a40d" Apr 22 18:51:35.744652 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.744588 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"521fef34ecd1518206a611f9e60272cba8c1cef6325854f278c5e9a0c4e3a40d"} err="failed to get container status \"521fef34ecd1518206a611f9e60272cba8c1cef6325854f278c5e9a0c4e3a40d\": rpc error: code = 
NotFound desc = could not find container \"521fef34ecd1518206a611f9e60272cba8c1cef6325854f278c5e9a0c4e3a40d\": container with ID starting with 521fef34ecd1518206a611f9e60272cba8c1cef6325854f278c5e9a0c4e3a40d not found: ID does not exist" Apr 22 18:51:35.755481 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.755434 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84f97f657c-wsmcp"] Apr 22 18:51:35.758703 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:35.758684 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-84f97f657c-wsmcp"] Apr 22 18:51:36.829708 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:36.829671 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2d9a9c5-0b0b-4cfa-a570-c855f684acfb" path="/var/lib/kubelet/pods/d2d9a9c5-0b0b-4cfa-a570-c855f684acfb/volumes" Apr 22 18:51:38.690326 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:38.690301 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4jzwx_4871a747-c041-430f-9c1a-3ab2ce6d0156/console-operator/1.log" Apr 22 18:51:38.690781 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:38.690365 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4jzwx_4871a747-c041-430f-9c1a-3ab2ce6d0156/console-operator/1.log" Apr 22 18:51:38.699210 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:38.699191 2566 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:51:40.566557 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.566522 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn"] Apr 22 18:51:40.568066 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.566861 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2d9a9c5-0b0b-4cfa-a570-c855f684acfb" 
containerName="console" Apr 22 18:51:40.568066 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.566872 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d9a9c5-0b0b-4cfa-a570-c855f684acfb" containerName="console" Apr 22 18:51:40.568066 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.566915 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2d9a9c5-0b0b-4cfa-a570-c855f684acfb" containerName="console" Apr 22 18:51:40.568782 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.568766 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" Apr 22 18:51:40.570961 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.570917 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:51:40.571058 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.570917 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:51:40.571850 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.571829 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-h655x\"" Apr 22 18:51:40.577129 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.577107 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn"] Apr 22 18:51:40.658106 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.658078 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4526aa9c-a829-456e-b2fe-09086950b189-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn\" (UID: \"4526aa9c-a829-456e-b2fe-09086950b189\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" Apr 22 18:51:40.658106 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.658107 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4526aa9c-a829-456e-b2fe-09086950b189-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn\" (UID: \"4526aa9c-a829-456e-b2fe-09086950b189\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" Apr 22 18:51:40.658259 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.658159 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcvl4\" (UniqueName: \"kubernetes.io/projected/4526aa9c-a829-456e-b2fe-09086950b189-kube-api-access-lcvl4\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn\" (UID: \"4526aa9c-a829-456e-b2fe-09086950b189\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" Apr 22 18:51:40.758749 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.758717 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcvl4\" (UniqueName: \"kubernetes.io/projected/4526aa9c-a829-456e-b2fe-09086950b189-kube-api-access-lcvl4\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn\" (UID: \"4526aa9c-a829-456e-b2fe-09086950b189\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" Apr 22 18:51:40.758892 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.758790 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4526aa9c-a829-456e-b2fe-09086950b189-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn\" (UID: \"4526aa9c-a829-456e-b2fe-09086950b189\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" Apr 22 18:51:40.758892 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.758820 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4526aa9c-a829-456e-b2fe-09086950b189-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn\" (UID: \"4526aa9c-a829-456e-b2fe-09086950b189\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" Apr 22 18:51:40.759189 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.759168 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4526aa9c-a829-456e-b2fe-09086950b189-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn\" (UID: \"4526aa9c-a829-456e-b2fe-09086950b189\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" Apr 22 18:51:40.759235 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.759203 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4526aa9c-a829-456e-b2fe-09086950b189-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn\" (UID: \"4526aa9c-a829-456e-b2fe-09086950b189\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" Apr 22 18:51:40.766765 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.766741 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcvl4\" (UniqueName: \"kubernetes.io/projected/4526aa9c-a829-456e-b2fe-09086950b189-kube-api-access-lcvl4\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn\" (UID: \"4526aa9c-a829-456e-b2fe-09086950b189\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" Apr 22 
18:51:40.879102 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.879076 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" Apr 22 18:51:40.994298 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.994276 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn"] Apr 22 18:51:40.996384 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:51:40.996355 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4526aa9c_a829_456e_b2fe_09086950b189.slice/crio-79aba64da18568845a0d9dd79bcf916f212c0a8571557099eab4056c31456228 WatchSource:0}: Error finding container 79aba64da18568845a0d9dd79bcf916f212c0a8571557099eab4056c31456228: Status 404 returned error can't find the container with id 79aba64da18568845a0d9dd79bcf916f212c0a8571557099eab4056c31456228 Apr 22 18:51:40.998259 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:40.998244 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:51:41.755542 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:41.755511 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" event={"ID":"4526aa9c-a829-456e-b2fe-09086950b189","Type":"ContainerStarted","Data":"79aba64da18568845a0d9dd79bcf916f212c0a8571557099eab4056c31456228"} Apr 22 18:51:53.793054 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:53.792978 2566 generic.go:358] "Generic (PLEG): container finished" podID="4526aa9c-a829-456e-b2fe-09086950b189" containerID="66a11bbeb9b9626eb9e6be10278d70d61c8484c10e8b60f3ae328f28b89c2875" exitCode=0 Apr 22 18:51:53.793054 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:53.793017 2566 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" event={"ID":"4526aa9c-a829-456e-b2fe-09086950b189","Type":"ContainerDied","Data":"66a11bbeb9b9626eb9e6be10278d70d61c8484c10e8b60f3ae328f28b89c2875"} Apr 22 18:51:56.804530 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:56.804494 2566 generic.go:358] "Generic (PLEG): container finished" podID="4526aa9c-a829-456e-b2fe-09086950b189" containerID="b5c7d5c874de1e6d4aba313eb50e661329a903dd93e6ea4a175188b2323d2196" exitCode=0 Apr 22 18:51:56.805049 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:51:56.804582 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" event={"ID":"4526aa9c-a829-456e-b2fe-09086950b189","Type":"ContainerDied","Data":"b5c7d5c874de1e6d4aba313eb50e661329a903dd93e6ea4a175188b2323d2196"} Apr 22 18:52:02.824029 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:02.823996 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" event={"ID":"4526aa9c-a829-456e-b2fe-09086950b189","Type":"ContainerStarted","Data":"0239a6290e607c91b17604d71999a1db5db52a47be7e15fd1b7fecde60399094"} Apr 22 18:52:02.842952 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:02.842749 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" podStartSLOduration=1.122219488 podStartE2EDuration="22.842731805s" podCreationTimestamp="2026-04-22 18:51:40 +0000 UTC" firstStartedPulling="2026-04-22 18:51:40.998377037 +0000 UTC m=+302.777210501" lastFinishedPulling="2026-04-22 18:52:02.718889351 +0000 UTC m=+324.497722818" observedRunningTime="2026-04-22 18:52:02.840855699 +0000 UTC m=+324.619689186" watchObservedRunningTime="2026-04-22 18:52:02.842731805 +0000 UTC m=+324.621565292" Apr 22 18:52:03.829309 
ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:03.829272 2566 generic.go:358] "Generic (PLEG): container finished" podID="4526aa9c-a829-456e-b2fe-09086950b189" containerID="0239a6290e607c91b17604d71999a1db5db52a47be7e15fd1b7fecde60399094" exitCode=0 Apr 22 18:52:03.829651 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:03.829353 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" event={"ID":"4526aa9c-a829-456e-b2fe-09086950b189","Type":"ContainerDied","Data":"0239a6290e607c91b17604d71999a1db5db52a47be7e15fd1b7fecde60399094"} Apr 22 18:52:04.950378 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:04.950354 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" Apr 22 18:52:05.056760 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:05.056724 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4526aa9c-a829-456e-b2fe-09086950b189-util\") pod \"4526aa9c-a829-456e-b2fe-09086950b189\" (UID: \"4526aa9c-a829-456e-b2fe-09086950b189\") " Apr 22 18:52:05.056965 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:05.056780 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4526aa9c-a829-456e-b2fe-09086950b189-bundle\") pod \"4526aa9c-a829-456e-b2fe-09086950b189\" (UID: \"4526aa9c-a829-456e-b2fe-09086950b189\") " Apr 22 18:52:05.056965 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:05.056831 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcvl4\" (UniqueName: \"kubernetes.io/projected/4526aa9c-a829-456e-b2fe-09086950b189-kube-api-access-lcvl4\") pod \"4526aa9c-a829-456e-b2fe-09086950b189\" (UID: \"4526aa9c-a829-456e-b2fe-09086950b189\") " Apr 22 
18:52:05.057393 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:05.057363 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4526aa9c-a829-456e-b2fe-09086950b189-bundle" (OuterVolumeSpecName: "bundle") pod "4526aa9c-a829-456e-b2fe-09086950b189" (UID: "4526aa9c-a829-456e-b2fe-09086950b189"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:52:05.059005 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:05.058975 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4526aa9c-a829-456e-b2fe-09086950b189-kube-api-access-lcvl4" (OuterVolumeSpecName: "kube-api-access-lcvl4") pod "4526aa9c-a829-456e-b2fe-09086950b189" (UID: "4526aa9c-a829-456e-b2fe-09086950b189"). InnerVolumeSpecName "kube-api-access-lcvl4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:52:05.060890 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:05.060866 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4526aa9c-a829-456e-b2fe-09086950b189-util" (OuterVolumeSpecName: "util") pod "4526aa9c-a829-456e-b2fe-09086950b189" (UID: "4526aa9c-a829-456e-b2fe-09086950b189"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:52:05.157706 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:05.157684 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4526aa9c-a829-456e-b2fe-09086950b189-util\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:52:05.157706 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:05.157706 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4526aa9c-a829-456e-b2fe-09086950b189-bundle\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:52:05.157834 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:05.157717 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lcvl4\" (UniqueName: \"kubernetes.io/projected/4526aa9c-a829-456e-b2fe-09086950b189-kube-api-access-lcvl4\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:52:05.837057 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:05.837024 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" event={"ID":"4526aa9c-a829-456e-b2fe-09086950b189","Type":"ContainerDied","Data":"79aba64da18568845a0d9dd79bcf916f212c0a8571557099eab4056c31456228"} Apr 22 18:52:05.837057 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:05.837046 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cm8tvn" Apr 22 18:52:05.837057 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:05.837057 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79aba64da18568845a0d9dd79bcf916f212c0a8571557099eab4056c31456228" Apr 22 18:52:06.914969 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:06.914925 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ncrns"] Apr 22 18:52:06.915394 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:06.915244 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4526aa9c-a829-456e-b2fe-09086950b189" containerName="util" Apr 22 18:52:06.915394 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:06.915255 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="4526aa9c-a829-456e-b2fe-09086950b189" containerName="util" Apr 22 18:52:06.915394 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:06.915267 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4526aa9c-a829-456e-b2fe-09086950b189" containerName="extract" Apr 22 18:52:06.915394 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:06.915272 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="4526aa9c-a829-456e-b2fe-09086950b189" containerName="extract" Apr 22 18:52:06.915394 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:06.915287 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4526aa9c-a829-456e-b2fe-09086950b189" containerName="pull" Apr 22 18:52:06.915394 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:06.915292 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="4526aa9c-a829-456e-b2fe-09086950b189" containerName="pull" Apr 22 18:52:06.915394 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:06.915339 2566 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="4526aa9c-a829-456e-b2fe-09086950b189" containerName="extract" Apr 22 18:52:06.960288 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:06.960261 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ncrns"] Apr 22 18:52:06.960423 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:06.960371 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ncrns" Apr 22 18:52:06.962678 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:06.962655 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 18:52:06.962801 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:06.962686 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 18:52:06.962801 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:06.962702 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-8z9qw\"" Apr 22 18:52:06.962916 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:06.962825 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 18:52:07.076118 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:07.076088 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj2dh\" (UniqueName: \"kubernetes.io/projected/ef41e668-ef06-4c12-a7c7-851cb039c728-kube-api-access-sj2dh\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ncrns\" (UID: \"ef41e668-ef06-4c12-a7c7-851cb039c728\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ncrns" Apr 22 18:52:07.076271 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:07.076127 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/ef41e668-ef06-4c12-a7c7-851cb039c728-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ncrns\" (UID: \"ef41e668-ef06-4c12-a7c7-851cb039c728\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ncrns" Apr 22 18:52:07.176698 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:07.176620 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sj2dh\" (UniqueName: \"kubernetes.io/projected/ef41e668-ef06-4c12-a7c7-851cb039c728-kube-api-access-sj2dh\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ncrns\" (UID: \"ef41e668-ef06-4c12-a7c7-851cb039c728\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ncrns" Apr 22 18:52:07.176698 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:07.176658 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/ef41e668-ef06-4c12-a7c7-851cb039c728-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ncrns\" (UID: \"ef41e668-ef06-4c12-a7c7-851cb039c728\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ncrns" Apr 22 18:52:07.178861 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:07.178840 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/ef41e668-ef06-4c12-a7c7-851cb039c728-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ncrns\" (UID: \"ef41e668-ef06-4c12-a7c7-851cb039c728\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ncrns" Apr 22 18:52:07.185277 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:07.185248 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj2dh\" (UniqueName: 
\"kubernetes.io/projected/ef41e668-ef06-4c12-a7c7-851cb039c728-kube-api-access-sj2dh\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ncrns\" (UID: \"ef41e668-ef06-4c12-a7c7-851cb039c728\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ncrns" Apr 22 18:52:07.270707 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:07.270683 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ncrns" Apr 22 18:52:07.387692 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:07.387667 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ncrns"] Apr 22 18:52:07.390438 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:52:07.390411 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef41e668_ef06_4c12_a7c7_851cb039c728.slice/crio-7bd0a41dab9d05cb104d9ac3d71d7cb90a3b0616d6d3a9b098d5b76106c70d6e WatchSource:0}: Error finding container 7bd0a41dab9d05cb104d9ac3d71d7cb90a3b0616d6d3a9b098d5b76106c70d6e: Status 404 returned error can't find the container with id 7bd0a41dab9d05cb104d9ac3d71d7cb90a3b0616d6d3a9b098d5b76106c70d6e Apr 22 18:52:07.844134 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:07.844099 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ncrns" event={"ID":"ef41e668-ef06-4c12-a7c7-851cb039c728","Type":"ContainerStarted","Data":"7bd0a41dab9d05cb104d9ac3d71d7cb90a3b0616d6d3a9b098d5b76106c70d6e"} Apr 22 18:52:10.855883 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:10.855843 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ncrns" event={"ID":"ef41e668-ef06-4c12-a7c7-851cb039c728","Type":"ContainerStarted","Data":"9fcef6e50976c35b324701d8890258ffa75c00b938aa729f99b518e18f49a0f1"} Apr 22 
18:52:10.856215 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:10.855972 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ncrns" Apr 22 18:52:10.874699 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:10.874650 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ncrns" podStartSLOduration=1.506883111 podStartE2EDuration="4.874638821s" podCreationTimestamp="2026-04-22 18:52:06 +0000 UTC" firstStartedPulling="2026-04-22 18:52:07.392287065 +0000 UTC m=+329.171120532" lastFinishedPulling="2026-04-22 18:52:10.760042774 +0000 UTC m=+332.538876242" observedRunningTime="2026-04-22 18:52:10.872564787 +0000 UTC m=+332.651398273" watchObservedRunningTime="2026-04-22 18:52:10.874638821 +0000 UTC m=+332.653472349" Apr 22 18:52:11.240391 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.240323 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-qt7q5"] Apr 22 18:52:11.243625 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.243606 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" Apr 22 18:52:11.245861 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.245840 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 22 18:52:11.246028 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.245870 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 18:52:11.246028 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.245892 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-rb9zh\"" Apr 22 18:52:11.250906 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.250889 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-qt7q5"] Apr 22 18:52:11.311624 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.311586 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-certificates\") pod \"keda-operator-ffbb595cb-qt7q5\" (UID: \"512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e\") " pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" Apr 22 18:52:11.311761 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.311634 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvcbw\" (UniqueName: \"kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-kube-api-access-nvcbw\") pod \"keda-operator-ffbb595cb-qt7q5\" (UID: \"512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e\") " pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" Apr 22 18:52:11.311761 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.311664 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: 
\"kubernetes.io/configmap/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-cabundle0\") pod \"keda-operator-ffbb595cb-qt7q5\" (UID: \"512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e\") " pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" Apr 22 18:52:11.412546 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.412515 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-certificates\") pod \"keda-operator-ffbb595cb-qt7q5\" (UID: \"512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e\") " pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" Apr 22 18:52:11.412700 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.412561 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvcbw\" (UniqueName: \"kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-kube-api-access-nvcbw\") pod \"keda-operator-ffbb595cb-qt7q5\" (UID: \"512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e\") " pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" Apr 22 18:52:11.412700 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.412587 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-cabundle0\") pod \"keda-operator-ffbb595cb-qt7q5\" (UID: \"512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e\") " pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" Apr 22 18:52:11.412700 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:11.412679 2566 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:52:11.412819 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:11.412702 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:52:11.412819 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:11.412714 2566 projected.go:194] Error preparing data for projected volume certificates for pod 
openshift-keda/keda-operator-ffbb595cb-qt7q5: references non-existent secret key: ca.crt Apr 22 18:52:11.412819 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:11.412783 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-certificates podName:512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e nodeName:}" failed. No retries permitted until 2026-04-22 18:52:11.912762198 +0000 UTC m=+333.691595675 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-certificates") pod "keda-operator-ffbb595cb-qt7q5" (UID: "512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e") : references non-existent secret key: ca.crt Apr 22 18:52:11.413296 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.413280 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-cabundle0\") pod \"keda-operator-ffbb595cb-qt7q5\" (UID: \"512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e\") " pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" Apr 22 18:52:11.420975 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.420927 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvcbw\" (UniqueName: \"kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-kube-api-access-nvcbw\") pod \"keda-operator-ffbb595cb-qt7q5\" (UID: \"512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e\") " pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" Apr 22 18:52:11.719617 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.719569 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-rkpph"] Apr 22 18:52:11.722924 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.722906 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-rkpph" Apr 22 18:52:11.725117 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.725096 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 22 18:52:11.730174 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.730153 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-rkpph"] Apr 22 18:52:11.816543 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.816502 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0179f270-8f72-4962-a9be-b137148c6bcf-certificates\") pod \"keda-admission-cf49989db-rkpph\" (UID: \"0179f270-8f72-4962-a9be-b137148c6bcf\") " pod="openshift-keda/keda-admission-cf49989db-rkpph" Apr 22 18:52:11.816720 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.816553 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj76v\" (UniqueName: \"kubernetes.io/projected/0179f270-8f72-4962-a9be-b137148c6bcf-kube-api-access-mj76v\") pod \"keda-admission-cf49989db-rkpph\" (UID: \"0179f270-8f72-4962-a9be-b137148c6bcf\") " pod="openshift-keda/keda-admission-cf49989db-rkpph" Apr 22 18:52:11.917403 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.917367 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-certificates\") pod \"keda-operator-ffbb595cb-qt7q5\" (UID: \"512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e\") " pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" Apr 22 18:52:11.917869 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:11.917525 2566 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:52:11.917869 ip-10-0-130-234 kubenswrapper[2566]: E0422 
18:52:11.917548 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:52:11.917869 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:11.917560 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-qt7q5: references non-existent secret key: ca.crt Apr 22 18:52:11.917869 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:11.917618 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-certificates podName:512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e nodeName:}" failed. No retries permitted until 2026-04-22 18:52:12.917598187 +0000 UTC m=+334.696431653 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-certificates") pod "keda-operator-ffbb595cb-qt7q5" (UID: "512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e") : references non-existent secret key: ca.crt Apr 22 18:52:11.917869 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.917644 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0179f270-8f72-4962-a9be-b137148c6bcf-certificates\") pod \"keda-admission-cf49989db-rkpph\" (UID: \"0179f270-8f72-4962-a9be-b137148c6bcf\") " pod="openshift-keda/keda-admission-cf49989db-rkpph" Apr 22 18:52:11.917869 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.917679 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mj76v\" (UniqueName: \"kubernetes.io/projected/0179f270-8f72-4962-a9be-b137148c6bcf-kube-api-access-mj76v\") pod \"keda-admission-cf49989db-rkpph\" (UID: \"0179f270-8f72-4962-a9be-b137148c6bcf\") " pod="openshift-keda/keda-admission-cf49989db-rkpph" Apr 22 18:52:11.917869 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:11.917742 2566 
projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 22 18:52:11.917869 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:11.917761 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-rkpph: secret "keda-admission-webhooks-certs" not found Apr 22 18:52:11.918195 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:11.917899 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0179f270-8f72-4962-a9be-b137148c6bcf-certificates podName:0179f270-8f72-4962-a9be-b137148c6bcf nodeName:}" failed. No retries permitted until 2026-04-22 18:52:12.417875087 +0000 UTC m=+334.196708556 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0179f270-8f72-4962-a9be-b137148c6bcf-certificates") pod "keda-admission-cf49989db-rkpph" (UID: "0179f270-8f72-4962-a9be-b137148c6bcf") : secret "keda-admission-webhooks-certs" not found Apr 22 18:52:11.926790 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:11.926766 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj76v\" (UniqueName: \"kubernetes.io/projected/0179f270-8f72-4962-a9be-b137148c6bcf-kube-api-access-mj76v\") pod \"keda-admission-cf49989db-rkpph\" (UID: \"0179f270-8f72-4962-a9be-b137148c6bcf\") " pod="openshift-keda/keda-admission-cf49989db-rkpph" Apr 22 18:52:12.421553 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:12.421521 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0179f270-8f72-4962-a9be-b137148c6bcf-certificates\") pod \"keda-admission-cf49989db-rkpph\" (UID: \"0179f270-8f72-4962-a9be-b137148c6bcf\") " pod="openshift-keda/keda-admission-cf49989db-rkpph" Apr 22 18:52:12.424035 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:12.424011 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0179f270-8f72-4962-a9be-b137148c6bcf-certificates\") pod \"keda-admission-cf49989db-rkpph\" (UID: \"0179f270-8f72-4962-a9be-b137148c6bcf\") " pod="openshift-keda/keda-admission-cf49989db-rkpph" Apr 22 18:52:12.634793 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:12.634757 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-rkpph" Apr 22 18:52:12.759191 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:12.759166 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-rkpph"] Apr 22 18:52:12.761976 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:52:12.761930 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0179f270_8f72_4962_a9be_b137148c6bcf.slice/crio-419943dbd85c9df9971ad91b432c1064038c2438358964801ded9511a3bd99b8 WatchSource:0}: Error finding container 419943dbd85c9df9971ad91b432c1064038c2438358964801ded9511a3bd99b8: Status 404 returned error can't find the container with id 419943dbd85c9df9971ad91b432c1064038c2438358964801ded9511a3bd99b8 Apr 22 18:52:12.865006 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:12.864978 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-rkpph" event={"ID":"0179f270-8f72-4962-a9be-b137148c6bcf","Type":"ContainerStarted","Data":"419943dbd85c9df9971ad91b432c1064038c2438358964801ded9511a3bd99b8"} Apr 22 18:52:12.925510 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:12.925482 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-certificates\") pod \"keda-operator-ffbb595cb-qt7q5\" (UID: \"512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e\") " 
pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" Apr 22 18:52:12.925834 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:12.925607 2566 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:52:12.925834 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:12.925619 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:52:12.925834 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:12.925628 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-qt7q5: references non-existent secret key: ca.crt Apr 22 18:52:12.925834 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:12.925676 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-certificates podName:512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e nodeName:}" failed. No retries permitted until 2026-04-22 18:52:14.925658328 +0000 UTC m=+336.704491792 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-certificates") pod "keda-operator-ffbb595cb-qt7q5" (UID: "512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e") : references non-existent secret key: ca.crt Apr 22 18:52:14.877736 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:14.877696 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-rkpph" event={"ID":"0179f270-8f72-4962-a9be-b137148c6bcf","Type":"ContainerStarted","Data":"e114be4cfedd7bced8a054abcf3c2aa037ff69c2a8795dcf18243e4c4d7c74b7"} Apr 22 18:52:14.878164 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:14.877788 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-rkpph" Apr 22 18:52:14.891518 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:14.891467 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-rkpph" podStartSLOduration=2.090769731 podStartE2EDuration="3.891453925s" podCreationTimestamp="2026-04-22 18:52:11 +0000 UTC" firstStartedPulling="2026-04-22 18:52:12.763222135 +0000 UTC m=+334.542055602" lastFinishedPulling="2026-04-22 18:52:14.563906328 +0000 UTC m=+336.342739796" observedRunningTime="2026-04-22 18:52:14.891018514 +0000 UTC m=+336.669852001" watchObservedRunningTime="2026-04-22 18:52:14.891453925 +0000 UTC m=+336.670287411" Apr 22 18:52:14.943592 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:14.943563 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-certificates\") pod \"keda-operator-ffbb595cb-qt7q5\" (UID: \"512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e\") " pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" Apr 22 18:52:14.943719 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:14.943707 2566 secret.go:281] references 
non-existent secret key: ca.crt Apr 22 18:52:14.943762 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:14.943722 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:52:14.943762 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:14.943733 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-qt7q5: references non-existent secret key: ca.crt Apr 22 18:52:14.943832 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:52:14.943794 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-certificates podName:512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e nodeName:}" failed. No retries permitted until 2026-04-22 18:52:18.943779212 +0000 UTC m=+340.722612685 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-certificates") pod "keda-operator-ffbb595cb-qt7q5" (UID: "512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e") : references non-existent secret key: ca.crt Apr 22 18:52:18.979440 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:18.979410 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-certificates\") pod \"keda-operator-ffbb595cb-qt7q5\" (UID: \"512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e\") " pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" Apr 22 18:52:18.981807 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:18.981774 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e-certificates\") pod \"keda-operator-ffbb595cb-qt7q5\" (UID: \"512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e\") " pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" Apr 22 
18:52:19.054788 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:19.054752 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" Apr 22 18:52:19.171641 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:19.171614 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-qt7q5"] Apr 22 18:52:19.173781 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:52:19.173752 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod512e449b_aed5_4c0e_b7b7_2a9b86aa7a2e.slice/crio-3b720fd446e5eeec32a56166438452a2b24dc4285556a3c2f56cd7301bc6169b WatchSource:0}: Error finding container 3b720fd446e5eeec32a56166438452a2b24dc4285556a3c2f56cd7301bc6169b: Status 404 returned error can't find the container with id 3b720fd446e5eeec32a56166438452a2b24dc4285556a3c2f56cd7301bc6169b Apr 22 18:52:19.894159 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:19.894115 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" event={"ID":"512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e","Type":"ContainerStarted","Data":"3b720fd446e5eeec32a56166438452a2b24dc4285556a3c2f56cd7301bc6169b"} Apr 22 18:52:23.909509 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:23.909473 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" event={"ID":"512e449b-aed5-4c0e-b7b7-2a9b86aa7a2e","Type":"ContainerStarted","Data":"a2142f80ca29b076d78a99122c98b8a2628889fabaa56345b2db0809bfbbf269"} Apr 22 18:52:23.909873 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:23.909607 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" Apr 22 18:52:23.929715 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:23.929670 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" podStartSLOduration=9.065430338 podStartE2EDuration="12.929658876s" podCreationTimestamp="2026-04-22 18:52:11 +0000 UTC" firstStartedPulling="2026-04-22 18:52:19.175440091 +0000 UTC m=+340.954273556" lastFinishedPulling="2026-04-22 18:52:23.039668627 +0000 UTC m=+344.818502094" observedRunningTime="2026-04-22 18:52:23.928658509 +0000 UTC m=+345.707491996" watchObservedRunningTime="2026-04-22 18:52:23.929658876 +0000 UTC m=+345.708492361" Apr 22 18:52:31.862981 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:31.862891 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ncrns" Apr 22 18:52:35.882536 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:35.882504 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-rkpph" Apr 22 18:52:44.914651 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:52:44.914620 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-qt7q5" Apr 22 18:53:06.760390 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:06.760358 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw"] Apr 22 18:53:06.763921 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:06.763906 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" Apr 22 18:53:06.766135 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:06.766115 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:53:06.766230 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:06.766146 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-h655x\"" Apr 22 18:53:06.766914 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:06.766892 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:53:06.770655 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:06.770628 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw"] Apr 22 18:53:06.869756 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:06.869719 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6553bc66-13d6-4ed3-a71d-2d8004c73384-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw\" (UID: \"6553bc66-13d6-4ed3-a71d-2d8004c73384\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" Apr 22 18:53:06.869923 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:06.869788 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6553bc66-13d6-4ed3-a71d-2d8004c73384-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw\" (UID: \"6553bc66-13d6-4ed3-a71d-2d8004c73384\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" Apr 22 18:53:06.869923 
ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:06.869807 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scvcc\" (UniqueName: \"kubernetes.io/projected/6553bc66-13d6-4ed3-a71d-2d8004c73384-kube-api-access-scvcc\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw\" (UID: \"6553bc66-13d6-4ed3-a71d-2d8004c73384\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" Apr 22 18:53:06.970281 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:06.970237 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6553bc66-13d6-4ed3-a71d-2d8004c73384-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw\" (UID: \"6553bc66-13d6-4ed3-a71d-2d8004c73384\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" Apr 22 18:53:06.970281 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:06.970288 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scvcc\" (UniqueName: \"kubernetes.io/projected/6553bc66-13d6-4ed3-a71d-2d8004c73384-kube-api-access-scvcc\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw\" (UID: \"6553bc66-13d6-4ed3-a71d-2d8004c73384\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" Apr 22 18:53:06.970450 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:06.970332 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6553bc66-13d6-4ed3-a71d-2d8004c73384-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw\" (UID: \"6553bc66-13d6-4ed3-a71d-2d8004c73384\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" Apr 22 18:53:06.970669 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:53:06.970654 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6553bc66-13d6-4ed3-a71d-2d8004c73384-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw\" (UID: \"6553bc66-13d6-4ed3-a71d-2d8004c73384\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" Apr 22 18:53:06.970704 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:06.970659 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6553bc66-13d6-4ed3-a71d-2d8004c73384-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw\" (UID: \"6553bc66-13d6-4ed3-a71d-2d8004c73384\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" Apr 22 18:53:06.977594 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:06.977562 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scvcc\" (UniqueName: \"kubernetes.io/projected/6553bc66-13d6-4ed3-a71d-2d8004c73384-kube-api-access-scvcc\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw\" (UID: \"6553bc66-13d6-4ed3-a71d-2d8004c73384\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" Apr 22 18:53:07.074340 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:07.074272 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" Apr 22 18:53:07.189881 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:07.189850 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw"] Apr 22 18:53:07.193488 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:53:07.193458 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6553bc66_13d6_4ed3_a71d_2d8004c73384.slice/crio-8e8cff4ecfcda51738d4351056243d3bc9da8d07824cc8a81ab6ff88c9186ce7 WatchSource:0}: Error finding container 8e8cff4ecfcda51738d4351056243d3bc9da8d07824cc8a81ab6ff88c9186ce7: Status 404 returned error can't find the container with id 8e8cff4ecfcda51738d4351056243d3bc9da8d07824cc8a81ab6ff88c9186ce7 Apr 22 18:53:08.050722 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:08.050690 2566 generic.go:358] "Generic (PLEG): container finished" podID="6553bc66-13d6-4ed3-a71d-2d8004c73384" containerID="7c43c7ae1cd32197fcad7c25785434e8dca3b565864d92a4c945c4122f271759" exitCode=0 Apr 22 18:53:08.051107 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:08.050750 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" event={"ID":"6553bc66-13d6-4ed3-a71d-2d8004c73384","Type":"ContainerDied","Data":"7c43c7ae1cd32197fcad7c25785434e8dca3b565864d92a4c945c4122f271759"} Apr 22 18:53:08.051107 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:08.050776 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" event={"ID":"6553bc66-13d6-4ed3-a71d-2d8004c73384","Type":"ContainerStarted","Data":"8e8cff4ecfcda51738d4351056243d3bc9da8d07824cc8a81ab6ff88c9186ce7"} Apr 22 18:53:11.062660 ip-10-0-130-234 kubenswrapper[2566]: 
I0422 18:53:11.062625 2566 generic.go:358] "Generic (PLEG): container finished" podID="6553bc66-13d6-4ed3-a71d-2d8004c73384" containerID="726e198c1963f1c053865b49348afa59f81ae366b272ea1fbda473f181d3f3df" exitCode=0 Apr 22 18:53:11.063125 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:11.062685 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" event={"ID":"6553bc66-13d6-4ed3-a71d-2d8004c73384","Type":"ContainerDied","Data":"726e198c1963f1c053865b49348afa59f81ae366b272ea1fbda473f181d3f3df"} Apr 22 18:53:12.069783 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:12.069751 2566 generic.go:358] "Generic (PLEG): container finished" podID="6553bc66-13d6-4ed3-a71d-2d8004c73384" containerID="4454705f4fbbd0b3eeafa287520b08dc45a6f172f697d3180006d56f58933e74" exitCode=0 Apr 22 18:53:12.070182 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:12.069844 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" event={"ID":"6553bc66-13d6-4ed3-a71d-2d8004c73384","Type":"ContainerDied","Data":"4454705f4fbbd0b3eeafa287520b08dc45a6f172f697d3180006d56f58933e74"} Apr 22 18:53:13.192725 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:13.192697 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" Apr 22 18:53:13.325661 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:13.325580 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6553bc66-13d6-4ed3-a71d-2d8004c73384-bundle\") pod \"6553bc66-13d6-4ed3-a71d-2d8004c73384\" (UID: \"6553bc66-13d6-4ed3-a71d-2d8004c73384\") " Apr 22 18:53:13.325800 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:13.325681 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6553bc66-13d6-4ed3-a71d-2d8004c73384-util\") pod \"6553bc66-13d6-4ed3-a71d-2d8004c73384\" (UID: \"6553bc66-13d6-4ed3-a71d-2d8004c73384\") " Apr 22 18:53:13.325800 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:13.325719 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scvcc\" (UniqueName: \"kubernetes.io/projected/6553bc66-13d6-4ed3-a71d-2d8004c73384-kube-api-access-scvcc\") pod \"6553bc66-13d6-4ed3-a71d-2d8004c73384\" (UID: \"6553bc66-13d6-4ed3-a71d-2d8004c73384\") " Apr 22 18:53:13.326337 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:13.326310 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6553bc66-13d6-4ed3-a71d-2d8004c73384-bundle" (OuterVolumeSpecName: "bundle") pod "6553bc66-13d6-4ed3-a71d-2d8004c73384" (UID: "6553bc66-13d6-4ed3-a71d-2d8004c73384"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:13.327777 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:13.327749 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6553bc66-13d6-4ed3-a71d-2d8004c73384-kube-api-access-scvcc" (OuterVolumeSpecName: "kube-api-access-scvcc") pod "6553bc66-13d6-4ed3-a71d-2d8004c73384" (UID: "6553bc66-13d6-4ed3-a71d-2d8004c73384"). InnerVolumeSpecName "kube-api-access-scvcc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:13.330218 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:13.330195 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6553bc66-13d6-4ed3-a71d-2d8004c73384-util" (OuterVolumeSpecName: "util") pod "6553bc66-13d6-4ed3-a71d-2d8004c73384" (UID: "6553bc66-13d6-4ed3-a71d-2d8004c73384"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:13.426293 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:13.426265 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-scvcc\" (UniqueName: \"kubernetes.io/projected/6553bc66-13d6-4ed3-a71d-2d8004c73384-kube-api-access-scvcc\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:53:13.426293 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:13.426291 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6553bc66-13d6-4ed3-a71d-2d8004c73384-bundle\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:53:13.426293 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:13.426301 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6553bc66-13d6-4ed3-a71d-2d8004c73384-util\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:53:14.078748 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:14.078711 2566 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" event={"ID":"6553bc66-13d6-4ed3-a71d-2d8004c73384","Type":"ContainerDied","Data":"8e8cff4ecfcda51738d4351056243d3bc9da8d07824cc8a81ab6ff88c9186ce7"} Apr 22 18:53:14.078748 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:14.078732 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dsxlhw" Apr 22 18:53:14.078748 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:14.078749 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e8cff4ecfcda51738d4351056243d3bc9da8d07824cc8a81ab6ff88c9186ce7" Apr 22 18:53:27.094424 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.094391 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-tbs64"] Apr 22 18:53:27.094865 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.094758 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6553bc66-13d6-4ed3-a71d-2d8004c73384" containerName="util" Apr 22 18:53:27.094865 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.094770 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6553bc66-13d6-4ed3-a71d-2d8004c73384" containerName="util" Apr 22 18:53:27.094865 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.094783 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6553bc66-13d6-4ed3-a71d-2d8004c73384" containerName="pull" Apr 22 18:53:27.094865 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.094788 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6553bc66-13d6-4ed3-a71d-2d8004c73384" containerName="pull" Apr 22 18:53:27.094865 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.094795 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="6553bc66-13d6-4ed3-a71d-2d8004c73384" containerName="extract" Apr 22 18:53:27.094865 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.094801 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6553bc66-13d6-4ed3-a71d-2d8004c73384" containerName="extract" Apr 22 18:53:27.094865 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.094853 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6553bc66-13d6-4ed3-a71d-2d8004c73384" containerName="extract" Apr 22 18:53:27.099115 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.099098 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-tbs64" Apr 22 18:53:27.101444 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.101425 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 18:53:27.102487 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.102472 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 18:53:27.102557 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.102472 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-64s79\"" Apr 22 18:53:27.107722 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.107699 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-tbs64"] Apr 22 18:53:27.242648 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.242607 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xhvw\" (UniqueName: \"kubernetes.io/projected/ed8def6b-56ab-4d36-a783-6e6e3252724a-kube-api-access-5xhvw\") pod \"cert-manager-webhook-587ccfb98-tbs64\" (UID: \"ed8def6b-56ab-4d36-a783-6e6e3252724a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tbs64" Apr 22 
18:53:27.242829 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.242717 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ed8def6b-56ab-4d36-a783-6e6e3252724a-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-tbs64\" (UID: \"ed8def6b-56ab-4d36-a783-6e6e3252724a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tbs64" Apr 22 18:53:27.344046 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.344015 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xhvw\" (UniqueName: \"kubernetes.io/projected/ed8def6b-56ab-4d36-a783-6e6e3252724a-kube-api-access-5xhvw\") pod \"cert-manager-webhook-587ccfb98-tbs64\" (UID: \"ed8def6b-56ab-4d36-a783-6e6e3252724a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tbs64" Apr 22 18:53:27.344197 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.344066 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ed8def6b-56ab-4d36-a783-6e6e3252724a-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-tbs64\" (UID: \"ed8def6b-56ab-4d36-a783-6e6e3252724a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tbs64" Apr 22 18:53:27.351858 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.351793 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ed8def6b-56ab-4d36-a783-6e6e3252724a-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-tbs64\" (UID: \"ed8def6b-56ab-4d36-a783-6e6e3252724a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tbs64" Apr 22 18:53:27.351989 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.351889 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xhvw\" (UniqueName: 
\"kubernetes.io/projected/ed8def6b-56ab-4d36-a783-6e6e3252724a-kube-api-access-5xhvw\") pod \"cert-manager-webhook-587ccfb98-tbs64\" (UID: \"ed8def6b-56ab-4d36-a783-6e6e3252724a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tbs64" Apr 22 18:53:27.423763 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.423731 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-tbs64" Apr 22 18:53:27.540229 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:27.540204 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-tbs64"] Apr 22 18:53:27.543015 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:53:27.542991 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded8def6b_56ab_4d36_a783_6e6e3252724a.slice/crio-8cd87c22d8d837ab90e0080013fcac1f3963972e6e242568dbce1ede797c7bd0 WatchSource:0}: Error finding container 8cd87c22d8d837ab90e0080013fcac1f3963972e6e242568dbce1ede797c7bd0: Status 404 returned error can't find the container with id 8cd87c22d8d837ab90e0080013fcac1f3963972e6e242568dbce1ede797c7bd0 Apr 22 18:53:28.125364 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:28.125331 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-tbs64" event={"ID":"ed8def6b-56ab-4d36-a783-6e6e3252724a","Type":"ContainerStarted","Data":"8cd87c22d8d837ab90e0080013fcac1f3963972e6e242568dbce1ede797c7bd0"} Apr 22 18:53:30.946534 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:30.946497 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv"] Apr 22 18:53:30.949816 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:30.949796 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" Apr 22 18:53:30.952216 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:30.952193 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:53:30.952307 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:30.952218 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:53:30.953040 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:30.953026 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-h655x\"" Apr 22 18:53:30.957976 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:30.957950 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv"] Apr 22 18:53:30.972636 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:30.972608 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk62w\" (UniqueName: \"kubernetes.io/projected/a73d4935-e764-4bbf-9615-47e46e549614-kube-api-access-sk62w\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv\" (UID: \"a73d4935-e764-4bbf-9615-47e46e549614\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" Apr 22 18:53:30.972791 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:30.972649 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a73d4935-e764-4bbf-9615-47e46e549614-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv\" (UID: \"a73d4935-e764-4bbf-9615-47e46e549614\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" Apr 22 18:53:30.972791 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:30.972721 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a73d4935-e764-4bbf-9615-47e46e549614-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv\" (UID: \"a73d4935-e764-4bbf-9615-47e46e549614\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" Apr 22 18:53:31.073526 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:31.073489 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sk62w\" (UniqueName: \"kubernetes.io/projected/a73d4935-e764-4bbf-9615-47e46e549614-kube-api-access-sk62w\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv\" (UID: \"a73d4935-e764-4bbf-9615-47e46e549614\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" Apr 22 18:53:31.073691 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:31.073537 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a73d4935-e764-4bbf-9615-47e46e549614-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv\" (UID: \"a73d4935-e764-4bbf-9615-47e46e549614\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" Apr 22 18:53:31.073691 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:31.073600 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a73d4935-e764-4bbf-9615-47e46e549614-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv\" (UID: \"a73d4935-e764-4bbf-9615-47e46e549614\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" Apr 22 18:53:31.074039 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:31.074019 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a73d4935-e764-4bbf-9615-47e46e549614-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv\" (UID: \"a73d4935-e764-4bbf-9615-47e46e549614\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" Apr 22 18:53:31.074085 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:31.074022 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a73d4935-e764-4bbf-9615-47e46e549614-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv\" (UID: \"a73d4935-e764-4bbf-9615-47e46e549614\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" Apr 22 18:53:31.083142 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:31.083115 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk62w\" (UniqueName: \"kubernetes.io/projected/a73d4935-e764-4bbf-9615-47e46e549614-kube-api-access-sk62w\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv\" (UID: \"a73d4935-e764-4bbf-9615-47e46e549614\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" Apr 22 18:53:31.138082 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:31.138047 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-tbs64" event={"ID":"ed8def6b-56ab-4d36-a783-6e6e3252724a","Type":"ContainerStarted","Data":"7afc592c19209ca0b7676b8bfed5814ef5436283cef1c6977d076294acc477a0"} Apr 22 18:53:31.138231 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:31.138171 2566 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-tbs64" Apr 22 18:53:31.152792 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:31.152752 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-tbs64" podStartSLOduration=1.3911258229999999 podStartE2EDuration="4.152741078s" podCreationTimestamp="2026-04-22 18:53:27 +0000 UTC" firstStartedPulling="2026-04-22 18:53:27.544961702 +0000 UTC m=+409.323795167" lastFinishedPulling="2026-04-22 18:53:30.306576945 +0000 UTC m=+412.085410422" observedRunningTime="2026-04-22 18:53:31.151776715 +0000 UTC m=+412.930610200" watchObservedRunningTime="2026-04-22 18:53:31.152741078 +0000 UTC m=+412.931574564" Apr 22 18:53:31.260276 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:31.260198 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" Apr 22 18:53:31.380391 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:31.380354 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv"] Apr 22 18:53:31.383458 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:53:31.383429 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda73d4935_e764_4bbf_9615_47e46e549614.slice/crio-9608e8c478c0976b1bd4c343a6c31b86c033c7ff19f09d8a6bd1332b76c8b244 WatchSource:0}: Error finding container 9608e8c478c0976b1bd4c343a6c31b86c033c7ff19f09d8a6bd1332b76c8b244: Status 404 returned error can't find the container with id 9608e8c478c0976b1bd4c343a6c31b86c033c7ff19f09d8a6bd1332b76c8b244 Apr 22 18:53:32.143087 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:32.143052 2566 generic.go:358] "Generic (PLEG): container finished" podID="a73d4935-e764-4bbf-9615-47e46e549614" 
containerID="4cd9d54c75e539951a4efd041cd118909f7190df57720b084ad4b14ab0ac2752" exitCode=0 Apr 22 18:53:32.143513 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:32.143110 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" event={"ID":"a73d4935-e764-4bbf-9615-47e46e549614","Type":"ContainerDied","Data":"4cd9d54c75e539951a4efd041cd118909f7190df57720b084ad4b14ab0ac2752"} Apr 22 18:53:32.143513 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:32.143153 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" event={"ID":"a73d4935-e764-4bbf-9615-47e46e549614","Type":"ContainerStarted","Data":"9608e8c478c0976b1bd4c343a6c31b86c033c7ff19f09d8a6bd1332b76c8b244"} Apr 22 18:53:35.154154 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:35.154125 2566 generic.go:358] "Generic (PLEG): container finished" podID="a73d4935-e764-4bbf-9615-47e46e549614" containerID="34afd52bf38ff0102dfb1d693238553d22f85d8d9d50ea6153662387936655af" exitCode=0 Apr 22 18:53:35.154525 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:35.154172 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" event={"ID":"a73d4935-e764-4bbf-9615-47e46e549614","Type":"ContainerDied","Data":"34afd52bf38ff0102dfb1d693238553d22f85d8d9d50ea6153662387936655af"} Apr 22 18:53:36.159477 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:36.159441 2566 generic.go:358] "Generic (PLEG): container finished" podID="a73d4935-e764-4bbf-9615-47e46e549614" containerID="53b99d7bc1567fc46cf78c7407898dc1f833ea6f7a333c7aae73b193c6140007" exitCode=0 Apr 22 18:53:36.159910 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:36.159525 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" event={"ID":"a73d4935-e764-4bbf-9615-47e46e549614","Type":"ContainerDied","Data":"53b99d7bc1567fc46cf78c7407898dc1f833ea6f7a333c7aae73b193c6140007"} Apr 22 18:53:37.144998 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:37.144968 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-tbs64" Apr 22 18:53:37.283802 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:37.283781 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" Apr 22 18:53:37.322788 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:37.322761 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a73d4935-e764-4bbf-9615-47e46e549614-util\") pod \"a73d4935-e764-4bbf-9615-47e46e549614\" (UID: \"a73d4935-e764-4bbf-9615-47e46e549614\") " Apr 22 18:53:37.322957 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:37.322852 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk62w\" (UniqueName: \"kubernetes.io/projected/a73d4935-e764-4bbf-9615-47e46e549614-kube-api-access-sk62w\") pod \"a73d4935-e764-4bbf-9615-47e46e549614\" (UID: \"a73d4935-e764-4bbf-9615-47e46e549614\") " Apr 22 18:53:37.322957 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:37.322886 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a73d4935-e764-4bbf-9615-47e46e549614-bundle\") pod \"a73d4935-e764-4bbf-9615-47e46e549614\" (UID: \"a73d4935-e764-4bbf-9615-47e46e549614\") " Apr 22 18:53:37.323325 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:37.323302 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a73d4935-e764-4bbf-9615-47e46e549614-bundle" (OuterVolumeSpecName: "bundle") pod "a73d4935-e764-4bbf-9615-47e46e549614" (UID: "a73d4935-e764-4bbf-9615-47e46e549614"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:37.325009 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:37.324987 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a73d4935-e764-4bbf-9615-47e46e549614-kube-api-access-sk62w" (OuterVolumeSpecName: "kube-api-access-sk62w") pod "a73d4935-e764-4bbf-9615-47e46e549614" (UID: "a73d4935-e764-4bbf-9615-47e46e549614"). InnerVolumeSpecName "kube-api-access-sk62w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:37.328319 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:37.328295 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a73d4935-e764-4bbf-9615-47e46e549614-util" (OuterVolumeSpecName: "util") pod "a73d4935-e764-4bbf-9615-47e46e549614" (UID: "a73d4935-e764-4bbf-9615-47e46e549614"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:37.423706 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:37.423643 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sk62w\" (UniqueName: \"kubernetes.io/projected/a73d4935-e764-4bbf-9615-47e46e549614-kube-api-access-sk62w\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:53:37.423706 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:37.423668 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a73d4935-e764-4bbf-9615-47e46e549614-bundle\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:53:37.423706 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:37.423677 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a73d4935-e764-4bbf-9615-47e46e549614-util\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:53:38.167706 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:38.167673 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" Apr 22 18:53:38.167872 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:38.167672 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fxdjwv" event={"ID":"a73d4935-e764-4bbf-9615-47e46e549614","Type":"ContainerDied","Data":"9608e8c478c0976b1bd4c343a6c31b86c033c7ff19f09d8a6bd1332b76c8b244"} Apr 22 18:53:38.167872 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:38.167778 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9608e8c478c0976b1bd4c343a6c31b86c033c7ff19f09d8a6bd1332b76c8b244" Apr 22 18:53:46.013389 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.013350 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-lqdcc"] Apr 22 18:53:46.013751 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.013676 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a73d4935-e764-4bbf-9615-47e46e549614" containerName="util" Apr 22 18:53:46.013751 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.013687 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73d4935-e764-4bbf-9615-47e46e549614" containerName="util" Apr 22 18:53:46.013751 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.013697 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a73d4935-e764-4bbf-9615-47e46e549614" containerName="pull" Apr 22 18:53:46.013751 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.013702 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73d4935-e764-4bbf-9615-47e46e549614" containerName="pull" Apr 22 18:53:46.013751 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.013717 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a73d4935-e764-4bbf-9615-47e46e549614" containerName="extract" 
Apr 22 18:53:46.013751 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.013723 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73d4935-e764-4bbf-9615-47e46e549614" containerName="extract" Apr 22 18:53:46.013959 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.013778 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="a73d4935-e764-4bbf-9615-47e46e549614" containerName="extract" Apr 22 18:53:46.022396 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.022371 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-lqdcc" Apr 22 18:53:46.022507 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.022442 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-lqdcc"] Apr 22 18:53:46.024614 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.024592 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-n6sqn\"" Apr 22 18:53:46.097530 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.097483 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd2dl\" (UniqueName: \"kubernetes.io/projected/3f441101-b233-420a-84e6-25da1e113868-kube-api-access-kd2dl\") pod \"cert-manager-79c8d999ff-lqdcc\" (UID: \"3f441101-b233-420a-84e6-25da1e113868\") " pod="cert-manager/cert-manager-79c8d999ff-lqdcc" Apr 22 18:53:46.097530 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.097529 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f441101-b233-420a-84e6-25da1e113868-bound-sa-token\") pod \"cert-manager-79c8d999ff-lqdcc\" (UID: \"3f441101-b233-420a-84e6-25da1e113868\") " pod="cert-manager/cert-manager-79c8d999ff-lqdcc" Apr 22 18:53:46.198872 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.198842 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kd2dl\" (UniqueName: \"kubernetes.io/projected/3f441101-b233-420a-84e6-25da1e113868-kube-api-access-kd2dl\") pod \"cert-manager-79c8d999ff-lqdcc\" (UID: \"3f441101-b233-420a-84e6-25da1e113868\") " pod="cert-manager/cert-manager-79c8d999ff-lqdcc" Apr 22 18:53:46.199010 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.198879 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f441101-b233-420a-84e6-25da1e113868-bound-sa-token\") pod \"cert-manager-79c8d999ff-lqdcc\" (UID: \"3f441101-b233-420a-84e6-25da1e113868\") " pod="cert-manager/cert-manager-79c8d999ff-lqdcc" Apr 22 18:53:46.206486 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.206456 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f441101-b233-420a-84e6-25da1e113868-bound-sa-token\") pod \"cert-manager-79c8d999ff-lqdcc\" (UID: \"3f441101-b233-420a-84e6-25da1e113868\") " pod="cert-manager/cert-manager-79c8d999ff-lqdcc" Apr 22 18:53:46.206614 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.206595 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd2dl\" (UniqueName: \"kubernetes.io/projected/3f441101-b233-420a-84e6-25da1e113868-kube-api-access-kd2dl\") pod \"cert-manager-79c8d999ff-lqdcc\" (UID: \"3f441101-b233-420a-84e6-25da1e113868\") " pod="cert-manager/cert-manager-79c8d999ff-lqdcc" Apr 22 18:53:46.332330 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.332250 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-lqdcc" Apr 22 18:53:46.451247 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:46.451214 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-lqdcc"] Apr 22 18:53:46.453680 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:53:46.453653 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f441101_b233_420a_84e6_25da1e113868.slice/crio-1d604ad6d8e7cc002fd16b61e9f455656f1d7fdacaf3831f60d0112c7e8dd2df WatchSource:0}: Error finding container 1d604ad6d8e7cc002fd16b61e9f455656f1d7fdacaf3831f60d0112c7e8dd2df: Status 404 returned error can't find the container with id 1d604ad6d8e7cc002fd16b61e9f455656f1d7fdacaf3831f60d0112c7e8dd2df Apr 22 18:53:47.198060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:47.198024 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-lqdcc" event={"ID":"3f441101-b233-420a-84e6-25da1e113868","Type":"ContainerStarted","Data":"8b423acc260ddf896d7d48accdb5fefac4bb53ea2ef356e2b691d00c7522adb7"} Apr 22 18:53:47.198060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:47.198058 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-lqdcc" event={"ID":"3f441101-b233-420a-84e6-25da1e113868","Type":"ContainerStarted","Data":"1d604ad6d8e7cc002fd16b61e9f455656f1d7fdacaf3831f60d0112c7e8dd2df"} Apr 22 18:53:47.216390 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:47.216342 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-lqdcc" podStartSLOduration=2.216326268 podStartE2EDuration="2.216326268s" podCreationTimestamp="2026-04-22 18:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:53:47.214917077 +0000 UTC 
m=+428.993750575" watchObservedRunningTime="2026-04-22 18:53:47.216326268 +0000 UTC m=+428.995159757" Apr 22 18:53:56.368880 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:56.368845 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8"] Apr 22 18:53:56.372655 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:56.372639 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" Apr 22 18:53:56.374838 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:56.374814 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:53:56.375674 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:56.375653 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:53:56.375674 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:56.375669 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-h655x\"" Apr 22 18:53:56.378493 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:56.378469 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8"] Apr 22 18:53:56.482817 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:56.482784 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/043c5e12-11a9-44ce-96d8-670fc42c3ad6-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8\" (UID: \"043c5e12-11a9-44ce-96d8-670fc42c3ad6\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" Apr 22 18:53:56.482817 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:53:56.482821 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/043c5e12-11a9-44ce-96d8-670fc42c3ad6-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8\" (UID: \"043c5e12-11a9-44ce-96d8-670fc42c3ad6\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" Apr 22 18:53:56.483025 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:56.482843 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsqvh\" (UniqueName: \"kubernetes.io/projected/043c5e12-11a9-44ce-96d8-670fc42c3ad6-kube-api-access-zsqvh\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8\" (UID: \"043c5e12-11a9-44ce-96d8-670fc42c3ad6\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" Apr 22 18:53:56.584181 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:56.584140 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/043c5e12-11a9-44ce-96d8-670fc42c3ad6-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8\" (UID: \"043c5e12-11a9-44ce-96d8-670fc42c3ad6\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" Apr 22 18:53:56.584332 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:56.584187 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/043c5e12-11a9-44ce-96d8-670fc42c3ad6-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8\" (UID: \"043c5e12-11a9-44ce-96d8-670fc42c3ad6\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" Apr 22 18:53:56.584332 ip-10-0-130-234 kubenswrapper[2566]: I0422 
18:53:56.584221 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsqvh\" (UniqueName: \"kubernetes.io/projected/043c5e12-11a9-44ce-96d8-670fc42c3ad6-kube-api-access-zsqvh\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8\" (UID: \"043c5e12-11a9-44ce-96d8-670fc42c3ad6\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" Apr 22 18:53:56.584621 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:56.584602 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/043c5e12-11a9-44ce-96d8-670fc42c3ad6-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8\" (UID: \"043c5e12-11a9-44ce-96d8-670fc42c3ad6\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" Apr 22 18:53:56.584665 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:56.584613 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/043c5e12-11a9-44ce-96d8-670fc42c3ad6-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8\" (UID: \"043c5e12-11a9-44ce-96d8-670fc42c3ad6\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" Apr 22 18:53:56.591759 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:56.591732 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsqvh\" (UniqueName: \"kubernetes.io/projected/043c5e12-11a9-44ce-96d8-670fc42c3ad6-kube-api-access-zsqvh\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8\" (UID: \"043c5e12-11a9-44ce-96d8-670fc42c3ad6\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" Apr 22 18:53:56.682892 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:56.682815 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" Apr 22 18:53:56.802219 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:56.802091 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8"] Apr 22 18:53:56.804960 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:53:56.804911 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod043c5e12_11a9_44ce_96d8_670fc42c3ad6.slice/crio-f18f8eac8430fafb54394c084e7de59a68cb6128b65a4248816d9d87504d7461 WatchSource:0}: Error finding container f18f8eac8430fafb54394c084e7de59a68cb6128b65a4248816d9d87504d7461: Status 404 returned error can't find the container with id f18f8eac8430fafb54394c084e7de59a68cb6128b65a4248816d9d87504d7461 Apr 22 18:53:57.234216 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:57.234181 2566 generic.go:358] "Generic (PLEG): container finished" podID="043c5e12-11a9-44ce-96d8-670fc42c3ad6" containerID="f6927303721b1bfc039fa24ebb034abbfa2ece24a284ef5456950ff3ce0b6be1" exitCode=0 Apr 22 18:53:57.234383 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:57.234261 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" event={"ID":"043c5e12-11a9-44ce-96d8-670fc42c3ad6","Type":"ContainerDied","Data":"f6927303721b1bfc039fa24ebb034abbfa2ece24a284ef5456950ff3ce0b6be1"} Apr 22 18:53:57.234383 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:57.234296 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" event={"ID":"043c5e12-11a9-44ce-96d8-670fc42c3ad6","Type":"ContainerStarted","Data":"f18f8eac8430fafb54394c084e7de59a68cb6128b65a4248816d9d87504d7461"} Apr 22 18:53:58.238797 ip-10-0-130-234 kubenswrapper[2566]: 
I0422 18:53:58.238765 2566 generic.go:358] "Generic (PLEG): container finished" podID="043c5e12-11a9-44ce-96d8-670fc42c3ad6" containerID="12117f8bb79ac652de35cc604a110fe244094cd22bec488240371d91bf666585" exitCode=0 Apr 22 18:53:58.239154 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:58.238846 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" event={"ID":"043c5e12-11a9-44ce-96d8-670fc42c3ad6","Type":"ContainerDied","Data":"12117f8bb79ac652de35cc604a110fe244094cd22bec488240371d91bf666585"} Apr 22 18:53:59.244012 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:59.243979 2566 generic.go:358] "Generic (PLEG): container finished" podID="043c5e12-11a9-44ce-96d8-670fc42c3ad6" containerID="2fd68b15e85e41606acb55f79fe4d83b265f159051a7f078be79c9d10166070d" exitCode=0 Apr 22 18:53:59.244374 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:53:59.244060 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" event={"ID":"043c5e12-11a9-44ce-96d8-670fc42c3ad6","Type":"ContainerDied","Data":"2fd68b15e85e41606acb55f79fe4d83b265f159051a7f078be79c9d10166070d"} Apr 22 18:54:00.374715 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:00.374693 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" Apr 22 18:54:00.518051 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:00.517963 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/043c5e12-11a9-44ce-96d8-670fc42c3ad6-bundle\") pod \"043c5e12-11a9-44ce-96d8-670fc42c3ad6\" (UID: \"043c5e12-11a9-44ce-96d8-670fc42c3ad6\") " Apr 22 18:54:00.518051 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:00.518020 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsqvh\" (UniqueName: \"kubernetes.io/projected/043c5e12-11a9-44ce-96d8-670fc42c3ad6-kube-api-access-zsqvh\") pod \"043c5e12-11a9-44ce-96d8-670fc42c3ad6\" (UID: \"043c5e12-11a9-44ce-96d8-670fc42c3ad6\") " Apr 22 18:54:00.518279 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:00.518112 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/043c5e12-11a9-44ce-96d8-670fc42c3ad6-util\") pod \"043c5e12-11a9-44ce-96d8-670fc42c3ad6\" (UID: \"043c5e12-11a9-44ce-96d8-670fc42c3ad6\") " Apr 22 18:54:00.518794 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:00.518768 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/043c5e12-11a9-44ce-96d8-670fc42c3ad6-bundle" (OuterVolumeSpecName: "bundle") pod "043c5e12-11a9-44ce-96d8-670fc42c3ad6" (UID: "043c5e12-11a9-44ce-96d8-670fc42c3ad6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:00.520133 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:00.520106 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/043c5e12-11a9-44ce-96d8-670fc42c3ad6-kube-api-access-zsqvh" (OuterVolumeSpecName: "kube-api-access-zsqvh") pod "043c5e12-11a9-44ce-96d8-670fc42c3ad6" (UID: "043c5e12-11a9-44ce-96d8-670fc42c3ad6"). InnerVolumeSpecName "kube-api-access-zsqvh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:54:00.523898 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:00.523852 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/043c5e12-11a9-44ce-96d8-670fc42c3ad6-util" (OuterVolumeSpecName: "util") pod "043c5e12-11a9-44ce-96d8-670fc42c3ad6" (UID: "043c5e12-11a9-44ce-96d8-670fc42c3ad6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:00.619261 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:00.619228 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zsqvh\" (UniqueName: \"kubernetes.io/projected/043c5e12-11a9-44ce-96d8-670fc42c3ad6-kube-api-access-zsqvh\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:54:00.619261 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:00.619256 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/043c5e12-11a9-44ce-96d8-670fc42c3ad6-util\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:54:00.619261 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:00.619267 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/043c5e12-11a9-44ce-96d8-670fc42c3ad6-bundle\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:54:01.252607 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:01.252579 2566 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" Apr 22 18:54:01.252743 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:01.252582 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835w2kk8" event={"ID":"043c5e12-11a9-44ce-96d8-670fc42c3ad6","Type":"ContainerDied","Data":"f18f8eac8430fafb54394c084e7de59a68cb6128b65a4248816d9d87504d7461"} Apr 22 18:54:01.252743 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:01.252688 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f18f8eac8430fafb54394c084e7de59a68cb6128b65a4248816d9d87504d7461" Apr 22 18:54:10.932266 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:10.932227 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc"] Apr 22 18:54:10.932858 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:10.932719 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="043c5e12-11a9-44ce-96d8-670fc42c3ad6" containerName="util" Apr 22 18:54:10.932858 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:10.932738 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="043c5e12-11a9-44ce-96d8-670fc42c3ad6" containerName="util" Apr 22 18:54:10.932858 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:10.932755 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="043c5e12-11a9-44ce-96d8-670fc42c3ad6" containerName="extract" Apr 22 18:54:10.932858 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:10.932763 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="043c5e12-11a9-44ce-96d8-670fc42c3ad6" containerName="extract" Apr 22 18:54:10.932858 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:10.932778 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="043c5e12-11a9-44ce-96d8-670fc42c3ad6" containerName="pull" Apr 22 18:54:10.932858 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:10.932787 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="043c5e12-11a9-44ce-96d8-670fc42c3ad6" containerName="pull" Apr 22 18:54:10.933216 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:10.932884 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="043c5e12-11a9-44ce-96d8-670fc42c3ad6" containerName="extract" Apr 22 18:54:10.937436 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:10.937409 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" Apr 22 18:54:10.939725 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:10.939698 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:54:10.939857 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:10.939730 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-h655x\"" Apr 22 18:54:10.940538 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:10.940518 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:54:10.945183 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:10.945160 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc"] Apr 22 18:54:11.105767 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:11.105733 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80154316-d969-4a81-9b4c-f0670c8aa0ed-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc\" (UID: 
\"80154316-d969-4a81-9b4c-f0670c8aa0ed\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" Apr 22 18:54:11.105964 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:11.105782 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl5f9\" (UniqueName: \"kubernetes.io/projected/80154316-d969-4a81-9b4c-f0670c8aa0ed-kube-api-access-rl5f9\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc\" (UID: \"80154316-d969-4a81-9b4c-f0670c8aa0ed\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" Apr 22 18:54:11.105964 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:11.105811 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80154316-d969-4a81-9b4c-f0670c8aa0ed-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc\" (UID: \"80154316-d969-4a81-9b4c-f0670c8aa0ed\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" Apr 22 18:54:11.206436 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:11.206330 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80154316-d969-4a81-9b4c-f0670c8aa0ed-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc\" (UID: \"80154316-d969-4a81-9b4c-f0670c8aa0ed\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" Apr 22 18:54:11.206436 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:11.206396 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rl5f9\" (UniqueName: \"kubernetes.io/projected/80154316-d969-4a81-9b4c-f0670c8aa0ed-kube-api-access-rl5f9\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc\" (UID: 
\"80154316-d969-4a81-9b4c-f0670c8aa0ed\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" Apr 22 18:54:11.206436 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:11.206424 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80154316-d969-4a81-9b4c-f0670c8aa0ed-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc\" (UID: \"80154316-d969-4a81-9b4c-f0670c8aa0ed\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" Apr 22 18:54:11.206709 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:11.206689 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80154316-d969-4a81-9b4c-f0670c8aa0ed-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc\" (UID: \"80154316-d969-4a81-9b4c-f0670c8aa0ed\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" Apr 22 18:54:11.206768 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:11.206754 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80154316-d969-4a81-9b4c-f0670c8aa0ed-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc\" (UID: \"80154316-d969-4a81-9b4c-f0670c8aa0ed\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" Apr 22 18:54:11.213525 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:11.213494 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl5f9\" (UniqueName: \"kubernetes.io/projected/80154316-d969-4a81-9b4c-f0670c8aa0ed-kube-api-access-rl5f9\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc\" (UID: \"80154316-d969-4a81-9b4c-f0670c8aa0ed\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" Apr 22 18:54:11.248536 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:11.248508 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" Apr 22 18:54:11.374534 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:11.374507 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc"] Apr 22 18:54:11.376565 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:54:11.376535 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80154316_d969_4a81_9b4c_f0670c8aa0ed.slice/crio-13aec0cb2d6d53573d6e50b971fdb92e0e56cbf55f71ef4eeb7ede5c5626b13f WatchSource:0}: Error finding container 13aec0cb2d6d53573d6e50b971fdb92e0e56cbf55f71ef4eeb7ede5c5626b13f: Status 404 returned error can't find the container with id 13aec0cb2d6d53573d6e50b971fdb92e0e56cbf55f71ef4eeb7ede5c5626b13f Apr 22 18:54:12.292739 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:12.292705 2566 generic.go:358] "Generic (PLEG): container finished" podID="80154316-d969-4a81-9b4c-f0670c8aa0ed" containerID="d31c2f0cdd0a618cdfcb2460f9b7f8bc9e537267b99324664680dcaf9de7799b" exitCode=0 Apr 22 18:54:12.293120 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:12.292799 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" event={"ID":"80154316-d969-4a81-9b4c-f0670c8aa0ed","Type":"ContainerDied","Data":"d31c2f0cdd0a618cdfcb2460f9b7f8bc9e537267b99324664680dcaf9de7799b"} Apr 22 18:54:12.293120 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:12.292851 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" event={"ID":"80154316-d969-4a81-9b4c-f0670c8aa0ed","Type":"ContainerStarted","Data":"13aec0cb2d6d53573d6e50b971fdb92e0e56cbf55f71ef4eeb7ede5c5626b13f"} Apr 22 18:54:13.297881 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:13.297856 2566 generic.go:358] "Generic (PLEG): container finished" podID="80154316-d969-4a81-9b4c-f0670c8aa0ed" containerID="688b147441a75aff83668d63eb60f31670672ccf15f3833a6e3c3b713086669e" exitCode=0 Apr 22 18:54:13.298241 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:13.297960 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" event={"ID":"80154316-d969-4a81-9b4c-f0670c8aa0ed","Type":"ContainerDied","Data":"688b147441a75aff83668d63eb60f31670672ccf15f3833a6e3c3b713086669e"} Apr 22 18:54:14.303871 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:14.303838 2566 generic.go:358] "Generic (PLEG): container finished" podID="80154316-d969-4a81-9b4c-f0670c8aa0ed" containerID="3cedb507de2f9481ff3d0ab6a3d43563fe27361b0b57734aa7557410c216dae9" exitCode=0 Apr 22 18:54:14.304261 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:14.303885 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" event={"ID":"80154316-d969-4a81-9b4c-f0670c8aa0ed","Type":"ContainerDied","Data":"3cedb507de2f9481ff3d0ab6a3d43563fe27361b0b57734aa7557410c216dae9"} Apr 22 18:54:15.437260 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:15.437240 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" Apr 22 18:54:15.543540 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:15.543509 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80154316-d969-4a81-9b4c-f0670c8aa0ed-util\") pod \"80154316-d969-4a81-9b4c-f0670c8aa0ed\" (UID: \"80154316-d969-4a81-9b4c-f0670c8aa0ed\") " Apr 22 18:54:15.543664 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:15.543578 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl5f9\" (UniqueName: \"kubernetes.io/projected/80154316-d969-4a81-9b4c-f0670c8aa0ed-kube-api-access-rl5f9\") pod \"80154316-d969-4a81-9b4c-f0670c8aa0ed\" (UID: \"80154316-d969-4a81-9b4c-f0670c8aa0ed\") " Apr 22 18:54:15.543664 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:15.543615 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80154316-d969-4a81-9b4c-f0670c8aa0ed-bundle\") pod \"80154316-d969-4a81-9b4c-f0670c8aa0ed\" (UID: \"80154316-d969-4a81-9b4c-f0670c8aa0ed\") " Apr 22 18:54:15.544544 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:15.544517 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80154316-d969-4a81-9b4c-f0670c8aa0ed-bundle" (OuterVolumeSpecName: "bundle") pod "80154316-d969-4a81-9b4c-f0670c8aa0ed" (UID: "80154316-d969-4a81-9b4c-f0670c8aa0ed"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:15.545596 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:15.545570 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80154316-d969-4a81-9b4c-f0670c8aa0ed-kube-api-access-rl5f9" (OuterVolumeSpecName: "kube-api-access-rl5f9") pod "80154316-d969-4a81-9b4c-f0670c8aa0ed" (UID: "80154316-d969-4a81-9b4c-f0670c8aa0ed"). InnerVolumeSpecName "kube-api-access-rl5f9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:54:15.549147 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:15.549124 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80154316-d969-4a81-9b4c-f0670c8aa0ed-util" (OuterVolumeSpecName: "util") pod "80154316-d969-4a81-9b4c-f0670c8aa0ed" (UID: "80154316-d969-4a81-9b4c-f0670c8aa0ed"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:15.644332 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:15.644282 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rl5f9\" (UniqueName: \"kubernetes.io/projected/80154316-d969-4a81-9b4c-f0670c8aa0ed-kube-api-access-rl5f9\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:54:15.644332 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:15.644328 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80154316-d969-4a81-9b4c-f0670c8aa0ed-bundle\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:54:15.644332 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:15.644339 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80154316-d969-4a81-9b4c-f0670c8aa0ed-util\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:54:16.312160 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.312120 2566 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" event={"ID":"80154316-d969-4a81-9b4c-f0670c8aa0ed","Type":"ContainerDied","Data":"13aec0cb2d6d53573d6e50b971fdb92e0e56cbf55f71ef4eeb7ede5c5626b13f"} Apr 22 18:54:16.312160 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.312155 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebrjrkc" Apr 22 18:54:16.312403 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.312160 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13aec0cb2d6d53573d6e50b971fdb92e0e56cbf55f71ef4eeb7ede5c5626b13f" Apr 22 18:54:16.938915 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.938880 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6869d4659b-qm862"] Apr 22 18:54:16.939321 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.939214 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80154316-d969-4a81-9b4c-f0670c8aa0ed" containerName="util" Apr 22 18:54:16.939321 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.939227 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="80154316-d969-4a81-9b4c-f0670c8aa0ed" containerName="util" Apr 22 18:54:16.939321 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.939242 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80154316-d969-4a81-9b4c-f0670c8aa0ed" containerName="pull" Apr 22 18:54:16.939321 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.939247 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="80154316-d969-4a81-9b4c-f0670c8aa0ed" containerName="pull" Apr 22 18:54:16.939321 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.939260 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="80154316-d969-4a81-9b4c-f0670c8aa0ed" containerName="extract" Apr 22 18:54:16.939321 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.939265 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="80154316-d969-4a81-9b4c-f0670c8aa0ed" containerName="extract" Apr 22 18:54:16.939321 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.939320 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="80154316-d969-4a81-9b4c-f0670c8aa0ed" containerName="extract" Apr 22 18:54:16.943270 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.943253 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" Apr 22 18:54:16.946779 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.946759 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 18:54:16.947398 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.947377 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-5ws6l\"" Apr 22 18:54:16.947504 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.947452 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 18:54:16.947718 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.947702 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 18:54:16.947718 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.947709 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 18:54:16.947852 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.947756 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:54:16.963774 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:16.963753 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6869d4659b-qm862"] Apr 22 18:54:17.055577 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:17.055535 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e13e10c4-2946-45e5-827f-7aea7b38e767-metrics-cert\") pod \"lws-controller-manager-6869d4659b-qm862\" (UID: \"e13e10c4-2946-45e5-827f-7aea7b38e767\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" Apr 22 18:54:17.055756 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:17.055582 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqrw6\" (UniqueName: \"kubernetes.io/projected/e13e10c4-2946-45e5-827f-7aea7b38e767-kube-api-access-rqrw6\") pod \"lws-controller-manager-6869d4659b-qm862\" (UID: \"e13e10c4-2946-45e5-827f-7aea7b38e767\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" Apr 22 18:54:17.055756 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:17.055647 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e13e10c4-2946-45e5-827f-7aea7b38e767-cert\") pod \"lws-controller-manager-6869d4659b-qm862\" (UID: \"e13e10c4-2946-45e5-827f-7aea7b38e767\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" Apr 22 18:54:17.055847 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:17.055771 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e13e10c4-2946-45e5-827f-7aea7b38e767-manager-config\") pod \"lws-controller-manager-6869d4659b-qm862\" 
(UID: \"e13e10c4-2946-45e5-827f-7aea7b38e767\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" Apr 22 18:54:17.156606 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:17.156568 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e13e10c4-2946-45e5-827f-7aea7b38e767-manager-config\") pod \"lws-controller-manager-6869d4659b-qm862\" (UID: \"e13e10c4-2946-45e5-827f-7aea7b38e767\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" Apr 22 18:54:17.156802 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:17.156624 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e13e10c4-2946-45e5-827f-7aea7b38e767-metrics-cert\") pod \"lws-controller-manager-6869d4659b-qm862\" (UID: \"e13e10c4-2946-45e5-827f-7aea7b38e767\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" Apr 22 18:54:17.156802 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:17.156653 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqrw6\" (UniqueName: \"kubernetes.io/projected/e13e10c4-2946-45e5-827f-7aea7b38e767-kube-api-access-rqrw6\") pod \"lws-controller-manager-6869d4659b-qm862\" (UID: \"e13e10c4-2946-45e5-827f-7aea7b38e767\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" Apr 22 18:54:17.156802 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:17.156680 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e13e10c4-2946-45e5-827f-7aea7b38e767-cert\") pod \"lws-controller-manager-6869d4659b-qm862\" (UID: \"e13e10c4-2946-45e5-827f-7aea7b38e767\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" Apr 22 18:54:17.157327 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:17.157295 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e13e10c4-2946-45e5-827f-7aea7b38e767-manager-config\") pod \"lws-controller-manager-6869d4659b-qm862\" (UID: \"e13e10c4-2946-45e5-827f-7aea7b38e767\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" Apr 22 18:54:17.159188 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:17.159164 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e13e10c4-2946-45e5-827f-7aea7b38e767-metrics-cert\") pod \"lws-controller-manager-6869d4659b-qm862\" (UID: \"e13e10c4-2946-45e5-827f-7aea7b38e767\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" Apr 22 18:54:17.159277 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:17.159171 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e13e10c4-2946-45e5-827f-7aea7b38e767-cert\") pod \"lws-controller-manager-6869d4659b-qm862\" (UID: \"e13e10c4-2946-45e5-827f-7aea7b38e767\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" Apr 22 18:54:17.164610 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:17.164589 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqrw6\" (UniqueName: \"kubernetes.io/projected/e13e10c4-2946-45e5-827f-7aea7b38e767-kube-api-access-rqrw6\") pod \"lws-controller-manager-6869d4659b-qm862\" (UID: \"e13e10c4-2946-45e5-827f-7aea7b38e767\") " pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" Apr 22 18:54:17.252747 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:17.252669 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" Apr 22 18:54:17.374490 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:17.374467 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6869d4659b-qm862"] Apr 22 18:54:17.376503 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:54:17.376465 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode13e10c4_2946_45e5_827f_7aea7b38e767.slice/crio-ea3b9fd22e5c5d8bb87265ee082b8e472d8addc0ef6c26d53c93c1620e452106 WatchSource:0}: Error finding container ea3b9fd22e5c5d8bb87265ee082b8e472d8addc0ef6c26d53c93c1620e452106: Status 404 returned error can't find the container with id ea3b9fd22e5c5d8bb87265ee082b8e472d8addc0ef6c26d53c93c1620e452106 Apr 22 18:54:18.320139 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:18.320098 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" event={"ID":"e13e10c4-2946-45e5-827f-7aea7b38e767","Type":"ContainerStarted","Data":"ea3b9fd22e5c5d8bb87265ee082b8e472d8addc0ef6c26d53c93c1620e452106"} Apr 22 18:54:20.331327 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:20.331293 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" event={"ID":"e13e10c4-2946-45e5-827f-7aea7b38e767","Type":"ContainerStarted","Data":"132ff7467fbfbcc6de738ed7091d52e2a3e0eaea4961f8768fcecdc9db647f7d"} Apr 22 18:54:20.331714 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:20.331349 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" Apr 22 18:54:20.349883 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:20.349831 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" podStartSLOduration=1.927060494 podStartE2EDuration="4.349815837s" podCreationTimestamp="2026-04-22 18:54:16 +0000 UTC" firstStartedPulling="2026-04-22 18:54:17.378417222 +0000 UTC m=+459.157250686" lastFinishedPulling="2026-04-22 18:54:19.801172564 +0000 UTC m=+461.580006029" observedRunningTime="2026-04-22 18:54:20.347506192 +0000 UTC m=+462.126339677" watchObservedRunningTime="2026-04-22 18:54:20.349815837 +0000 UTC m=+462.128649383" Apr 22 18:54:31.337216 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:31.337184 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6869d4659b-qm862" Apr 22 18:54:36.528878 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.528839 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh"] Apr 22 18:54:36.532693 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.532676 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" Apr 22 18:54:36.534862 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.534829 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:54:36.534862 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.534840 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:54:36.535057 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.534906 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-h655x\"" Apr 22 18:54:36.538660 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.538640 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh"] Apr 22 18:54:36.615802 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.615770 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08a2df76-4e7f-4e7a-96d4-808cbcb62224-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh\" (UID: \"08a2df76-4e7f-4e7a-96d4-808cbcb62224\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" Apr 22 18:54:36.615975 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.615823 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ftnl\" (UniqueName: \"kubernetes.io/projected/08a2df76-4e7f-4e7a-96d4-808cbcb62224-kube-api-access-7ftnl\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh\" (UID: \"08a2df76-4e7f-4e7a-96d4-808cbcb62224\") " 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" Apr 22 18:54:36.615975 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.615859 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08a2df76-4e7f-4e7a-96d4-808cbcb62224-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh\" (UID: \"08a2df76-4e7f-4e7a-96d4-808cbcb62224\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" Apr 22 18:54:36.619854 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.619828 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz"] Apr 22 18:54:36.623543 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.623528 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" Apr 22 18:54:36.630744 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.630717 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz"] Apr 22 18:54:36.716362 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.716335 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ftnl\" (UniqueName: \"kubernetes.io/projected/08a2df76-4e7f-4e7a-96d4-808cbcb62224-kube-api-access-7ftnl\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh\" (UID: \"08a2df76-4e7f-4e7a-96d4-808cbcb62224\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" Apr 22 18:54:36.716516 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.716369 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/c92bb305-cbaa-493e-8780-378dd376adee-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz\" (UID: \"c92bb305-cbaa-493e-8780-378dd376adee\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" Apr 22 18:54:36.716516 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.716393 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08a2df76-4e7f-4e7a-96d4-808cbcb62224-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh\" (UID: \"08a2df76-4e7f-4e7a-96d4-808cbcb62224\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" Apr 22 18:54:36.716516 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.716463 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vwwx\" (UniqueName: \"kubernetes.io/projected/c92bb305-cbaa-493e-8780-378dd376adee-kube-api-access-7vwwx\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz\" (UID: \"c92bb305-cbaa-493e-8780-378dd376adee\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" Apr 22 18:54:36.716644 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.716543 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08a2df76-4e7f-4e7a-96d4-808cbcb62224-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh\" (UID: \"08a2df76-4e7f-4e7a-96d4-808cbcb62224\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" Apr 22 18:54:36.716644 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.716571 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/c92bb305-cbaa-493e-8780-378dd376adee-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz\" (UID: \"c92bb305-cbaa-493e-8780-378dd376adee\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" Apr 22 18:54:36.716723 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.716710 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08a2df76-4e7f-4e7a-96d4-808cbcb62224-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh\" (UID: \"08a2df76-4e7f-4e7a-96d4-808cbcb62224\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" Apr 22 18:54:36.716829 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.716814 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08a2df76-4e7f-4e7a-96d4-808cbcb62224-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh\" (UID: \"08a2df76-4e7f-4e7a-96d4-808cbcb62224\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" Apr 22 18:54:36.721542 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.721520 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2"] Apr 22 18:54:36.724113 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.724096 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ftnl\" (UniqueName: \"kubernetes.io/projected/08a2df76-4e7f-4e7a-96d4-808cbcb62224-kube-api-access-7ftnl\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh\" (UID: \"08a2df76-4e7f-4e7a-96d4-808cbcb62224\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" Apr 22 18:54:36.725262 ip-10-0-130-234 kubenswrapper[2566]: 
I0422 18:54:36.725245 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" Apr 22 18:54:36.731440 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.731419 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2"] Apr 22 18:54:36.817430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.817338 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c92bb305-cbaa-493e-8780-378dd376adee-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz\" (UID: \"c92bb305-cbaa-493e-8780-378dd376adee\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" Apr 22 18:54:36.817430 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.817382 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f96aa390-b218-4950-a524-284d25e590bf-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2\" (UID: \"f96aa390-b218-4950-a524-284d25e590bf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" Apr 22 18:54:36.817634 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.817437 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f96aa390-b218-4950-a524-284d25e590bf-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2\" (UID: \"f96aa390-b218-4950-a524-284d25e590bf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" Apr 22 18:54:36.817634 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.817531 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfw84\" (UniqueName: \"kubernetes.io/projected/f96aa390-b218-4950-a524-284d25e590bf-kube-api-access-tfw84\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2\" (UID: \"f96aa390-b218-4950-a524-284d25e590bf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" Apr 22 18:54:36.817634 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.817568 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c92bb305-cbaa-493e-8780-378dd376adee-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz\" (UID: \"c92bb305-cbaa-493e-8780-378dd376adee\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" Apr 22 18:54:36.817634 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.817605 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vwwx\" (UniqueName: \"kubernetes.io/projected/c92bb305-cbaa-493e-8780-378dd376adee-kube-api-access-7vwwx\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz\" (UID: \"c92bb305-cbaa-493e-8780-378dd376adee\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" Apr 22 18:54:36.817826 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.817692 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c92bb305-cbaa-493e-8780-378dd376adee-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz\" (UID: \"c92bb305-cbaa-493e-8780-378dd376adee\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" Apr 22 18:54:36.817878 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.817832 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c92bb305-cbaa-493e-8780-378dd376adee-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz\" (UID: \"c92bb305-cbaa-493e-8780-378dd376adee\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" Apr 22 18:54:36.825742 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.825713 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vwwx\" (UniqueName: \"kubernetes.io/projected/c92bb305-cbaa-493e-8780-378dd376adee-kube-api-access-7vwwx\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz\" (UID: \"c92bb305-cbaa-493e-8780-378dd376adee\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" Apr 22 18:54:36.831049 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.831028 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z"] Apr 22 18:54:36.836494 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.836466 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" Apr 22 18:54:36.842238 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.842217 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z"] Apr 22 18:54:36.842362 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.842323 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" Apr 22 18:54:36.918331 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.918297 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f96aa390-b218-4950-a524-284d25e590bf-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2\" (UID: \"f96aa390-b218-4950-a524-284d25e590bf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" Apr 22 18:54:36.918465 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.918346 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4d32903-6c42-4376-bb4c-356c591f4387-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z\" (UID: \"f4d32903-6c42-4376-bb4c-356c591f4387\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" Apr 22 18:54:36.918465 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.918386 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f96aa390-b218-4950-a524-284d25e590bf-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2\" (UID: \"f96aa390-b218-4950-a524-284d25e590bf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" Apr 22 18:54:36.918465 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.918457 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfw84\" (UniqueName: \"kubernetes.io/projected/f96aa390-b218-4950-a524-284d25e590bf-kube-api-access-tfw84\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2\" (UID: \"f96aa390-b218-4950-a524-284d25e590bf\") " 
pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" Apr 22 18:54:36.918571 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.918499 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4d32903-6c42-4376-bb4c-356c591f4387-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z\" (UID: \"f4d32903-6c42-4376-bb4c-356c591f4387\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" Apr 22 18:54:36.918571 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.918521 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzpkq\" (UniqueName: \"kubernetes.io/projected/f4d32903-6c42-4376-bb4c-356c591f4387-kube-api-access-bzpkq\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z\" (UID: \"f4d32903-6c42-4376-bb4c-356c591f4387\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" Apr 22 18:54:36.918721 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.918691 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f96aa390-b218-4950-a524-284d25e590bf-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2\" (UID: \"f96aa390-b218-4950-a524-284d25e590bf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" Apr 22 18:54:36.918785 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.918695 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f96aa390-b218-4950-a524-284d25e590bf-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2\" (UID: \"f96aa390-b218-4950-a524-284d25e590bf\") " 
pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" Apr 22 18:54:36.927047 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.927022 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfw84\" (UniqueName: \"kubernetes.io/projected/f96aa390-b218-4950-a524-284d25e590bf-kube-api-access-tfw84\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2\" (UID: \"f96aa390-b218-4950-a524-284d25e590bf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" Apr 22 18:54:36.932776 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.932750 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" Apr 22 18:54:36.968022 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:36.967996 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh"] Apr 22 18:54:36.970415 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:54:36.970389 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08a2df76_4e7f_4e7a_96d4_808cbcb62224.slice/crio-0af9b93fe17ab5e205994ad87f33db9b2cdd6837e2a2bae6ee0c716812420fe3 WatchSource:0}: Error finding container 0af9b93fe17ab5e205994ad87f33db9b2cdd6837e2a2bae6ee0c716812420fe3: Status 404 returned error can't find the container with id 0af9b93fe17ab5e205994ad87f33db9b2cdd6837e2a2bae6ee0c716812420fe3 Apr 22 18:54:37.019081 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.019039 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4d32903-6c42-4376-bb4c-356c591f4387-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z\" (UID: 
\"f4d32903-6c42-4376-bb4c-356c591f4387\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" Apr 22 18:54:37.019151 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.019133 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4d32903-6c42-4376-bb4c-356c591f4387-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z\" (UID: \"f4d32903-6c42-4376-bb4c-356c591f4387\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" Apr 22 18:54:37.019207 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.019158 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzpkq\" (UniqueName: \"kubernetes.io/projected/f4d32903-6c42-4376-bb4c-356c591f4387-kube-api-access-bzpkq\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z\" (UID: \"f4d32903-6c42-4376-bb4c-356c591f4387\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" Apr 22 18:54:37.019446 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.019427 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4d32903-6c42-4376-bb4c-356c591f4387-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z\" (UID: \"f4d32903-6c42-4376-bb4c-356c591f4387\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" Apr 22 18:54:37.019671 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.019642 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4d32903-6c42-4376-bb4c-356c591f4387-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z\" (UID: \"f4d32903-6c42-4376-bb4c-356c591f4387\") " 
pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" Apr 22 18:54:37.029743 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.029699 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzpkq\" (UniqueName: \"kubernetes.io/projected/f4d32903-6c42-4376-bb4c-356c591f4387-kube-api-access-bzpkq\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z\" (UID: \"f4d32903-6c42-4376-bb4c-356c591f4387\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" Apr 22 18:54:37.042086 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.042013 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" Apr 22 18:54:37.069638 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.069573 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz"] Apr 22 18:54:37.072138 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:54:37.072111 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc92bb305_cbaa_493e_8780_378dd376adee.slice/crio-e62be7d6f29af625e059acd7d265248025545ac9b4cba2c9a171b8c6f7e22f9c WatchSource:0}: Error finding container e62be7d6f29af625e059acd7d265248025545ac9b4cba2c9a171b8c6f7e22f9c: Status 404 returned error can't find the container with id e62be7d6f29af625e059acd7d265248025545ac9b4cba2c9a171b8c6f7e22f9c Apr 22 18:54:37.149174 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.149150 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" Apr 22 18:54:37.172466 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.172392 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2"] Apr 22 18:54:37.174173 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:54:37.174140 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf96aa390_b218_4950_a524_284d25e590bf.slice/crio-7448d93c521b87027e5389350c7ff5e3f18ef078b39d70b988db46d8a65e29d2 WatchSource:0}: Error finding container 7448d93c521b87027e5389350c7ff5e3f18ef078b39d70b988db46d8a65e29d2: Status 404 returned error can't find the container with id 7448d93c521b87027e5389350c7ff5e3f18ef078b39d70b988db46d8a65e29d2 Apr 22 18:54:37.279289 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.279265 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z"] Apr 22 18:54:37.281399 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:54:37.281375 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4d32903_6c42_4376_bb4c_356c591f4387.slice/crio-c7828f9aab22de85c2dbeda4ca13d157600ef56a20bcd1a2860c35a0271183d0 WatchSource:0}: Error finding container c7828f9aab22de85c2dbeda4ca13d157600ef56a20bcd1a2860c35a0271183d0: Status 404 returned error can't find the container with id c7828f9aab22de85c2dbeda4ca13d157600ef56a20bcd1a2860c35a0271183d0 Apr 22 18:54:37.389873 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.389842 2566 generic.go:358] "Generic (PLEG): container finished" podID="f96aa390-b218-4950-a524-284d25e590bf" containerID="f7463e82054c85db72d1c6f53b3867ceb5de4e768cd377a7fb29b69d698abc77" exitCode=0 Apr 22 18:54:37.390051 
ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.389924 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" event={"ID":"f96aa390-b218-4950-a524-284d25e590bf","Type":"ContainerDied","Data":"f7463e82054c85db72d1c6f53b3867ceb5de4e768cd377a7fb29b69d698abc77"} Apr 22 18:54:37.390051 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.389980 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" event={"ID":"f96aa390-b218-4950-a524-284d25e590bf","Type":"ContainerStarted","Data":"7448d93c521b87027e5389350c7ff5e3f18ef078b39d70b988db46d8a65e29d2"} Apr 22 18:54:37.391463 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.391367 2566 generic.go:358] "Generic (PLEG): container finished" podID="08a2df76-4e7f-4e7a-96d4-808cbcb62224" containerID="b6486b1137e60b5a09df0b11eb149e44adcb77db773225fcc7afd26431a7a6ce" exitCode=0 Apr 22 18:54:37.391578 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.391537 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" event={"ID":"08a2df76-4e7f-4e7a-96d4-808cbcb62224","Type":"ContainerDied","Data":"b6486b1137e60b5a09df0b11eb149e44adcb77db773225fcc7afd26431a7a6ce"} Apr 22 18:54:37.391578 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.391575 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" event={"ID":"08a2df76-4e7f-4e7a-96d4-808cbcb62224","Type":"ContainerStarted","Data":"0af9b93fe17ab5e205994ad87f33db9b2cdd6837e2a2bae6ee0c716812420fe3"} Apr 22 18:54:37.393061 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.393001 2566 generic.go:358] "Generic (PLEG): container finished" podID="c92bb305-cbaa-493e-8780-378dd376adee" 
containerID="06bedaed9d65d96f7b4304e64263b682d749a0ab6664d1a84cd5c013b7930dc1" exitCode=0 Apr 22 18:54:37.393105 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.393074 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" event={"ID":"c92bb305-cbaa-493e-8780-378dd376adee","Type":"ContainerDied","Data":"06bedaed9d65d96f7b4304e64263b682d749a0ab6664d1a84cd5c013b7930dc1"} Apr 22 18:54:37.393105 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.393098 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" event={"ID":"c92bb305-cbaa-493e-8780-378dd376adee","Type":"ContainerStarted","Data":"e62be7d6f29af625e059acd7d265248025545ac9b4cba2c9a171b8c6f7e22f9c"} Apr 22 18:54:37.394897 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.394867 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" event={"ID":"f4d32903-6c42-4376-bb4c-356c591f4387","Type":"ContainerStarted","Data":"a273a4f36c6c81a0f2fcee2f2c060af6d7bffc317ce0690f82baec6aa0baa634"} Apr 22 18:54:37.394993 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:37.394897 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" event={"ID":"f4d32903-6c42-4376-bb4c-356c591f4387","Type":"ContainerStarted","Data":"c7828f9aab22de85c2dbeda4ca13d157600ef56a20bcd1a2860c35a0271183d0"} Apr 22 18:54:38.402956 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:38.402034 2566 generic.go:358] "Generic (PLEG): container finished" podID="f4d32903-6c42-4376-bb4c-356c591f4387" containerID="a273a4f36c6c81a0f2fcee2f2c060af6d7bffc317ce0690f82baec6aa0baa634" exitCode=0 Apr 22 18:54:38.402956 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:38.402099 2566 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" event={"ID":"f4d32903-6c42-4376-bb4c-356c591f4387","Type":"ContainerDied","Data":"a273a4f36c6c81a0f2fcee2f2c060af6d7bffc317ce0690f82baec6aa0baa634"} Apr 22 18:54:39.407602 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:39.407561 2566 generic.go:358] "Generic (PLEG): container finished" podID="c92bb305-cbaa-493e-8780-378dd376adee" containerID="b6dc074b6f68bf46739e94b69a18d0892e54e68806c5a2e1ec497122f4be30d1" exitCode=0 Apr 22 18:54:39.408200 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:39.407651 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" event={"ID":"c92bb305-cbaa-493e-8780-378dd376adee","Type":"ContainerDied","Data":"b6dc074b6f68bf46739e94b69a18d0892e54e68806c5a2e1ec497122f4be30d1"} Apr 22 18:54:39.409292 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:39.409269 2566 generic.go:358] "Generic (PLEG): container finished" podID="f4d32903-6c42-4376-bb4c-356c591f4387" containerID="d3ec00f06f7cdb28aacf2cf0ab49a7fb13950cc4e936f9183c5483308de3611d" exitCode=0 Apr 22 18:54:39.409392 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:39.409372 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" event={"ID":"f4d32903-6c42-4376-bb4c-356c591f4387","Type":"ContainerDied","Data":"d3ec00f06f7cdb28aacf2cf0ab49a7fb13950cc4e936f9183c5483308de3611d"} Apr 22 18:54:39.411159 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:39.411137 2566 generic.go:358] "Generic (PLEG): container finished" podID="f96aa390-b218-4950-a524-284d25e590bf" containerID="b8cb05de293ff9f5b9aef9d70401cab79f19f652b870f3f9d57760efb31de9fc" exitCode=0 Apr 22 18:54:39.411312 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:39.411199 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" event={"ID":"f96aa390-b218-4950-a524-284d25e590bf","Type":"ContainerDied","Data":"b8cb05de293ff9f5b9aef9d70401cab79f19f652b870f3f9d57760efb31de9fc"} Apr 22 18:54:39.413011 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:39.412990 2566 generic.go:358] "Generic (PLEG): container finished" podID="08a2df76-4e7f-4e7a-96d4-808cbcb62224" containerID="0f5d1823e1914d02ea64e8bf3e7025d31339626c50c54b16b3437ddb24bca123" exitCode=0 Apr 22 18:54:39.413107 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:39.413038 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" event={"ID":"08a2df76-4e7f-4e7a-96d4-808cbcb62224","Type":"ContainerDied","Data":"0f5d1823e1914d02ea64e8bf3e7025d31339626c50c54b16b3437ddb24bca123"} Apr 22 18:54:40.418716 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:40.418682 2566 generic.go:358] "Generic (PLEG): container finished" podID="c92bb305-cbaa-493e-8780-378dd376adee" containerID="a0490439e54e4488dc4618549f26ab54e2f7c4baf2f753a80ec1428a59b7d24f" exitCode=0 Apr 22 18:54:40.419124 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:40.418774 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" event={"ID":"c92bb305-cbaa-493e-8780-378dd376adee","Type":"ContainerDied","Data":"a0490439e54e4488dc4618549f26ab54e2f7c4baf2f753a80ec1428a59b7d24f"} Apr 22 18:54:40.420621 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:40.420602 2566 generic.go:358] "Generic (PLEG): container finished" podID="f4d32903-6c42-4376-bb4c-356c591f4387" containerID="ac9a008a293b72677288e59ed808074dd9f6a9c742638cb99e1b503ad45707e8" exitCode=0 Apr 22 18:54:40.420689 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:40.420633 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" event={"ID":"f4d32903-6c42-4376-bb4c-356c591f4387","Type":"ContainerDied","Data":"ac9a008a293b72677288e59ed808074dd9f6a9c742638cb99e1b503ad45707e8"} Apr 22 18:54:40.422527 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:40.422506 2566 generic.go:358] "Generic (PLEG): container finished" podID="f96aa390-b218-4950-a524-284d25e590bf" containerID="087926ce3c98c05a0089c09e3c9907ceb932091bbbc6685d65be1a87795162d5" exitCode=0 Apr 22 18:54:40.422638 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:40.422567 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" event={"ID":"f96aa390-b218-4950-a524-284d25e590bf","Type":"ContainerDied","Data":"087926ce3c98c05a0089c09e3c9907ceb932091bbbc6685d65be1a87795162d5"} Apr 22 18:54:40.424226 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:40.424206 2566 generic.go:358] "Generic (PLEG): container finished" podID="08a2df76-4e7f-4e7a-96d4-808cbcb62224" containerID="feec1dd7f50deefb95844c5c9376306335f9d75d1ccf8e90370acad80efa2ef5" exitCode=0 Apr 22 18:54:40.424315 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:40.424238 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" event={"ID":"08a2df76-4e7f-4e7a-96d4-808cbcb62224","Type":"ContainerDied","Data":"feec1dd7f50deefb95844c5c9376306335f9d75d1ccf8e90370acad80efa2ef5"} Apr 22 18:54:41.567023 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.566998 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" Apr 22 18:54:41.620294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.620271 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" Apr 22 18:54:41.623885 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.623868 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" Apr 22 18:54:41.627111 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.627097 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" Apr 22 18:54:41.656422 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.656399 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c92bb305-cbaa-493e-8780-378dd376adee-util\") pod \"c92bb305-cbaa-493e-8780-378dd376adee\" (UID: \"c92bb305-cbaa-493e-8780-378dd376adee\") " Apr 22 18:54:41.656565 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.656441 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ftnl\" (UniqueName: \"kubernetes.io/projected/08a2df76-4e7f-4e7a-96d4-808cbcb62224-kube-api-access-7ftnl\") pod \"08a2df76-4e7f-4e7a-96d4-808cbcb62224\" (UID: \"08a2df76-4e7f-4e7a-96d4-808cbcb62224\") " Apr 22 18:54:41.656565 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.656462 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f96aa390-b218-4950-a524-284d25e590bf-util\") pod \"f96aa390-b218-4950-a524-284d25e590bf\" (UID: \"f96aa390-b218-4950-a524-284d25e590bf\") " Apr 22 18:54:41.656565 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.656489 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08a2df76-4e7f-4e7a-96d4-808cbcb62224-bundle\") pod 
\"08a2df76-4e7f-4e7a-96d4-808cbcb62224\" (UID: \"08a2df76-4e7f-4e7a-96d4-808cbcb62224\") " Apr 22 18:54:41.656565 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.656514 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vwwx\" (UniqueName: \"kubernetes.io/projected/c92bb305-cbaa-493e-8780-378dd376adee-kube-api-access-7vwwx\") pod \"c92bb305-cbaa-493e-8780-378dd376adee\" (UID: \"c92bb305-cbaa-493e-8780-378dd376adee\") " Apr 22 18:54:41.656565 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.656562 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f96aa390-b218-4950-a524-284d25e590bf-bundle\") pod \"f96aa390-b218-4950-a524-284d25e590bf\" (UID: \"f96aa390-b218-4950-a524-284d25e590bf\") " Apr 22 18:54:41.656826 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.656710 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4d32903-6c42-4376-bb4c-356c591f4387-bundle\") pod \"f4d32903-6c42-4376-bb4c-356c591f4387\" (UID: \"f4d32903-6c42-4376-bb4c-356c591f4387\") " Apr 22 18:54:41.656826 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.656787 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfw84\" (UniqueName: \"kubernetes.io/projected/f96aa390-b218-4950-a524-284d25e590bf-kube-api-access-tfw84\") pod \"f96aa390-b218-4950-a524-284d25e590bf\" (UID: \"f96aa390-b218-4950-a524-284d25e590bf\") " Apr 22 18:54:41.656826 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.656812 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4d32903-6c42-4376-bb4c-356c591f4387-util\") pod \"f4d32903-6c42-4376-bb4c-356c591f4387\" (UID: \"f4d32903-6c42-4376-bb4c-356c591f4387\") " Apr 22 18:54:41.656996 ip-10-0-130-234 kubenswrapper[2566]: I0422 
18:54:41.656857 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08a2df76-4e7f-4e7a-96d4-808cbcb62224-util\") pod \"08a2df76-4e7f-4e7a-96d4-808cbcb62224\" (UID: \"08a2df76-4e7f-4e7a-96d4-808cbcb62224\") " Apr 22 18:54:41.656996 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.656881 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c92bb305-cbaa-493e-8780-378dd376adee-bundle\") pod \"c92bb305-cbaa-493e-8780-378dd376adee\" (UID: \"c92bb305-cbaa-493e-8780-378dd376adee\") " Apr 22 18:54:41.656996 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.656904 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzpkq\" (UniqueName: \"kubernetes.io/projected/f4d32903-6c42-4376-bb4c-356c591f4387-kube-api-access-bzpkq\") pod \"f4d32903-6c42-4376-bb4c-356c591f4387\" (UID: \"f4d32903-6c42-4376-bb4c-356c591f4387\") " Apr 22 18:54:41.657286 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.657249 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08a2df76-4e7f-4e7a-96d4-808cbcb62224-bundle" (OuterVolumeSpecName: "bundle") pod "08a2df76-4e7f-4e7a-96d4-808cbcb62224" (UID: "08a2df76-4e7f-4e7a-96d4-808cbcb62224"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:41.657728 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.657699 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4d32903-6c42-4376-bb4c-356c591f4387-bundle" (OuterVolumeSpecName: "bundle") pod "f4d32903-6c42-4376-bb4c-356c591f4387" (UID: "f4d32903-6c42-4376-bb4c-356c591f4387"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:41.657836 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.657807 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f96aa390-b218-4950-a524-284d25e590bf-bundle" (OuterVolumeSpecName: "bundle") pod "f96aa390-b218-4950-a524-284d25e590bf" (UID: "f96aa390-b218-4950-a524-284d25e590bf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:41.660418 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.660390 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f96aa390-b218-4950-a524-284d25e590bf-kube-api-access-tfw84" (OuterVolumeSpecName: "kube-api-access-tfw84") pod "f96aa390-b218-4950-a524-284d25e590bf" (UID: "f96aa390-b218-4950-a524-284d25e590bf"). InnerVolumeSpecName "kube-api-access-tfw84". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:54:41.660901 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.660876 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c92bb305-cbaa-493e-8780-378dd376adee-bundle" (OuterVolumeSpecName: "bundle") pod "c92bb305-cbaa-493e-8780-378dd376adee" (UID: "c92bb305-cbaa-493e-8780-378dd376adee"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:41.661069 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.661043 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4d32903-6c42-4376-bb4c-356c591f4387-kube-api-access-bzpkq" (OuterVolumeSpecName: "kube-api-access-bzpkq") pod "f4d32903-6c42-4376-bb4c-356c591f4387" (UID: "f4d32903-6c42-4376-bb4c-356c591f4387"). InnerVolumeSpecName "kube-api-access-bzpkq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:54:41.661325 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.661293 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a2df76-4e7f-4e7a-96d4-808cbcb62224-kube-api-access-7ftnl" (OuterVolumeSpecName: "kube-api-access-7ftnl") pod "08a2df76-4e7f-4e7a-96d4-808cbcb62224" (UID: "08a2df76-4e7f-4e7a-96d4-808cbcb62224"). InnerVolumeSpecName "kube-api-access-7ftnl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:54:41.665190 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.665127 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f96aa390-b218-4950-a524-284d25e590bf-util" (OuterVolumeSpecName: "util") pod "f96aa390-b218-4950-a524-284d25e590bf" (UID: "f96aa390-b218-4950-a524-284d25e590bf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:41.665299 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.665227 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c92bb305-cbaa-493e-8780-378dd376adee-kube-api-access-7vwwx" (OuterVolumeSpecName: "kube-api-access-7vwwx") pod "c92bb305-cbaa-493e-8780-378dd376adee" (UID: "c92bb305-cbaa-493e-8780-378dd376adee"). InnerVolumeSpecName "kube-api-access-7vwwx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:54:41.666147 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.666119 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c92bb305-cbaa-493e-8780-378dd376adee-util" (OuterVolumeSpecName: "util") pod "c92bb305-cbaa-493e-8780-378dd376adee" (UID: "c92bb305-cbaa-493e-8780-378dd376adee"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:41.666878 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.666853 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4d32903-6c42-4376-bb4c-356c591f4387-util" (OuterVolumeSpecName: "util") pod "f4d32903-6c42-4376-bb4c-356c591f4387" (UID: "f4d32903-6c42-4376-bb4c-356c591f4387"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:41.667513 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.667478 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08a2df76-4e7f-4e7a-96d4-808cbcb62224-util" (OuterVolumeSpecName: "util") pod "08a2df76-4e7f-4e7a-96d4-808cbcb62224" (UID: "08a2df76-4e7f-4e7a-96d4-808cbcb62224"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:41.757960 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.757871 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tfw84\" (UniqueName: \"kubernetes.io/projected/f96aa390-b218-4950-a524-284d25e590bf-kube-api-access-tfw84\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:54:41.757960 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.757896 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4d32903-6c42-4376-bb4c-356c591f4387-util\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:54:41.757960 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.757907 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08a2df76-4e7f-4e7a-96d4-808cbcb62224-util\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:54:41.757960 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.757915 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/c92bb305-cbaa-493e-8780-378dd376adee-bundle\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:54:41.757960 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.757923 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bzpkq\" (UniqueName: \"kubernetes.io/projected/f4d32903-6c42-4376-bb4c-356c591f4387-kube-api-access-bzpkq\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:54:41.757960 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.757932 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c92bb305-cbaa-493e-8780-378dd376adee-util\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:54:41.757960 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.757960 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7ftnl\" (UniqueName: \"kubernetes.io/projected/08a2df76-4e7f-4e7a-96d4-808cbcb62224-kube-api-access-7ftnl\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:54:41.757960 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.757968 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f96aa390-b218-4950-a524-284d25e590bf-util\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:54:41.758337 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.757976 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08a2df76-4e7f-4e7a-96d4-808cbcb62224-bundle\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:54:41.758337 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.757986 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7vwwx\" (UniqueName: \"kubernetes.io/projected/c92bb305-cbaa-493e-8780-378dd376adee-kube-api-access-7vwwx\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 
18:54:41.758337 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.757994 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f96aa390-b218-4950-a524-284d25e590bf-bundle\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:54:41.758337 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:41.758002 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4d32903-6c42-4376-bb4c-356c591f4387-bundle\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:54:42.433823 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:42.433793 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" event={"ID":"08a2df76-4e7f-4e7a-96d4-808cbcb62224","Type":"ContainerDied","Data":"0af9b93fe17ab5e205994ad87f33db9b2cdd6837e2a2bae6ee0c716812420fe3"} Apr 22 18:54:42.433823 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:42.433828 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0af9b93fe17ab5e205994ad87f33db9b2cdd6837e2a2bae6ee0c716812420fe3" Apr 22 18:54:42.434088 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:42.433798 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5038h2sh" Apr 22 18:54:42.435556 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:42.435528 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" event={"ID":"c92bb305-cbaa-493e-8780-378dd376adee","Type":"ContainerDied","Data":"e62be7d6f29af625e059acd7d265248025545ac9b4cba2c9a171b8c6f7e22f9c"} Apr 22 18:54:42.435699 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:42.435558 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e62be7d6f29af625e059acd7d265248025545ac9b4cba2c9a171b8c6f7e22f9c" Apr 22 18:54:42.435699 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:42.435535 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30d2hqz" Apr 22 18:54:42.437294 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:42.437274 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" Apr 22 18:54:42.437427 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:42.437277 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88c6b5z" event={"ID":"f4d32903-6c42-4376-bb4c-356c591f4387","Type":"ContainerDied","Data":"c7828f9aab22de85c2dbeda4ca13d157600ef56a20bcd1a2860c35a0271183d0"} Apr 22 18:54:42.437427 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:42.437419 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7828f9aab22de85c2dbeda4ca13d157600ef56a20bcd1a2860c35a0271183d0" Apr 22 18:54:42.438948 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:42.438914 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" event={"ID":"f96aa390-b218-4950-a524-284d25e590bf","Type":"ContainerDied","Data":"7448d93c521b87027e5389350c7ff5e3f18ef078b39d70b988db46d8a65e29d2"} Apr 22 18:54:42.439045 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:42.438955 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7448d93c521b87027e5389350c7ff5e3f18ef078b39d70b988db46d8a65e29d2" Apr 22 18:54:42.439045 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:42.438980 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b47fz2" Apr 22 18:54:49.760131 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760097 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66b5c98d57-hhw22"] Apr 22 18:54:49.760615 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760599 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f96aa390-b218-4950-a524-284d25e590bf" containerName="pull" Apr 22 18:54:49.760663 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760617 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f96aa390-b218-4950-a524-284d25e590bf" containerName="pull" Apr 22 18:54:49.760663 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760626 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4d32903-6c42-4376-bb4c-356c591f4387" containerName="extract" Apr 22 18:54:49.760663 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760631 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d32903-6c42-4376-bb4c-356c591f4387" containerName="extract" Apr 22 18:54:49.760663 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760640 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c92bb305-cbaa-493e-8780-378dd376adee" containerName="pull" Apr 22 18:54:49.760663 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760646 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92bb305-cbaa-493e-8780-378dd376adee" containerName="pull" Apr 22 18:54:49.760663 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760655 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08a2df76-4e7f-4e7a-96d4-808cbcb62224" containerName="util" Apr 22 18:54:49.760663 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760661 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a2df76-4e7f-4e7a-96d4-808cbcb62224" 
containerName="util" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760668 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08a2df76-4e7f-4e7a-96d4-808cbcb62224" containerName="pull" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760673 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a2df76-4e7f-4e7a-96d4-808cbcb62224" containerName="pull" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760684 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c92bb305-cbaa-493e-8780-378dd376adee" containerName="util" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760688 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92bb305-cbaa-493e-8780-378dd376adee" containerName="util" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760696 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f96aa390-b218-4950-a524-284d25e590bf" containerName="util" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760701 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f96aa390-b218-4950-a524-284d25e590bf" containerName="util" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760708 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4d32903-6c42-4376-bb4c-356c591f4387" containerName="util" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760713 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d32903-6c42-4376-bb4c-356c591f4387" containerName="util" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760718 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f96aa390-b218-4950-a524-284d25e590bf" containerName="extract" Apr 22 18:54:49.760896 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:54:49.760723 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f96aa390-b218-4950-a524-284d25e590bf" containerName="extract" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760728 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08a2df76-4e7f-4e7a-96d4-808cbcb62224" containerName="extract" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760734 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a2df76-4e7f-4e7a-96d4-808cbcb62224" containerName="extract" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760743 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c92bb305-cbaa-493e-8780-378dd376adee" containerName="extract" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760748 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92bb305-cbaa-493e-8780-378dd376adee" containerName="extract" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760755 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4d32903-6c42-4376-bb4c-356c591f4387" containerName="pull" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760760 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d32903-6c42-4376-bb4c-356c591f4387" containerName="pull" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760841 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="08a2df76-4e7f-4e7a-96d4-808cbcb62224" containerName="extract" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760851 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="f96aa390-b218-4950-a524-284d25e590bf" containerName="extract" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760857 2566 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="f4d32903-6c42-4376-bb4c-356c591f4387" containerName="extract" Apr 22 18:54:49.760896 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.760864 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="c92bb305-cbaa-493e-8780-378dd376adee" containerName="extract" Apr 22 18:54:49.765829 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.765807 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.774664 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.774642 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66b5c98d57-hhw22"] Apr 22 18:54:49.824080 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.824051 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f5cadbec-20f2-4935-a4e1-eebd6962ef43-oauth-serving-cert\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.824213 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.824091 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5cadbec-20f2-4935-a4e1-eebd6962ef43-console-serving-cert\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.824213 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.824130 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f5cadbec-20f2-4935-a4e1-eebd6962ef43-console-config\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " 
pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.824213 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.824156 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f5cadbec-20f2-4935-a4e1-eebd6962ef43-service-ca\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.824313 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.824251 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f5cadbec-20f2-4935-a4e1-eebd6962ef43-console-oauth-config\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.824313 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.824290 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5cadbec-20f2-4935-a4e1-eebd6962ef43-trusted-ca-bundle\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.824313 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.824307 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2wx6\" (UniqueName: \"kubernetes.io/projected/f5cadbec-20f2-4935-a4e1-eebd6962ef43-kube-api-access-s2wx6\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.924708 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.924677 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/f5cadbec-20f2-4935-a4e1-eebd6962ef43-oauth-serving-cert\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.924863 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.924719 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5cadbec-20f2-4935-a4e1-eebd6962ef43-console-serving-cert\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.924863 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.924777 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f5cadbec-20f2-4935-a4e1-eebd6962ef43-console-config\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.924863 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.924803 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f5cadbec-20f2-4935-a4e1-eebd6962ef43-service-ca\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.925065 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.924896 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f5cadbec-20f2-4935-a4e1-eebd6962ef43-console-oauth-config\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.925065 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.924962 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5cadbec-20f2-4935-a4e1-eebd6962ef43-trusted-ca-bundle\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.925065 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.924993 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2wx6\" (UniqueName: \"kubernetes.io/projected/f5cadbec-20f2-4935-a4e1-eebd6962ef43-kube-api-access-s2wx6\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.925575 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.925545 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f5cadbec-20f2-4935-a4e1-eebd6962ef43-oauth-serving-cert\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.925670 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.925544 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f5cadbec-20f2-4935-a4e1-eebd6962ef43-console-config\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.925670 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.925603 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f5cadbec-20f2-4935-a4e1-eebd6962ef43-service-ca\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.925877 ip-10-0-130-234 kubenswrapper[2566]: I0422 
18:54:49.925853 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5cadbec-20f2-4935-a4e1-eebd6962ef43-trusted-ca-bundle\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.927237 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.927216 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f5cadbec-20f2-4935-a4e1-eebd6962ef43-console-oauth-config\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.927381 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.927362 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5cadbec-20f2-4935-a4e1-eebd6962ef43-console-serving-cert\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:49.933233 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:49.933214 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2wx6\" (UniqueName: \"kubernetes.io/projected/f5cadbec-20f2-4935-a4e1-eebd6962ef43-kube-api-access-s2wx6\") pod \"console-66b5c98d57-hhw22\" (UID: \"f5cadbec-20f2-4935-a4e1-eebd6962ef43\") " pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:50.075849 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:50.075757 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:54:50.207561 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:50.207537 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66b5c98d57-hhw22"] Apr 22 18:54:50.210240 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:54:50.210202 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5cadbec_20f2_4935_a4e1_eebd6962ef43.slice/crio-dbc4b04fe463086784b8048b07f680c9c4b9cc3eb3a001b82bda218d4557a346 WatchSource:0}: Error finding container dbc4b04fe463086784b8048b07f680c9c4b9cc3eb3a001b82bda218d4557a346: Status 404 returned error can't find the container with id dbc4b04fe463086784b8048b07f680c9c4b9cc3eb3a001b82bda218d4557a346 Apr 22 18:54:50.472733 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:50.472700 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66b5c98d57-hhw22" event={"ID":"f5cadbec-20f2-4935-a4e1-eebd6962ef43","Type":"ContainerStarted","Data":"84e44a052a13fa6ce877650f8fe3e5b97bde1d079932e6e585e3076b3ea540a2"} Apr 22 18:54:50.472733 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:50.472738 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66b5c98d57-hhw22" event={"ID":"f5cadbec-20f2-4935-a4e1-eebd6962ef43","Type":"ContainerStarted","Data":"dbc4b04fe463086784b8048b07f680c9c4b9cc3eb3a001b82bda218d4557a346"} Apr 22 18:54:50.495404 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:54:50.495352 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66b5c98d57-hhw22" podStartSLOduration=1.495336354 podStartE2EDuration="1.495336354s" podCreationTimestamp="2026-04-22 18:54:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:54:50.49448862 +0000 UTC 
m=+492.273322106" watchObservedRunningTime="2026-04-22 18:54:50.495336354 +0000 UTC m=+492.274169854" Apr 22 18:55:00.076479 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:00.076441 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:55:00.076859 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:00.076497 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:55:00.081196 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:00.081169 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:55:00.513977 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:00.513932 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66b5c98d57-hhw22" Apr 22 18:55:00.565650 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:00.565619 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76f65c8c6b-ckz2w"] Apr 22 18:55:09.008280 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.008245 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-lzlzq"] Apr 22 18:55:09.011456 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.011437 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-lzlzq" Apr 22 18:55:09.013789 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.013760 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 22 18:55:09.014709 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.014679 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-2fl6w\"" Apr 22 18:55:09.014820 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.014727 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 18:55:09.014820 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.014727 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 18:55:09.014820 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.014755 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 22 18:55:09.018083 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.018061 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-lzlzq"] Apr 22 18:55:09.094199 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.094167 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcdbq\" (UniqueName: \"kubernetes.io/projected/1c636de5-f994-42f6-9a67-90a640c701be-kube-api-access-zcdbq\") pod \"kuadrant-console-plugin-6c886788f8-lzlzq\" (UID: \"1c636de5-f994-42f6-9a67-90a640c701be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-lzlzq" Apr 22 18:55:09.094378 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.094236 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1c636de5-f994-42f6-9a67-90a640c701be-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-lzlzq\" (UID: \"1c636de5-f994-42f6-9a67-90a640c701be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-lzlzq" Apr 22 18:55:09.094378 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.094276 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c636de5-f994-42f6-9a67-90a640c701be-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-lzlzq\" (UID: \"1c636de5-f994-42f6-9a67-90a640c701be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-lzlzq" Apr 22 18:55:09.195530 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.195493 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcdbq\" (UniqueName: \"kubernetes.io/projected/1c636de5-f994-42f6-9a67-90a640c701be-kube-api-access-zcdbq\") pod \"kuadrant-console-plugin-6c886788f8-lzlzq\" (UID: \"1c636de5-f994-42f6-9a67-90a640c701be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-lzlzq" Apr 22 18:55:09.195799 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.195571 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1c636de5-f994-42f6-9a67-90a640c701be-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-lzlzq\" (UID: \"1c636de5-f994-42f6-9a67-90a640c701be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-lzlzq" Apr 22 18:55:09.195799 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.195613 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c636de5-f994-42f6-9a67-90a640c701be-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-lzlzq\" (UID: \"1c636de5-f994-42f6-9a67-90a640c701be\") " 
pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-lzlzq" Apr 22 18:55:09.196216 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.196186 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1c636de5-f994-42f6-9a67-90a640c701be-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-lzlzq\" (UID: \"1c636de5-f994-42f6-9a67-90a640c701be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-lzlzq" Apr 22 18:55:09.198075 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.198045 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c636de5-f994-42f6-9a67-90a640c701be-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-lzlzq\" (UID: \"1c636de5-f994-42f6-9a67-90a640c701be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-lzlzq" Apr 22 18:55:09.202360 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.202338 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcdbq\" (UniqueName: \"kubernetes.io/projected/1c636de5-f994-42f6-9a67-90a640c701be-kube-api-access-zcdbq\") pod \"kuadrant-console-plugin-6c886788f8-lzlzq\" (UID: \"1c636de5-f994-42f6-9a67-90a640c701be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-lzlzq" Apr 22 18:55:09.323388 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.323304 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-lzlzq" Apr 22 18:55:09.449230 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.449205 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-lzlzq"] Apr 22 18:55:09.450977 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:55:09.450953 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c636de5_f994_42f6_9a67_90a640c701be.slice/crio-7ac1c3564e75351929bf8d7622ae607a06810adde134c15133445e37dc458f5e WatchSource:0}: Error finding container 7ac1c3564e75351929bf8d7622ae607a06810adde134c15133445e37dc458f5e: Status 404 returned error can't find the container with id 7ac1c3564e75351929bf8d7622ae607a06810adde134c15133445e37dc458f5e Apr 22 18:55:09.543750 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:09.543712 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-lzlzq" event={"ID":"1c636de5-f994-42f6-9a67-90a640c701be","Type":"ContainerStarted","Data":"7ac1c3564e75351929bf8d7622ae607a06810adde134c15133445e37dc458f5e"} Apr 22 18:55:14.568643 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:14.568547 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-lzlzq" event={"ID":"1c636de5-f994-42f6-9a67-90a640c701be","Type":"ContainerStarted","Data":"0de0ca6a2f01770e7422f3bd71c82348e598bfd7faece52193059a487d21e871"} Apr 22 18:55:14.583475 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:14.583429 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-lzlzq" podStartSLOduration=1.7891691220000001 podStartE2EDuration="6.583416923s" podCreationTimestamp="2026-04-22 18:55:08 +0000 UTC" firstStartedPulling="2026-04-22 18:55:09.452611904 +0000 UTC m=+511.231445368" 
lastFinishedPulling="2026-04-22 18:55:14.246859704 +0000 UTC m=+516.025693169" observedRunningTime="2026-04-22 18:55:14.582307526 +0000 UTC m=+516.361141013" watchObservedRunningTime="2026-04-22 18:55:14.583416923 +0000 UTC m=+516.362250408" Apr 22 18:55:25.585876 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.585811 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76f65c8c6b-ckz2w" podUID="5ea88efb-2da5-4288-bb86-47f3b5848684" containerName="console" containerID="cri-o://358d6ceb4062c4700a0b3f20282b9d9b909d1c69eaee104f2ffd9923c037a271" gracePeriod=15 Apr 22 18:55:25.831459 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.831435 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76f65c8c6b-ckz2w_5ea88efb-2da5-4288-bb86-47f3b5848684/console/0.log" Apr 22 18:55:25.831569 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.831483 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:55:25.950400 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.950369 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ea88efb-2da5-4288-bb86-47f3b5848684-console-serving-cert\") pod \"5ea88efb-2da5-4288-bb86-47f3b5848684\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " Apr 22 18:55:25.950560 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.950413 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-console-config\") pod \"5ea88efb-2da5-4288-bb86-47f3b5848684\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " Apr 22 18:55:25.950560 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.950437 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-trusted-ca-bundle\") pod \"5ea88efb-2da5-4288-bb86-47f3b5848684\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " Apr 22 18:55:25.950651 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.950560 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-service-ca\") pod \"5ea88efb-2da5-4288-bb86-47f3b5848684\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " Apr 22 18:55:25.950651 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.950600 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jnmv\" (UniqueName: \"kubernetes.io/projected/5ea88efb-2da5-4288-bb86-47f3b5848684-kube-api-access-7jnmv\") pod \"5ea88efb-2da5-4288-bb86-47f3b5848684\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " Apr 22 18:55:25.950651 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.950635 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-oauth-serving-cert\") pod \"5ea88efb-2da5-4288-bb86-47f3b5848684\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " Apr 22 18:55:25.950805 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.950726 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ea88efb-2da5-4288-bb86-47f3b5848684-console-oauth-config\") pod \"5ea88efb-2da5-4288-bb86-47f3b5848684\" (UID: \"5ea88efb-2da5-4288-bb86-47f3b5848684\") " Apr 22 18:55:25.950877 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.950850 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-trusted-ca-bundle" (OuterVolumeSpecName: 
"trusted-ca-bundle") pod "5ea88efb-2da5-4288-bb86-47f3b5848684" (UID: "5ea88efb-2da5-4288-bb86-47f3b5848684"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:55:25.950969 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.950875 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-console-config" (OuterVolumeSpecName: "console-config") pod "5ea88efb-2da5-4288-bb86-47f3b5848684" (UID: "5ea88efb-2da5-4288-bb86-47f3b5848684"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:55:25.951045 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.951005 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-service-ca" (OuterVolumeSpecName: "service-ca") pod "5ea88efb-2da5-4288-bb86-47f3b5848684" (UID: "5ea88efb-2da5-4288-bb86-47f3b5848684"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:55:25.951567 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.951237 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5ea88efb-2da5-4288-bb86-47f3b5848684" (UID: "5ea88efb-2da5-4288-bb86-47f3b5848684"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:55:25.951703 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.951655 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-oauth-serving-cert\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:55:25.951703 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.951678 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-console-config\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:55:25.951703 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.951693 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-trusted-ca-bundle\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:55:25.951879 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.951708 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ea88efb-2da5-4288-bb86-47f3b5848684-service-ca\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:55:25.952980 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.952951 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ea88efb-2da5-4288-bb86-47f3b5848684-kube-api-access-7jnmv" (OuterVolumeSpecName: "kube-api-access-7jnmv") pod "5ea88efb-2da5-4288-bb86-47f3b5848684" (UID: "5ea88efb-2da5-4288-bb86-47f3b5848684"). InnerVolumeSpecName "kube-api-access-7jnmv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:55:25.953302 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.953277 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea88efb-2da5-4288-bb86-47f3b5848684-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5ea88efb-2da5-4288-bb86-47f3b5848684" (UID: "5ea88efb-2da5-4288-bb86-47f3b5848684"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:55:25.953302 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:25.953291 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea88efb-2da5-4288-bb86-47f3b5848684-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5ea88efb-2da5-4288-bb86-47f3b5848684" (UID: "5ea88efb-2da5-4288-bb86-47f3b5848684"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:55:26.052365 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:26.052336 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ea88efb-2da5-4288-bb86-47f3b5848684-console-oauth-config\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:55:26.052365 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:26.052359 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ea88efb-2da5-4288-bb86-47f3b5848684-console-serving-cert\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:55:26.052365 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:26.052370 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7jnmv\" (UniqueName: \"kubernetes.io/projected/5ea88efb-2da5-4288-bb86-47f3b5848684-kube-api-access-7jnmv\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 18:55:26.610950 
ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:26.610909 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76f65c8c6b-ckz2w_5ea88efb-2da5-4288-bb86-47f3b5848684/console/0.log" Apr 22 18:55:26.611381 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:26.610979 2566 generic.go:358] "Generic (PLEG): container finished" podID="5ea88efb-2da5-4288-bb86-47f3b5848684" containerID="358d6ceb4062c4700a0b3f20282b9d9b909d1c69eaee104f2ffd9923c037a271" exitCode=2 Apr 22 18:55:26.611381 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:26.611055 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76f65c8c6b-ckz2w" Apr 22 18:55:26.611381 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:26.611069 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76f65c8c6b-ckz2w" event={"ID":"5ea88efb-2da5-4288-bb86-47f3b5848684","Type":"ContainerDied","Data":"358d6ceb4062c4700a0b3f20282b9d9b909d1c69eaee104f2ffd9923c037a271"} Apr 22 18:55:26.611381 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:26.611109 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76f65c8c6b-ckz2w" event={"ID":"5ea88efb-2da5-4288-bb86-47f3b5848684","Type":"ContainerDied","Data":"d5c1abcad849b05f1dd48c0142883330a0468ec88fe5f68553b424532524ff3c"} Apr 22 18:55:26.611381 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:26.611125 2566 scope.go:117] "RemoveContainer" containerID="358d6ceb4062c4700a0b3f20282b9d9b909d1c69eaee104f2ffd9923c037a271" Apr 22 18:55:26.622344 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:26.622322 2566 scope.go:117] "RemoveContainer" containerID="358d6ceb4062c4700a0b3f20282b9d9b909d1c69eaee104f2ffd9923c037a271" Apr 22 18:55:26.622616 ip-10-0-130-234 kubenswrapper[2566]: E0422 18:55:26.622594 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"358d6ceb4062c4700a0b3f20282b9d9b909d1c69eaee104f2ffd9923c037a271\": container with ID starting with 358d6ceb4062c4700a0b3f20282b9d9b909d1c69eaee104f2ffd9923c037a271 not found: ID does not exist" containerID="358d6ceb4062c4700a0b3f20282b9d9b909d1c69eaee104f2ffd9923c037a271" Apr 22 18:55:26.622679 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:26.622625 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"358d6ceb4062c4700a0b3f20282b9d9b909d1c69eaee104f2ffd9923c037a271"} err="failed to get container status \"358d6ceb4062c4700a0b3f20282b9d9b909d1c69eaee104f2ffd9923c037a271\": rpc error: code = NotFound desc = could not find container \"358d6ceb4062c4700a0b3f20282b9d9b909d1c69eaee104f2ffd9923c037a271\": container with ID starting with 358d6ceb4062c4700a0b3f20282b9d9b909d1c69eaee104f2ffd9923c037a271 not found: ID does not exist" Apr 22 18:55:26.636060 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:26.636036 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76f65c8c6b-ckz2w"] Apr 22 18:55:26.639604 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:26.639581 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76f65c8c6b-ckz2w"] Apr 22 18:55:26.829772 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:26.829733 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ea88efb-2da5-4288-bb86-47f3b5848684" path="/var/lib/kubelet/pods/5ea88efb-2da5-4288-bb86-47f3b5848684/volumes" Apr 22 18:55:52.388996 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:52.388957 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-lbcsn"] Apr 22 18:55:52.389457 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:52.389301 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ea88efb-2da5-4288-bb86-47f3b5848684" containerName="console" Apr 22 18:55:52.389457 ip-10-0-130-234 
kubenswrapper[2566]: I0422 18:55:52.389311 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ea88efb-2da5-4288-bb86-47f3b5848684" containerName="console" Apr 22 18:55:52.389457 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:52.389374 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ea88efb-2da5-4288-bb86-47f3b5848684" containerName="console" Apr 22 18:55:52.411953 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:52.411906 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-lbcsn"] Apr 22 18:55:52.412097 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:52.412002 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-lbcsn" Apr 22 18:55:52.414263 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:52.414233 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 22 18:55:52.415630 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:52.415607 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-lbcsn"] Apr 22 18:55:52.473008 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:52.472977 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dvks\" (UniqueName: \"kubernetes.io/projected/ab2a243c-2f6c-4c80-8d10-46b9fd67b189-kube-api-access-6dvks\") pod \"limitador-limitador-67566c68b4-lbcsn\" (UID: \"ab2a243c-2f6c-4c80-8d10-46b9fd67b189\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lbcsn" Apr 22 18:55:52.473139 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:52.473044 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ab2a243c-2f6c-4c80-8d10-46b9fd67b189-config-file\") pod 
\"limitador-limitador-67566c68b4-lbcsn\" (UID: \"ab2a243c-2f6c-4c80-8d10-46b9fd67b189\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lbcsn" Apr 22 18:55:52.574291 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:52.574257 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dvks\" (UniqueName: \"kubernetes.io/projected/ab2a243c-2f6c-4c80-8d10-46b9fd67b189-kube-api-access-6dvks\") pod \"limitador-limitador-67566c68b4-lbcsn\" (UID: \"ab2a243c-2f6c-4c80-8d10-46b9fd67b189\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lbcsn" Apr 22 18:55:52.574458 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:52.574315 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ab2a243c-2f6c-4c80-8d10-46b9fd67b189-config-file\") pod \"limitador-limitador-67566c68b4-lbcsn\" (UID: \"ab2a243c-2f6c-4c80-8d10-46b9fd67b189\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lbcsn" Apr 22 18:55:52.574906 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:52.574888 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ab2a243c-2f6c-4c80-8d10-46b9fd67b189-config-file\") pod \"limitador-limitador-67566c68b4-lbcsn\" (UID: \"ab2a243c-2f6c-4c80-8d10-46b9fd67b189\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lbcsn" Apr 22 18:55:52.581745 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:52.581722 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dvks\" (UniqueName: \"kubernetes.io/projected/ab2a243c-2f6c-4c80-8d10-46b9fd67b189-kube-api-access-6dvks\") pod \"limitador-limitador-67566c68b4-lbcsn\" (UID: \"ab2a243c-2f6c-4c80-8d10-46b9fd67b189\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lbcsn" Apr 22 18:55:52.722913 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:52.722832 2566 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-lbcsn" Apr 22 18:55:52.845880 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:52.845852 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-lbcsn"] Apr 22 18:55:53.715368 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:53.715330 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-lbcsn" event={"ID":"ab2a243c-2f6c-4c80-8d10-46b9fd67b189","Type":"ContainerStarted","Data":"0587caaa2ff0fd5dc52c5fd8e4735cc80f09eb289d40e00a1e48209fd6f873f9"} Apr 22 18:55:54.719982 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:54.719932 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-lbcsn" event={"ID":"ab2a243c-2f6c-4c80-8d10-46b9fd67b189","Type":"ContainerStarted","Data":"2beda368ecba30be673a01282f143e2f3c883fd6354698eb392084825de5c454"} Apr 22 18:55:54.720327 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:54.720108 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-lbcsn" Apr 22 18:55:54.737002 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:55:54.736951 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-lbcsn" podStartSLOduration=1.336903593 podStartE2EDuration="2.73692486s" podCreationTimestamp="2026-04-22 18:55:52 +0000 UTC" firstStartedPulling="2026-04-22 18:55:52.854567063 +0000 UTC m=+554.633400530" lastFinishedPulling="2026-04-22 18:55:54.254588328 +0000 UTC m=+556.033421797" observedRunningTime="2026-04-22 18:55:54.735198365 +0000 UTC m=+556.514031864" watchObservedRunningTime="2026-04-22 18:55:54.73692486 +0000 UTC m=+556.515758345" Apr 22 18:56:05.725598 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:56:05.725568 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-lbcsn" Apr 22 18:56:38.727438 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:56:38.727408 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4jzwx_4871a747-c041-430f-9c1a-3ab2ce6d0156/console-operator/1.log" Apr 22 18:56:38.729475 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:56:38.729454 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4jzwx_4871a747-c041-430f-9c1a-3ab2ce6d0156/console-operator/1.log" Apr 22 18:58:41.525529 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.525453 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"] Apr 22 18:58:41.528820 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.528804 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" Apr 22 18:58:41.531902 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.531882 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:58:41.532126 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.532110 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:58:41.532182 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.532130 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mhhb\"" Apr 22 18:58:41.532224 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.532197 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 22 18:58:41.535950 
ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.535913 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"] Apr 22 18:58:41.572275 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.572252 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" Apr 22 18:58:41.572393 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.572296 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" Apr 22 18:58:41.572393 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.572355 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" Apr 22 18:58:41.572476 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.572389 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-tls-certs\") 
pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" Apr 22 18:58:41.572513 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.572471 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6k67\" (UniqueName: \"kubernetes.io/projected/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-kube-api-access-l6k67\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" Apr 22 18:58:41.572549 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.572511 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" Apr 22 18:58:41.673734 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.673698 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6k67\" (UniqueName: \"kubernetes.io/projected/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-kube-api-access-l6k67\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" Apr 22 18:58:41.673913 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.673745 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" Apr 22 18:58:41.673913 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.673862 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" Apr 22 18:58:41.674081 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.673929 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" Apr 22 18:58:41.674081 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.674003 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" Apr 22 18:58:41.674081 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.674032 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" 
(UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"
Apr 22 18:58:41.674286 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.674262 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"
Apr 22 18:58:41.674346 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.674297 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"
Apr 22 18:58:41.674346 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.674327 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"
Apr 22 18:58:41.676025 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.675998 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"
Apr 22 18:58:41.676315 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.676299 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"
Apr 22 18:58:41.680665 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.680642 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6k67\" (UniqueName: \"kubernetes.io/projected/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-kube-api-access-l6k67\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"
Apr 22 18:58:41.840102 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.840032 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"
Apr 22 18:58:41.961716 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.961692 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"]
Apr 22 18:58:41.963687 ip-10-0-130-234 kubenswrapper[2566]: W0422 18:58:41.963660 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c8a9c8d_d360_468b_ba06_d1cd737afa7e.slice/crio-b3a2628c10d9dc5db79fa3b5eb5546077cb8b993b124734b48dbab0dbd9d01e2 WatchSource:0}: Error finding container b3a2628c10d9dc5db79fa3b5eb5546077cb8b993b124734b48dbab0dbd9d01e2: Status 404 returned error can't find the container with id b3a2628c10d9dc5db79fa3b5eb5546077cb8b993b124734b48dbab0dbd9d01e2
Apr 22 18:58:41.965557 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:41.965535 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:58:42.333474 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:42.333437 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" event={"ID":"3c8a9c8d-d360-468b-ba06-d1cd737afa7e","Type":"ContainerStarted","Data":"b3a2628c10d9dc5db79fa3b5eb5546077cb8b993b124734b48dbab0dbd9d01e2"}
Apr 22 18:58:45.348603 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:45.348518 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" event={"ID":"3c8a9c8d-d360-468b-ba06-d1cd737afa7e","Type":"ContainerStarted","Data":"7e544720b47e3de2af80246a8dd3f3782128310fa36a3f691c17130ffdb41c04"}
Apr 22 18:58:49.364888 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:49.364857 2566 generic.go:358] "Generic (PLEG): container finished" podID="3c8a9c8d-d360-468b-ba06-d1cd737afa7e" containerID="7e544720b47e3de2af80246a8dd3f3782128310fa36a3f691c17130ffdb41c04" exitCode=0
Apr 22 18:58:49.365288 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:49.364947 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" event={"ID":"3c8a9c8d-d360-468b-ba06-d1cd737afa7e","Type":"ContainerDied","Data":"7e544720b47e3de2af80246a8dd3f3782128310fa36a3f691c17130ffdb41c04"}
Apr 22 18:58:51.374806 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:51.374763 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" event={"ID":"3c8a9c8d-d360-468b-ba06-d1cd737afa7e","Type":"ContainerStarted","Data":"c69d2ca31538fae8a12a76d26137195bd3b5a78221e727aabfa54ce6b8596089"}
Apr 22 18:58:51.389667 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:51.389617 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" podStartSLOduration=1.9237865219999999 podStartE2EDuration="10.389602987s" podCreationTimestamp="2026-04-22 18:58:41 +0000 UTC" firstStartedPulling="2026-04-22 18:58:41.965657871 +0000 UTC m=+723.744491335" lastFinishedPulling="2026-04-22 18:58:50.431474336 +0000 UTC m=+732.210307800" observedRunningTime="2026-04-22 18:58:51.389033684 +0000 UTC m=+733.167867166" watchObservedRunningTime="2026-04-22 18:58:51.389602987 +0000 UTC m=+733.168436472"
Apr 22 18:58:51.841096 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:51.841005 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"
Apr 22 18:58:51.841096 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:51.841057 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"
Apr 22 18:58:51.853624 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:51.853596 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"
Apr 22 18:58:52.390545 ip-10-0-130-234 kubenswrapper[2566]: I0422 18:58:52.390520 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"
Apr 22 19:00:06.848289 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:06.848213 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"]
Apr 22 19:00:06.858992 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:06.858968 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:06.860012 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:06.859979 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"]
Apr 22 19:00:06.861435 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:06.861412 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 22 19:00:06.984746 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:06.984714 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-home\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:06.984908 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:06.984766 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-dshm\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:06.984908 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:06.984794 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-tls-certs\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:06.984908 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:06.984814 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:06.985052 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:06.984966 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-model-cache\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:06.985052 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:06.984999 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfqdk\" (UniqueName: \"kubernetes.io/projected/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-kube-api-access-qfqdk\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:07.086066 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:07.086031 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:07.086202 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:07.086107 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-model-cache\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:07.086202 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:07.086125 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfqdk\" (UniqueName: \"kubernetes.io/projected/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-kube-api-access-qfqdk\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:07.086202 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:07.086150 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-home\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:07.086202 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:07.086179 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-dshm\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:07.086409 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:07.086212 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-tls-certs\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:07.086490 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:07.086467 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:07.086556 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:07.086531 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-home\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:07.086613 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:07.086587 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-model-cache\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:07.088437 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:07.088408 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-dshm\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:07.088586 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:07.088569 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-tls-certs\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:07.095448 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:07.095428 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfqdk\" (UniqueName: \"kubernetes.io/projected/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-kube-api-access-qfqdk\") pod \"precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:07.169513 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:07.169480 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:07.293695 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:07.293671 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"]
Apr 22 19:00:07.296033 ip-10-0-130-234 kubenswrapper[2566]: W0422 19:00:07.295998 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12c7aa67_2267_4d1a_b96f_32c56af9ebe9.slice/crio-c76a3a7f2b3f71aa7f010546e8686794f4a0bb9a46d5003164d4275b97876431 WatchSource:0}: Error finding container c76a3a7f2b3f71aa7f010546e8686794f4a0bb9a46d5003164d4275b97876431: Status 404 returned error can't find the container with id c76a3a7f2b3f71aa7f010546e8686794f4a0bb9a46d5003164d4275b97876431
Apr 22 19:00:07.669956 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:07.669896 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg" event={"ID":"12c7aa67-2267-4d1a-b96f-32c56af9ebe9","Type":"ContainerStarted","Data":"420785f08af913bd32ce54a1c7ee7a11ad12a7dff0025cb85807554eefe6dfcd"}
Apr 22 19:00:07.670147 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:07.669963 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg" event={"ID":"12c7aa67-2267-4d1a-b96f-32c56af9ebe9","Type":"ContainerStarted","Data":"c76a3a7f2b3f71aa7f010546e8686794f4a0bb9a46d5003164d4275b97876431"}
Apr 22 19:00:11.687447 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:11.687413 2566 generic.go:358] "Generic (PLEG): container finished" podID="12c7aa67-2267-4d1a-b96f-32c56af9ebe9" containerID="420785f08af913bd32ce54a1c7ee7a11ad12a7dff0025cb85807554eefe6dfcd" exitCode=0
Apr 22 19:00:11.687810 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:11.687456 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg" event={"ID":"12c7aa67-2267-4d1a-b96f-32c56af9ebe9","Type":"ContainerDied","Data":"420785f08af913bd32ce54a1c7ee7a11ad12a7dff0025cb85807554eefe6dfcd"}
Apr 22 19:00:12.693128 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:12.693093 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg" event={"ID":"12c7aa67-2267-4d1a-b96f-32c56af9ebe9","Type":"ContainerStarted","Data":"a8bec693a47618dbd43c0685e8cf27482e296489fd01a94ef4399069b45c6c08"}
Apr 22 19:00:12.708473 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:12.708414 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg" podStartSLOduration=6.708400496 podStartE2EDuration="6.708400496s" podCreationTimestamp="2026-04-22 19:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:00:12.707928095 +0000 UTC m=+814.486761582" watchObservedRunningTime="2026-04-22 19:00:12.708400496 +0000 UTC m=+814.487233984"
Apr 22 19:00:15.307962 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.307910 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"]
Apr 22 19:00:15.308347 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.308218 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" podUID="3c8a9c8d-d360-468b-ba06-d1cd737afa7e" containerName="main" containerID="cri-o://c69d2ca31538fae8a12a76d26137195bd3b5a78221e727aabfa54ce6b8596089" gracePeriod=30
Apr 22 19:00:15.565652 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.565597 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"
Apr 22 19:00:15.660846 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.660812 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-kserve-provision-location\") pod \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") "
Apr 22 19:00:15.660846 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.660851 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-dshm\") pod \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") "
Apr 22 19:00:15.661124 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.660871 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-model-cache\") pod \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") "
Apr 22 19:00:15.661124 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.660888 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6k67\" (UniqueName: \"kubernetes.io/projected/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-kube-api-access-l6k67\") pod \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") "
Apr 22 19:00:15.661124 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.660919 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-tls-certs\") pod \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") "
Apr 22 19:00:15.661124 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.660983 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-home\") pod \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\" (UID: \"3c8a9c8d-d360-468b-ba06-d1cd737afa7e\") "
Apr 22 19:00:15.661343 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.661184 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-model-cache" (OuterVolumeSpecName: "model-cache") pod "3c8a9c8d-d360-468b-ba06-d1cd737afa7e" (UID: "3c8a9c8d-d360-468b-ba06-d1cd737afa7e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:00:15.661343 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.661304 2566 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-model-cache\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:00:15.661452 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.661345 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-home" (OuterVolumeSpecName: "home") pod "3c8a9c8d-d360-468b-ba06-d1cd737afa7e" (UID: "3c8a9c8d-d360-468b-ba06-d1cd737afa7e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:00:15.663092 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.663060 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-dshm" (OuterVolumeSpecName: "dshm") pod "3c8a9c8d-d360-468b-ba06-d1cd737afa7e" (UID: "3c8a9c8d-d360-468b-ba06-d1cd737afa7e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:00:15.663429 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.663407 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3c8a9c8d-d360-468b-ba06-d1cd737afa7e" (UID: "3c8a9c8d-d360-468b-ba06-d1cd737afa7e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:00:15.663521 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.663497 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-kube-api-access-l6k67" (OuterVolumeSpecName: "kube-api-access-l6k67") pod "3c8a9c8d-d360-468b-ba06-d1cd737afa7e" (UID: "3c8a9c8d-d360-468b-ba06-d1cd737afa7e"). InnerVolumeSpecName "kube-api-access-l6k67". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:00:15.709200 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.709132 2566 generic.go:358] "Generic (PLEG): container finished" podID="3c8a9c8d-d360-468b-ba06-d1cd737afa7e" containerID="c69d2ca31538fae8a12a76d26137195bd3b5a78221e727aabfa54ce6b8596089" exitCode=0
Apr 22 19:00:15.709356 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.709246 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"
Apr 22 19:00:15.709356 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.709246 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" event={"ID":"3c8a9c8d-d360-468b-ba06-d1cd737afa7e","Type":"ContainerDied","Data":"c69d2ca31538fae8a12a76d26137195bd3b5a78221e727aabfa54ce6b8596089"}
Apr 22 19:00:15.709356 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.709287 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx" event={"ID":"3c8a9c8d-d360-468b-ba06-d1cd737afa7e","Type":"ContainerDied","Data":"b3a2628c10d9dc5db79fa3b5eb5546077cb8b993b124734b48dbab0dbd9d01e2"}
Apr 22 19:00:15.709356 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.709308 2566 scope.go:117] "RemoveContainer" containerID="c69d2ca31538fae8a12a76d26137195bd3b5a78221e727aabfa54ce6b8596089"
Apr 22 19:00:15.715809 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.715778 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3c8a9c8d-d360-468b-ba06-d1cd737afa7e" (UID: "3c8a9c8d-d360-468b-ba06-d1cd737afa7e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:00:15.718003 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.717985 2566 scope.go:117] "RemoveContainer" containerID="7e544720b47e3de2af80246a8dd3f3782128310fa36a3f691c17130ffdb41c04"
Apr 22 19:00:15.761942 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.761920 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-kserve-provision-location\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:00:15.762069 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.761969 2566 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-dshm\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:00:15.762069 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.761987 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l6k67\" (UniqueName: \"kubernetes.io/projected/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-kube-api-access-l6k67\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:00:15.762069 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.762002 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-tls-certs\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:00:15.762069 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.762015 2566 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c8a9c8d-d360-468b-ba06-d1cd737afa7e-home\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:00:15.776383 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.776367 2566 scope.go:117] "RemoveContainer" containerID="c69d2ca31538fae8a12a76d26137195bd3b5a78221e727aabfa54ce6b8596089"
Apr 22 19:00:15.776664 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:00:15.776641 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c69d2ca31538fae8a12a76d26137195bd3b5a78221e727aabfa54ce6b8596089\": container with ID starting with c69d2ca31538fae8a12a76d26137195bd3b5a78221e727aabfa54ce6b8596089 not found: ID does not exist" containerID="c69d2ca31538fae8a12a76d26137195bd3b5a78221e727aabfa54ce6b8596089"
Apr 22 19:00:15.776716 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.776674 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69d2ca31538fae8a12a76d26137195bd3b5a78221e727aabfa54ce6b8596089"} err="failed to get container status \"c69d2ca31538fae8a12a76d26137195bd3b5a78221e727aabfa54ce6b8596089\": rpc error: code = NotFound desc = could not find container \"c69d2ca31538fae8a12a76d26137195bd3b5a78221e727aabfa54ce6b8596089\": container with ID starting with c69d2ca31538fae8a12a76d26137195bd3b5a78221e727aabfa54ce6b8596089 not found: ID does not exist"
Apr 22 19:00:15.776716 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.776694 2566 scope.go:117] "RemoveContainer" containerID="7e544720b47e3de2af80246a8dd3f3782128310fa36a3f691c17130ffdb41c04"
Apr 22 19:00:15.776923 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:00:15.776908 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e544720b47e3de2af80246a8dd3f3782128310fa36a3f691c17130ffdb41c04\": container with ID starting with 7e544720b47e3de2af80246a8dd3f3782128310fa36a3f691c17130ffdb41c04 not found: ID does not exist" containerID="7e544720b47e3de2af80246a8dd3f3782128310fa36a3f691c17130ffdb41c04"
Apr 22 19:00:15.776982 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:15.776927 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e544720b47e3de2af80246a8dd3f3782128310fa36a3f691c17130ffdb41c04"} err="failed to get container status \"7e544720b47e3de2af80246a8dd3f3782128310fa36a3f691c17130ffdb41c04\": rpc error: code = NotFound desc = could not find container \"7e544720b47e3de2af80246a8dd3f3782128310fa36a3f691c17130ffdb41c04\": container with ID starting with 7e544720b47e3de2af80246a8dd3f3782128310fa36a3f691c17130ffdb41c04 not found: ID does not exist"
Apr 22 19:00:16.030751 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:16.030719 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"]
Apr 22 19:00:16.032856 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:16.032834 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-585488bf84z5tpx"]
Apr 22 19:00:16.830296 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:16.830267 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c8a9c8d-d360-468b-ba06-d1cd737afa7e" path="/var/lib/kubelet/pods/3c8a9c8d-d360-468b-ba06-d1cd737afa7e/volumes"
Apr 22 19:00:17.170462 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:17.170425 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:17.170634 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:17.170474 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:17.183018 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:17.182989 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:17.728528 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:17.728499 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"
Apr 22 19:00:22.932601 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:22.932569 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8"]
Apr 22 19:00:22.933126 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:22.933110 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c8a9c8d-d360-468b-ba06-d1cd737afa7e" containerName="storage-initializer"
Apr 22 19:00:22.933176 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:22.933129 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c8a9c8d-d360-468b-ba06-d1cd737afa7e" containerName="storage-initializer"
Apr 22 19:00:22.933176 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:22.933143 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c8a9c8d-d360-468b-ba06-d1cd737afa7e" containerName="main"
Apr 22 19:00:22.933176 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:22.933152 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c8a9c8d-d360-468b-ba06-d1cd737afa7e" containerName="main"
Apr 22 19:00:22.933270 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:22.933248 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c8a9c8d-d360-468b-ba06-d1cd737afa7e" containerName="main"
Apr 22 19:00:22.970025 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:22.969999 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8"]
Apr 22 19:00:22.970161 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:22.970043 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8"
Apr 22 19:00:22.972466 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:22.972442 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\""
Apr 22 19:00:23.020778 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.020748 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8"
Apr 22 19:00:23.020912 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.020787 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8"
Apr 22 19:00:23.020912 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.020878 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/173b13fc-b954-46a7-bf54-f12ad8319d0b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8"
Apr 22 19:00:23.021018 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.020919 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8kr9\" (UniqueName: \"kubernetes.io/projected/173b13fc-b954-46a7-bf54-f12ad8319d0b-kube-api-access-p8kr9\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8"
Apr 22 19:00:23.021018 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.020968 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8"
Apr 22 19:00:23.021084 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.021033 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8"
Apr 22 19:00:23.121345 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.121316 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8"
Apr 22 19:00:23.121493 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.121381 2566
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/173b13fc-b954-46a7-bf54-f12ad8319d0b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" Apr 22 19:00:23.121493 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.121407 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8kr9\" (UniqueName: \"kubernetes.io/projected/173b13fc-b954-46a7-bf54-f12ad8319d0b-kube-api-access-p8kr9\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" Apr 22 19:00:23.121493 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.121442 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" Apr 22 19:00:23.121655 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.121508 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" Apr 22 19:00:23.121655 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.121546 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"dshm\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" Apr 22 19:00:23.121812 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.121710 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" Apr 22 19:00:23.121812 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.121769 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" Apr 22 19:00:23.121922 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.121848 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" Apr 22 19:00:23.123680 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.123661 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" Apr 22 19:00:23.123996 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.123976 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/173b13fc-b954-46a7-bf54-f12ad8319d0b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" Apr 22 19:00:23.129264 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.129244 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8kr9\" (UniqueName: \"kubernetes.io/projected/173b13fc-b954-46a7-bf54-f12ad8319d0b-kube-api-access-p8kr9\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" Apr 22 19:00:23.279534 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.279452 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" Apr 22 19:00:23.403319 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.403292 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8"] Apr 22 19:00:23.405187 ip-10-0-130-234 kubenswrapper[2566]: W0422 19:00:23.405157 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod173b13fc_b954_46a7_bf54_f12ad8319d0b.slice/crio-aeba8fdc91d21e3eb418df84a4e0407d9c5a2fc7014d47d0b1ee40de99b243e1 WatchSource:0}: Error finding container aeba8fdc91d21e3eb418df84a4e0407d9c5a2fc7014d47d0b1ee40de99b243e1: Status 404 returned error can't find the container with id aeba8fdc91d21e3eb418df84a4e0407d9c5a2fc7014d47d0b1ee40de99b243e1 Apr 22 19:00:23.743616 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.743581 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" event={"ID":"173b13fc-b954-46a7-bf54-f12ad8319d0b","Type":"ContainerStarted","Data":"b5025625fbb96ccfeba56371de73c40cbebd03b03986edcce0f9d15c3644756b"} Apr 22 19:00:23.743616 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:23.743619 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" event={"ID":"173b13fc-b954-46a7-bf54-f12ad8319d0b","Type":"ContainerStarted","Data":"aeba8fdc91d21e3eb418df84a4e0407d9c5a2fc7014d47d0b1ee40de99b243e1"} Apr 22 19:00:27.763759 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:27.763729 2566 generic.go:358] "Generic (PLEG): container finished" podID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerID="b5025625fbb96ccfeba56371de73c40cbebd03b03986edcce0f9d15c3644756b" exitCode=0 Apr 22 19:00:27.764128 ip-10-0-130-234 kubenswrapper[2566]: I0422 
19:00:27.763805 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" event={"ID":"173b13fc-b954-46a7-bf54-f12ad8319d0b","Type":"ContainerDied","Data":"b5025625fbb96ccfeba56371de73c40cbebd03b03986edcce0f9d15c3644756b"} Apr 22 19:00:41.115079 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.115021 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"] Apr 22 19:00:41.115569 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.115340 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg" podUID="12c7aa67-2267-4d1a-b96f-32c56af9ebe9" containerName="main" containerID="cri-o://a8bec693a47618dbd43c0685e8cf27482e296489fd01a94ef4399069b45c6c08" gracePeriod=30 Apr 22 19:00:41.404741 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.404717 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg" Apr 22 19:00:41.501759 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.501727 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-model-cache\") pod \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " Apr 22 19:00:41.501956 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.501782 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-kserve-provision-location\") pod \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " Apr 22 19:00:41.501956 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.501824 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-dshm\") pod \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " Apr 22 19:00:41.501956 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.501870 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-home\") pod \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " Apr 22 19:00:41.501956 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.501899 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfqdk\" (UniqueName: \"kubernetes.io/projected/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-kube-api-access-qfqdk\") pod \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " Apr 22 19:00:41.502202 ip-10-0-130-234 
kubenswrapper[2566]: I0422 19:00:41.501999 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-tls-certs\") pod \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\" (UID: \"12c7aa67-2267-4d1a-b96f-32c56af9ebe9\") " Apr 22 19:00:41.502202 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.502037 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-model-cache" (OuterVolumeSpecName: "model-cache") pod "12c7aa67-2267-4d1a-b96f-32c56af9ebe9" (UID: "12c7aa67-2267-4d1a-b96f-32c56af9ebe9"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:41.502311 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.502278 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-home" (OuterVolumeSpecName: "home") pod "12c7aa67-2267-4d1a-b96f-32c56af9ebe9" (UID: "12c7aa67-2267-4d1a-b96f-32c56af9ebe9"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:41.502373 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.502335 2566 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-model-cache\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:00:41.504515 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.504487 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-dshm" (OuterVolumeSpecName: "dshm") pod "12c7aa67-2267-4d1a-b96f-32c56af9ebe9" (UID: "12c7aa67-2267-4d1a-b96f-32c56af9ebe9"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:41.504661 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.504597 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-kube-api-access-qfqdk" (OuterVolumeSpecName: "kube-api-access-qfqdk") pod "12c7aa67-2267-4d1a-b96f-32c56af9ebe9" (UID: "12c7aa67-2267-4d1a-b96f-32c56af9ebe9"). InnerVolumeSpecName "kube-api-access-qfqdk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:00:41.504750 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.504723 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "12c7aa67-2267-4d1a-b96f-32c56af9ebe9" (UID: "12c7aa67-2267-4d1a-b96f-32c56af9ebe9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:00:41.566513 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.566456 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "12c7aa67-2267-4d1a-b96f-32c56af9ebe9" (UID: "12c7aa67-2267-4d1a-b96f-32c56af9ebe9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:00:41.603235 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.603199 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-kserve-provision-location\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:00:41.603235 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.603233 2566 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-dshm\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:00:41.603443 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.603248 2566 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-home\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:00:41.603443 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.603260 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qfqdk\" (UniqueName: \"kubernetes.io/projected/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-kube-api-access-qfqdk\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:00:41.603443 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.603273 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12c7aa67-2267-4d1a-b96f-32c56af9ebe9-tls-certs\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:00:41.825136 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.825053 2566 generic.go:358] "Generic (PLEG): container finished" podID="12c7aa67-2267-4d1a-b96f-32c56af9ebe9" containerID="a8bec693a47618dbd43c0685e8cf27482e296489fd01a94ef4399069b45c6c08" exitCode=0 Apr 22 19:00:41.825136 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.825099 2566 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg" event={"ID":"12c7aa67-2267-4d1a-b96f-32c56af9ebe9","Type":"ContainerDied","Data":"a8bec693a47618dbd43c0685e8cf27482e296489fd01a94ef4399069b45c6c08"} Apr 22 19:00:41.825136 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.825122 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg" event={"ID":"12c7aa67-2267-4d1a-b96f-32c56af9ebe9","Type":"ContainerDied","Data":"c76a3a7f2b3f71aa7f010546e8686794f4a0bb9a46d5003164d4275b97876431"} Apr 22 19:00:41.825136 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.825121 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg" Apr 22 19:00:41.825439 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.825135 2566 scope.go:117] "RemoveContainer" containerID="a8bec693a47618dbd43c0685e8cf27482e296489fd01a94ef4399069b45c6c08" Apr 22 19:00:41.834454 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.834432 2566 scope.go:117] "RemoveContainer" containerID="420785f08af913bd32ce54a1c7ee7a11ad12a7dff0025cb85807554eefe6dfcd" Apr 22 19:00:41.845280 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.845260 2566 scope.go:117] "RemoveContainer" containerID="a8bec693a47618dbd43c0685e8cf27482e296489fd01a94ef4399069b45c6c08" Apr 22 19:00:41.845533 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:00:41.845516 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8bec693a47618dbd43c0685e8cf27482e296489fd01a94ef4399069b45c6c08\": container with ID starting with a8bec693a47618dbd43c0685e8cf27482e296489fd01a94ef4399069b45c6c08 not found: ID does not exist" containerID="a8bec693a47618dbd43c0685e8cf27482e296489fd01a94ef4399069b45c6c08" Apr 22 19:00:41.845577 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.845542 
2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8bec693a47618dbd43c0685e8cf27482e296489fd01a94ef4399069b45c6c08"} err="failed to get container status \"a8bec693a47618dbd43c0685e8cf27482e296489fd01a94ef4399069b45c6c08\": rpc error: code = NotFound desc = could not find container \"a8bec693a47618dbd43c0685e8cf27482e296489fd01a94ef4399069b45c6c08\": container with ID starting with a8bec693a47618dbd43c0685e8cf27482e296489fd01a94ef4399069b45c6c08 not found: ID does not exist" Apr 22 19:00:41.845577 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.845558 2566 scope.go:117] "RemoveContainer" containerID="420785f08af913bd32ce54a1c7ee7a11ad12a7dff0025cb85807554eefe6dfcd" Apr 22 19:00:41.845770 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:00:41.845756 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"420785f08af913bd32ce54a1c7ee7a11ad12a7dff0025cb85807554eefe6dfcd\": container with ID starting with 420785f08af913bd32ce54a1c7ee7a11ad12a7dff0025cb85807554eefe6dfcd not found: ID does not exist" containerID="420785f08af913bd32ce54a1c7ee7a11ad12a7dff0025cb85807554eefe6dfcd" Apr 22 19:00:41.845812 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.845772 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"420785f08af913bd32ce54a1c7ee7a11ad12a7dff0025cb85807554eefe6dfcd"} err="failed to get container status \"420785f08af913bd32ce54a1c7ee7a11ad12a7dff0025cb85807554eefe6dfcd\": rpc error: code = NotFound desc = could not find container \"420785f08af913bd32ce54a1c7ee7a11ad12a7dff0025cb85807554eefe6dfcd\": container with ID starting with 420785f08af913bd32ce54a1c7ee7a11ad12a7dff0025cb85807554eefe6dfcd not found: ID does not exist" Apr 22 19:00:41.851463 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.851442 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"] Apr 22 19:00:41.853816 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:41.853794 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5f88bfbb4-fnpsg"] Apr 22 19:00:42.831139 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:00:42.831104 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12c7aa67-2267-4d1a-b96f-32c56af9ebe9" path="/var/lib/kubelet/pods/12c7aa67-2267-4d1a-b96f-32c56af9ebe9/volumes" Apr 22 19:01:06.334636 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.334583 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt"] Apr 22 19:01:06.335215 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.335196 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12c7aa67-2267-4d1a-b96f-32c56af9ebe9" containerName="storage-initializer" Apr 22 19:01:06.335272 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.335220 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c7aa67-2267-4d1a-b96f-32c56af9ebe9" containerName="storage-initializer" Apr 22 19:01:06.335272 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.335235 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12c7aa67-2267-4d1a-b96f-32c56af9ebe9" containerName="main" Apr 22 19:01:06.335272 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.335244 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c7aa67-2267-4d1a-b96f-32c56af9ebe9" containerName="main" Apr 22 19:01:06.335380 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.335345 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="12c7aa67-2267-4d1a-b96f-32c56af9ebe9" containerName="main" Apr 22 19:01:06.386228 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.386190 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt"] Apr 22 19:01:06.386413 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.386351 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.388842 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.388817 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 22 19:01:06.427705 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.427399 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-home\") pod \"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.427705 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.427440 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-model-cache\") pod \"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.427705 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.427470 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgtfm\" (UniqueName: \"kubernetes.io/projected/97addf7b-b38c-4822-8a1b-7b598fb0019f-kube-api-access-sgtfm\") pod \"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.427705 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.427494 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-dshm\") pod \"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.427705 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.427525 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs\") pod \"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.427705 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.427615 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-kserve-provision-location\") pod \"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.528836 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.528800 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-kserve-provision-location\") pod \"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.529048 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.528863 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-home\") pod \"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.529048 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.528892 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-model-cache\") pod \"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.529048 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.528913 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgtfm\" (UniqueName: \"kubernetes.io/projected/97addf7b-b38c-4822-8a1b-7b598fb0019f-kube-api-access-sgtfm\") pod \"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.529048 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.528962 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-dshm\") pod \"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.529048 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.528992 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs\") pod \"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 
19:01:06.529320 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.529233 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-kserve-provision-location\") pod \"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.529572 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.529533 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-model-cache\") pod \"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.529774 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.529751 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-home\") pod \"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.531549 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.531528 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-dshm\") pod \"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.532207 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.532184 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs\") pod 
\"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.541423 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.541398 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgtfm\" (UniqueName: \"kubernetes.io/projected/97addf7b-b38c-4822-8a1b-7b598fb0019f-kube-api-access-sgtfm\") pod \"stop-feature-test-kserve-59f568845f-hcmqt\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:06.698899 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:06.698847 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:14.487388 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:14.487361 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt"] Apr 22 19:01:14.516507 ip-10-0-130-234 kubenswrapper[2566]: W0422 19:01:14.516477 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97addf7b_b38c_4822_8a1b_7b598fb0019f.slice/crio-02b251d142b4deb62e825da8dfdb432dc7b3c43daf8cccc06cb2fdb95f7f0af2 WatchSource:0}: Error finding container 02b251d142b4deb62e825da8dfdb432dc7b3c43daf8cccc06cb2fdb95f7f0af2: Status 404 returned error can't find the container with id 02b251d142b4deb62e825da8dfdb432dc7b3c43daf8cccc06cb2fdb95f7f0af2 Apr 22 19:01:14.962692 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:14.962600 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" event={"ID":"173b13fc-b954-46a7-bf54-f12ad8319d0b","Type":"ContainerStarted","Data":"9d2a5794dfe0ca1bb3b91297e0b70f32383d84c87b74d79d79efc1a64688722c"} Apr 22 
19:01:14.964319 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:14.964279 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" event={"ID":"97addf7b-b38c-4822-8a1b-7b598fb0019f","Type":"ContainerStarted","Data":"bc6644e6073b8e9c7bca8b09707aac588175c7f0905188cb362fe392a76eb29d"} Apr 22 19:01:14.964319 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:14.964317 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" event={"ID":"97addf7b-b38c-4822-8a1b-7b598fb0019f","Type":"ContainerStarted","Data":"02b251d142b4deb62e825da8dfdb432dc7b3c43daf8cccc06cb2fdb95f7f0af2"} Apr 22 19:01:14.982431 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:14.982380 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" podStartSLOduration=6.136271691 podStartE2EDuration="52.982365622s" podCreationTimestamp="2026-04-22 19:00:22 +0000 UTC" firstStartedPulling="2026-04-22 19:00:27.764822463 +0000 UTC m=+829.543655926" lastFinishedPulling="2026-04-22 19:01:14.610916381 +0000 UTC m=+876.389749857" observedRunningTime="2026-04-22 19:01:14.980921002 +0000 UTC m=+876.759754488" watchObservedRunningTime="2026-04-22 19:01:14.982365622 +0000 UTC m=+876.761199107" Apr 22 19:01:19.986311 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:19.986275 2566 generic.go:358] "Generic (PLEG): container finished" podID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerID="bc6644e6073b8e9c7bca8b09707aac588175c7f0905188cb362fe392a76eb29d" exitCode=0 Apr 22 19:01:19.986798 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:19.986350 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" 
event={"ID":"97addf7b-b38c-4822-8a1b-7b598fb0019f","Type":"ContainerDied","Data":"bc6644e6073b8e9c7bca8b09707aac588175c7f0905188cb362fe392a76eb29d"} Apr 22 19:01:20.992836 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:20.992797 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" event={"ID":"97addf7b-b38c-4822-8a1b-7b598fb0019f","Type":"ContainerStarted","Data":"f553e4a9266b7ac79a789c0aebaab8e476703b100e2bf43c636472c9d60e784d"} Apr 22 19:01:21.011738 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:21.011675 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" podStartSLOduration=15.01165367 podStartE2EDuration="15.01165367s" podCreationTimestamp="2026-04-22 19:01:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:01:21.009377121 +0000 UTC m=+882.788210612" watchObservedRunningTime="2026-04-22 19:01:21.01165367 +0000 UTC m=+882.790487160" Apr 22 19:01:23.279896 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:23.279843 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" Apr 22 19:01:23.279896 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:23.279899 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" Apr 22 19:01:23.281581 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:23.281518 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.48:8000/health\": dial tcp 10.132.0.48:8000: 
connect: connection refused" Apr 22 19:01:26.699781 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:26.699733 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:26.700265 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:26.699795 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:01:26.701279 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:26.701251 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.49:8000/health\": dial tcp 10.132.0.49:8000: connect: connection refused" Apr 22 19:01:33.280226 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:33.280133 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.48:8000/health\": dial tcp 10.132.0.48:8000: connect: connection refused" Apr 22 19:01:36.699834 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:36.699777 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.49:8000/health\": dial tcp 10.132.0.49:8000: connect: connection refused" Apr 22 19:01:38.765230 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:38.765200 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4jzwx_4871a747-c041-430f-9c1a-3ab2ce6d0156/console-operator/1.log" Apr 22 19:01:38.771637 
ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:38.771617 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4jzwx_4871a747-c041-430f-9c1a-3ab2ce6d0156/console-operator/1.log" Apr 22 19:01:43.280118 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:43.280076 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.48:8000/health\": dial tcp 10.132.0.48:8000: connect: connection refused" Apr 22 19:01:46.699373 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:46.699318 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.49:8000/health\": dial tcp 10.132.0.49:8000: connect: connection refused" Apr 22 19:01:53.280496 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:53.280438 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.48:8000/health\": dial tcp 10.132.0.48:8000: connect: connection refused" Apr 22 19:01:56.699765 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:01:56.699706 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.49:8000/health\": dial tcp 10.132.0.49:8000: connect: connection refused" Apr 22 19:02:03.280366 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:02:03.280311 2566 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.48:8000/health\": dial tcp 10.132.0.48:8000: connect: connection refused" Apr 22 19:02:06.700063 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:02:06.700005 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.49:8000/health\": dial tcp 10.132.0.49:8000: connect: connection refused" Apr 22 19:02:13.280344 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:02:13.280301 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.48:8000/health\": dial tcp 10.132.0.48:8000: connect: connection refused" Apr 22 19:02:16.699623 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:02:16.699571 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.49:8000/health\": dial tcp 10.132.0.49:8000: connect: connection refused" Apr 22 19:02:23.280320 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:02:23.280279 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.48:8000/health\": dial tcp 10.132.0.48:8000: connect: connection refused" Apr 22 19:02:26.699843 ip-10-0-130-234 kubenswrapper[2566]: 
I0422 19:02:26.699797 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.49:8000/health\": dial tcp 10.132.0.49:8000: connect: connection refused" Apr 22 19:02:33.280682 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:02:33.280634 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.48:8000/health\": dial tcp 10.132.0.48:8000: connect: connection refused" Apr 22 19:02:36.699455 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:02:36.699407 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.49:8000/health\": dial tcp 10.132.0.49:8000: connect: connection refused" Apr 22 19:02:43.280475 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:02:43.280434 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.48:8000/health\": dial tcp 10.132.0.48:8000: connect: connection refused" Apr 22 19:02:46.699552 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:02:46.699504 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.49:8000/health\": dial tcp 10.132.0.49:8000: connect: connection refused" Apr 22 19:02:53.280774 
ip-10-0-130-234 kubenswrapper[2566]: I0422 19:02:53.280729 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.48:8000/health\": dial tcp 10.132.0.48:8000: connect: connection refused" Apr 22 19:02:56.699726 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:02:56.699685 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.49:8000/health\": dial tcp 10.132.0.49:8000: connect: connection refused" Apr 22 19:03:03.280504 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:03.280402 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.48:8000/health\": dial tcp 10.132.0.48:8000: connect: connection refused" Apr 22 19:03:06.699765 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:06.699718 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.49:8000/health\": dial tcp 10.132.0.49:8000: connect: connection refused" Apr 22 19:03:13.280626 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:13.280574 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.48:8000/health\": dial tcp 
10.132.0.48:8000: connect: connection refused" Apr 22 19:03:16.699725 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:16.699679 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.49:8000/health\": dial tcp 10.132.0.49:8000: connect: connection refused" Apr 22 19:03:23.296137 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:23.296106 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" Apr 22 19:03:23.304026 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:23.304002 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" Apr 22 19:03:26.708889 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:26.708856 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:03:26.716696 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:26.716662 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:03:27.509685 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:27.509652 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt"] Apr 22 19:03:27.602248 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:03:27.602208 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 22 19:03:27.602430 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:03:27.602303 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs podName:97addf7b-b38c-4822-8a1b-7b598fb0019f nodeName:}" failed. No retries permitted until 2026-04-22 19:03:28.10227987 +0000 UTC m=+1009.881113338 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs") pod "stop-feature-test-kserve-59f568845f-hcmqt" (UID: "97addf7b-b38c-4822-8a1b-7b598fb0019f") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 22 19:03:28.106986 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:03:28.106927 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 22 19:03:28.107369 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:03:28.107031 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs podName:97addf7b-b38c-4822-8a1b-7b598fb0019f nodeName:}" failed. No retries permitted until 2026-04-22 19:03:29.10701309 +0000 UTC m=+1010.885846557 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs") pod "stop-feature-test-kserve-59f568845f-hcmqt" (UID: "97addf7b-b38c-4822-8a1b-7b598fb0019f") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 22 19:03:28.504891 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:28.504835 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="main" containerID="cri-o://f553e4a9266b7ac79a789c0aebaab8e476703b100e2bf43c636472c9d60e784d" gracePeriod=30 Apr 22 19:03:29.114059 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:03:29.114025 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 22 19:03:29.114428 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:03:29.114094 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs podName:97addf7b-b38c-4822-8a1b-7b598fb0019f nodeName:}" failed. No retries permitted until 2026-04-22 19:03:31.114078323 +0000 UTC m=+1012.892911787 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs") pod "stop-feature-test-kserve-59f568845f-hcmqt" (UID: "97addf7b-b38c-4822-8a1b-7b598fb0019f") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 22 19:03:31.132584 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:03:31.132553 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 22 19:03:31.132974 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:03:31.132648 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs podName:97addf7b-b38c-4822-8a1b-7b598fb0019f nodeName:}" failed. No retries permitted until 2026-04-22 19:03:35.132633093 +0000 UTC m=+1016.911466557 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs") pod "stop-feature-test-kserve-59f568845f-hcmqt" (UID: "97addf7b-b38c-4822-8a1b-7b598fb0019f") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 22 19:03:32.737648 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:32.737613 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8"] Apr 22 19:03:32.738063 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:32.737920 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="main" containerID="cri-o://9d2a5794dfe0ca1bb3b91297e0b70f32383d84c87b74d79d79efc1a64688722c" gracePeriod=30 Apr 22 19:03:35.171266 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:03:35.171229 2566 secret.go:189] Couldn't get secret 
kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 22 19:03:35.171724 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:03:35.171305 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs podName:97addf7b-b38c-4822-8a1b-7b598fb0019f nodeName:}" failed. No retries permitted until 2026-04-22 19:03:43.171285607 +0000 UTC m=+1024.950119071 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs") pod "stop-feature-test-kserve-59f568845f-hcmqt" (UID: "97addf7b-b38c-4822-8a1b-7b598fb0019f") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 22 19:03:43.244866 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:03:43.244825 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 22 19:03:43.245348 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:03:43.244921 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs podName:97addf7b-b38c-4822-8a1b-7b598fb0019f nodeName:}" failed. No retries permitted until 2026-04-22 19:03:59.244899904 +0000 UTC m=+1041.023733384 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs") pod "stop-feature-test-kserve-59f568845f-hcmqt" (UID: "97addf7b-b38c-4822-8a1b-7b598fb0019f") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 22 19:03:58.622767 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.622734 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-59f568845f-hcmqt_97addf7b-b38c-4822-8a1b-7b598fb0019f/main/0.log" Apr 22 19:03:58.623192 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.623165 2566 generic.go:358] "Generic (PLEG): container finished" podID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerID="f553e4a9266b7ac79a789c0aebaab8e476703b100e2bf43c636472c9d60e784d" exitCode=137 Apr 22 19:03:58.623253 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.623232 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" event={"ID":"97addf7b-b38c-4822-8a1b-7b598fb0019f","Type":"ContainerDied","Data":"f553e4a9266b7ac79a789c0aebaab8e476703b100e2bf43c636472c9d60e784d"} Apr 22 19:03:58.719294 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.719273 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-59f568845f-hcmqt_97addf7b-b38c-4822-8a1b-7b598fb0019f/main/0.log" Apr 22 19:03:58.719636 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.719619 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" Apr 22 19:03:58.885508 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.885471 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs\") pod \"97addf7b-b38c-4822-8a1b-7b598fb0019f\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " Apr 22 19:03:58.885696 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.885545 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-home\") pod \"97addf7b-b38c-4822-8a1b-7b598fb0019f\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " Apr 22 19:03:58.885696 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.885581 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-model-cache\") pod \"97addf7b-b38c-4822-8a1b-7b598fb0019f\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " Apr 22 19:03:58.885696 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.885605 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-dshm\") pod \"97addf7b-b38c-4822-8a1b-7b598fb0019f\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " Apr 22 19:03:58.885696 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.885661 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-kserve-provision-location\") pod \"97addf7b-b38c-4822-8a1b-7b598fb0019f\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " Apr 22 19:03:58.885905 ip-10-0-130-234 kubenswrapper[2566]: I0422 
19:03:58.885738 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgtfm\" (UniqueName: \"kubernetes.io/projected/97addf7b-b38c-4822-8a1b-7b598fb0019f-kube-api-access-sgtfm\") pod \"97addf7b-b38c-4822-8a1b-7b598fb0019f\" (UID: \"97addf7b-b38c-4822-8a1b-7b598fb0019f\") " Apr 22 19:03:58.885982 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.885865 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-model-cache" (OuterVolumeSpecName: "model-cache") pod "97addf7b-b38c-4822-8a1b-7b598fb0019f" (UID: "97addf7b-b38c-4822-8a1b-7b598fb0019f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:03:58.886032 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.886019 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-home" (OuterVolumeSpecName: "home") pod "97addf7b-b38c-4822-8a1b-7b598fb0019f" (UID: "97addf7b-b38c-4822-8a1b-7b598fb0019f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:03:58.886103 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.886041 2566 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-model-cache\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:03:58.887849 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.887819 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97addf7b-b38c-4822-8a1b-7b598fb0019f-kube-api-access-sgtfm" (OuterVolumeSpecName: "kube-api-access-sgtfm") pod "97addf7b-b38c-4822-8a1b-7b598fb0019f" (UID: "97addf7b-b38c-4822-8a1b-7b598fb0019f"). InnerVolumeSpecName "kube-api-access-sgtfm". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:03:58.887980 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.887897 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-dshm" (OuterVolumeSpecName: "dshm") pod "97addf7b-b38c-4822-8a1b-7b598fb0019f" (UID: "97addf7b-b38c-4822-8a1b-7b598fb0019f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:03:58.888117 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.888095 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "97addf7b-b38c-4822-8a1b-7b598fb0019f" (UID: "97addf7b-b38c-4822-8a1b-7b598fb0019f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:03:58.941902 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.941869 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "97addf7b-b38c-4822-8a1b-7b598fb0019f" (UID: "97addf7b-b38c-4822-8a1b-7b598fb0019f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:03:58.986898 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.986871 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-kserve-provision-location\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:03:58.986898 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.986894 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sgtfm\" (UniqueName: \"kubernetes.io/projected/97addf7b-b38c-4822-8a1b-7b598fb0019f-kube-api-access-sgtfm\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:03:58.986898 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.986904 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/97addf7b-b38c-4822-8a1b-7b598fb0019f-tls-certs\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:03:58.987126 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.986914 2566 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-home\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:03:58.987126 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:58.986922 2566 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/97addf7b-b38c-4822-8a1b-7b598fb0019f-dshm\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:03:59.430101 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.430068 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"]
Apr 22 19:03:59.430433 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.430420 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="storage-initializer"
Apr 22 19:03:59.430479 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.430435 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="storage-initializer"
Apr 22 19:03:59.430479 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.430453 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="main"
Apr 22 19:03:59.430479 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.430459 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="main"
Apr 22 19:03:59.430579 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.430528 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" containerName="main"
Apr 22 19:03:59.433818 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.433800 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.445328 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.445303 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"]
Apr 22 19:03:59.593140 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.593093 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-kserve-provision-location\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.593140 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.593140 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg4dv\" (UniqueName: \"kubernetes.io/projected/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-kube-api-access-fg4dv\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.593351 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.593162 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-model-cache\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.593351 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.593218 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-dshm\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.593351 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.593277 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-home\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.593351 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.593306 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-tls-certs\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.629078 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.629053 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-59f568845f-hcmqt_97addf7b-b38c-4822-8a1b-7b598fb0019f/main/0.log"
Apr 22 19:03:59.629467 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.629403 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt" event={"ID":"97addf7b-b38c-4822-8a1b-7b598fb0019f","Type":"ContainerDied","Data":"02b251d142b4deb62e825da8dfdb432dc7b3c43daf8cccc06cb2fdb95f7f0af2"}
Apr 22 19:03:59.629467 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.629442 2566 scope.go:117] "RemoveContainer" containerID="f553e4a9266b7ac79a789c0aebaab8e476703b100e2bf43c636472c9d60e784d"
Apr 22 19:03:59.629552 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.629464 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt"
Apr 22 19:03:59.651442 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.651412 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt"]
Apr 22 19:03:59.652099 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.652071 2566 scope.go:117] "RemoveContainer" containerID="bc6644e6073b8e9c7bca8b09707aac588175c7f0905188cb362fe392a76eb29d"
Apr 22 19:03:59.666383 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.666350 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-hcmqt"]
Apr 22 19:03:59.693899 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.693823 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-home\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.693899 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.693873 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-tls-certs\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.694116 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.693909 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-kserve-provision-location\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.694116 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.693927 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fg4dv\" (UniqueName: \"kubernetes.io/projected/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-kube-api-access-fg4dv\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.694116 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.693962 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-model-cache\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.694116 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.694022 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-dshm\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.694450 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.694278 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-home\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.694450 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.694326 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-kserve-provision-location\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.694450 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.694358 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-model-cache\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.696244 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.696225 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-dshm\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.696418 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.696401 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-tls-certs\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.701326 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.701301 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg4dv\" (UniqueName: \"kubernetes.io/projected/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-kube-api-access-fg4dv\") pod \"stop-feature-test-kserve-59f568845f-rpmz9\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.744551 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.744527 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:03:59.866722 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.866692 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"]
Apr 22 19:03:59.870017 ip-10-0-130-234 kubenswrapper[2566]: W0422 19:03:59.869990 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc4f9d14_6397_45c2_a3c1_f515114ba3c2.slice/crio-59b0cecd92f0aa9ee8d4c12757056522dda173ddd39ce166457ff8b4958b2f1f WatchSource:0}: Error finding container 59b0cecd92f0aa9ee8d4c12757056522dda173ddd39ce166457ff8b4958b2f1f: Status 404 returned error can't find the container with id 59b0cecd92f0aa9ee8d4c12757056522dda173ddd39ce166457ff8b4958b2f1f
Apr 22 19:03:59.871823 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:03:59.871807 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:04:00.636582 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:00.636546 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" event={"ID":"fc4f9d14-6397-45c2-a3c1-f515114ba3c2","Type":"ContainerStarted","Data":"fea553e60d90453285ab9bb8d92cecb316bf43e4735a1ad034189148f4e8843d"}
Apr 22 19:04:00.636582 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:00.636586 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" event={"ID":"fc4f9d14-6397-45c2-a3c1-f515114ba3c2","Type":"ContainerStarted","Data":"59b0cecd92f0aa9ee8d4c12757056522dda173ddd39ce166457ff8b4958b2f1f"}
Apr 22 19:04:00.831337 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:00.831300 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97addf7b-b38c-4822-8a1b-7b598fb0019f" path="/var/lib/kubelet/pods/97addf7b-b38c-4822-8a1b-7b598fb0019f/volumes"
Apr 22 19:04:03.004391 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.004270 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8_173b13fc-b954-46a7-bf54-f12ad8319d0b/main/0.log"
Apr 22 19:04:03.004861 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.004681 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8"
Apr 22 19:04:03.128448 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.128394 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-home\") pod \"173b13fc-b954-46a7-bf54-f12ad8319d0b\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") "
Apr 22 19:04:03.128448 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.128449 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-dshm\") pod \"173b13fc-b954-46a7-bf54-f12ad8319d0b\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") "
Apr 22 19:04:03.128742 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.128525 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/173b13fc-b954-46a7-bf54-f12ad8319d0b-tls-certs\") pod \"173b13fc-b954-46a7-bf54-f12ad8319d0b\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") "
Apr 22 19:04:03.128742 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.128576 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8kr9\" (UniqueName: \"kubernetes.io/projected/173b13fc-b954-46a7-bf54-f12ad8319d0b-kube-api-access-p8kr9\") pod \"173b13fc-b954-46a7-bf54-f12ad8319d0b\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") "
Apr 22 19:04:03.128742 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.128624 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-kserve-provision-location\") pod \"173b13fc-b954-46a7-bf54-f12ad8319d0b\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") "
Apr 22 19:04:03.128742 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.128648 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-model-cache\") pod \"173b13fc-b954-46a7-bf54-f12ad8319d0b\" (UID: \"173b13fc-b954-46a7-bf54-f12ad8319d0b\") "
Apr 22 19:04:03.129023 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.128808 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-home" (OuterVolumeSpecName: "home") pod "173b13fc-b954-46a7-bf54-f12ad8319d0b" (UID: "173b13fc-b954-46a7-bf54-f12ad8319d0b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:04:03.129023 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.128983 2566 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-home\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:04:03.129262 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.129092 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-model-cache" (OuterVolumeSpecName: "model-cache") pod "173b13fc-b954-46a7-bf54-f12ad8319d0b" (UID: "173b13fc-b954-46a7-bf54-f12ad8319d0b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:04:03.131159 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.131032 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-dshm" (OuterVolumeSpecName: "dshm") pod "173b13fc-b954-46a7-bf54-f12ad8319d0b" (UID: "173b13fc-b954-46a7-bf54-f12ad8319d0b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:04:03.131159 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.131128 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/173b13fc-b954-46a7-bf54-f12ad8319d0b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "173b13fc-b954-46a7-bf54-f12ad8319d0b" (UID: "173b13fc-b954-46a7-bf54-f12ad8319d0b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:04:03.131159 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.131159 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/173b13fc-b954-46a7-bf54-f12ad8319d0b-kube-api-access-p8kr9" (OuterVolumeSpecName: "kube-api-access-p8kr9") pod "173b13fc-b954-46a7-bf54-f12ad8319d0b" (UID: "173b13fc-b954-46a7-bf54-f12ad8319d0b"). InnerVolumeSpecName "kube-api-access-p8kr9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:04:03.184496 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.184453 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "173b13fc-b954-46a7-bf54-f12ad8319d0b" (UID: "173b13fc-b954-46a7-bf54-f12ad8319d0b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:04:03.229441 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.229401 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/173b13fc-b954-46a7-bf54-f12ad8319d0b-tls-certs\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:04:03.229441 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.229433 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p8kr9\" (UniqueName: \"kubernetes.io/projected/173b13fc-b954-46a7-bf54-f12ad8319d0b-kube-api-access-p8kr9\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:04:03.229441 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.229445 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-kserve-provision-location\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:04:03.229686 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.229454 2566 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-model-cache\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:04:03.229686 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.229464 2566 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/173b13fc-b954-46a7-bf54-f12ad8319d0b-dshm\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:04:03.652589 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.652558 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8_173b13fc-b954-46a7-bf54-f12ad8319d0b/main/0.log"
Apr 22 19:04:03.652916 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.652894 2566 generic.go:358] "Generic (PLEG): container finished" podID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerID="9d2a5794dfe0ca1bb3b91297e0b70f32383d84c87b74d79d79efc1a64688722c" exitCode=137
Apr 22 19:04:03.653012 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.652982 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8"
Apr 22 19:04:03.653053 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.653027 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" event={"ID":"173b13fc-b954-46a7-bf54-f12ad8319d0b","Type":"ContainerDied","Data":"9d2a5794dfe0ca1bb3b91297e0b70f32383d84c87b74d79d79efc1a64688722c"}
Apr 22 19:04:03.653089 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.653069 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8" event={"ID":"173b13fc-b954-46a7-bf54-f12ad8319d0b","Type":"ContainerDied","Data":"aeba8fdc91d21e3eb418df84a4e0407d9c5a2fc7014d47d0b1ee40de99b243e1"}
Apr 22 19:04:03.653138 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.653095 2566 scope.go:117] "RemoveContainer" containerID="9d2a5794dfe0ca1bb3b91297e0b70f32383d84c87b74d79d79efc1a64688722c"
Apr 22 19:04:03.672381 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.672361 2566 scope.go:117] "RemoveContainer" containerID="b5025625fbb96ccfeba56371de73c40cbebd03b03986edcce0f9d15c3644756b"
Apr 22 19:04:03.674750 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.674723 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8"]
Apr 22 19:04:03.680147 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.680122 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-555bcd9557bwpp8"]
Apr 22 19:04:03.739295 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.739272 2566 scope.go:117] "RemoveContainer" containerID="9d2a5794dfe0ca1bb3b91297e0b70f32383d84c87b74d79d79efc1a64688722c"
Apr 22 19:04:03.739658 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:04:03.739633 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d2a5794dfe0ca1bb3b91297e0b70f32383d84c87b74d79d79efc1a64688722c\": container with ID starting with 9d2a5794dfe0ca1bb3b91297e0b70f32383d84c87b74d79d79efc1a64688722c not found: ID does not exist" containerID="9d2a5794dfe0ca1bb3b91297e0b70f32383d84c87b74d79d79efc1a64688722c"
Apr 22 19:04:03.739761 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.739668 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d2a5794dfe0ca1bb3b91297e0b70f32383d84c87b74d79d79efc1a64688722c"} err="failed to get container status \"9d2a5794dfe0ca1bb3b91297e0b70f32383d84c87b74d79d79efc1a64688722c\": rpc error: code = NotFound desc = could not find container \"9d2a5794dfe0ca1bb3b91297e0b70f32383d84c87b74d79d79efc1a64688722c\": container with ID starting with 9d2a5794dfe0ca1bb3b91297e0b70f32383d84c87b74d79d79efc1a64688722c not found: ID does not exist"
Apr 22 19:04:03.739761 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.739696 2566 scope.go:117] "RemoveContainer" containerID="b5025625fbb96ccfeba56371de73c40cbebd03b03986edcce0f9d15c3644756b"
Apr 22 19:04:03.740084 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:04:03.740053 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5025625fbb96ccfeba56371de73c40cbebd03b03986edcce0f9d15c3644756b\": container with ID starting with b5025625fbb96ccfeba56371de73c40cbebd03b03986edcce0f9d15c3644756b not found: ID does not exist" containerID="b5025625fbb96ccfeba56371de73c40cbebd03b03986edcce0f9d15c3644756b"
Apr 22 19:04:03.740148 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:03.740086 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5025625fbb96ccfeba56371de73c40cbebd03b03986edcce0f9d15c3644756b"} err="failed to get container status \"b5025625fbb96ccfeba56371de73c40cbebd03b03986edcce0f9d15c3644756b\": rpc error: code = NotFound desc = could not find container \"b5025625fbb96ccfeba56371de73c40cbebd03b03986edcce0f9d15c3644756b\": container with ID starting with b5025625fbb96ccfeba56371de73c40cbebd03b03986edcce0f9d15c3644756b not found: ID does not exist"
Apr 22 19:04:04.662361 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:04.662283 2566 generic.go:358] "Generic (PLEG): container finished" podID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" containerID="fea553e60d90453285ab9bb8d92cecb316bf43e4735a1ad034189148f4e8843d" exitCode=0
Apr 22 19:04:04.662361 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:04.662320 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" event={"ID":"fc4f9d14-6397-45c2-a3c1-f515114ba3c2","Type":"ContainerDied","Data":"fea553e60d90453285ab9bb8d92cecb316bf43e4735a1ad034189148f4e8843d"}
Apr 22 19:04:04.829508 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:04.829475 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" path="/var/lib/kubelet/pods/173b13fc-b954-46a7-bf54-f12ad8319d0b/volumes"
Apr 22 19:04:05.668033 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:05.667992 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" event={"ID":"fc4f9d14-6397-45c2-a3c1-f515114ba3c2","Type":"ContainerStarted","Data":"a3f3d8ada9d7ebb2a0637357502a08815165a2ac567c6fcd77dafa4bd7d4eb37"}
Apr 22 19:04:05.687784 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:05.687737 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" podStartSLOduration=6.687722115 podStartE2EDuration="6.687722115s" podCreationTimestamp="2026-04-22 19:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:04:05.684118216 +0000 UTC m=+1047.462951701" watchObservedRunningTime="2026-04-22 19:04:05.687722115 +0000 UTC m=+1047.466555600"
Apr 22 19:04:09.744858 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:09.744813 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:04:09.744858 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:09.744863 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:04:09.746369 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:09.746337 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" podUID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.50:8000/health\": dial tcp 10.132.0.50:8000: connect: connection refused"
Apr 22 19:04:19.745821 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:19.745768 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" podUID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.50:8000/health\": dial tcp 10.132.0.50:8000: connect: connection refused"
Apr 22 19:04:29.745369 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:29.745272 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" podUID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.50:8000/health\": dial tcp 10.132.0.50:8000: connect: connection refused"
Apr 22 19:04:39.745318 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:39.745267 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" podUID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.50:8000/health\": dial tcp 10.132.0.50:8000: connect: connection refused"
Apr 22 19:04:49.745612 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:49.745552 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" podUID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.50:8000/health\": dial tcp 10.132.0.50:8000: connect: connection refused"
Apr 22 19:04:59.745712 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:04:59.745664 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" podUID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.50:8000/health\": dial tcp 10.132.0.50:8000: connect: connection refused"
Apr 22 19:05:09.744991 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:05:09.744925 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" podUID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.50:8000/health\": dial tcp 10.132.0.50:8000: connect: connection refused"
Apr 22 19:05:19.745040 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:05:19.744999 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" podUID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.50:8000/health\": dial tcp 10.132.0.50:8000: connect: connection refused"
Apr 22 19:05:29.745675 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:05:29.745632 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" podUID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.50:8000/health\": dial tcp 10.132.0.50:8000: connect: connection refused"
Apr 22 19:05:39.755794 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:05:39.755757 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:05:39.763492 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:05:39.763467 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"
Apr 22 19:05:41.045479 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:05:41.045445 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"]
Apr 22 19:05:41.045998 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:05:41.045773 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" podUID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" containerName="main" containerID="cri-o://a3f3d8ada9d7ebb2a0637357502a08815165a2ac567c6fcd77dafa4bd7d4eb37" gracePeriod=30
Apr 22 19:06:02.022905 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.022826 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q"]
Apr 22 19:06:02.023395 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.023220 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="storage-initializer"
Apr 22 19:06:02.023395 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.023232 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="storage-initializer"
Apr 22 19:06:02.023395 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.023240 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="main"
Apr 22 19:06:02.023395 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.023246 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="main"
Apr 22 19:06:02.023395 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.023295 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="173b13fc-b954-46a7-bf54-f12ad8319d0b" containerName="main"
Apr 22 19:06:02.025518 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.025499 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q"
Apr 22 19:06:02.027687 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.027650 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 22 19:06:02.036152 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.036126 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q"]
Apr 22 19:06:02.058157 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.058130 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-home\") pod \"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q"
Apr 22 19:06:02.058275 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.058180 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhj7s\" (UniqueName: \"kubernetes.io/projected/30300a21-81ce-4770-8a05-1814637475e9-kube-api-access-bhj7s\") pod \"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q"
Apr 22 19:06:02.058275 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.058206 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-dshm\") pod \"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q"
Apr 22 19:06:02.058275 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.058225 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30300a21-81ce-4770-8a05-1814637475e9-tls-certs\") pod \"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q"
Apr 22 19:06:02.058385 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.058271 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-model-cache\") pod \"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q"
Apr 22 19:06:02.058385 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.058334 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-kserve-provision-location\") pod
\"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" Apr 22 19:06:02.159488 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.159453 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-kserve-provision-location\") pod \"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" Apr 22 19:06:02.159666 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.159501 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-home\") pod \"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" Apr 22 19:06:02.159666 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.159562 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhj7s\" (UniqueName: \"kubernetes.io/projected/30300a21-81ce-4770-8a05-1814637475e9-kube-api-access-bhj7s\") pod \"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" Apr 22 19:06:02.159666 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.159586 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-dshm\") pod \"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" Apr 22 19:06:02.159830 
ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.159724 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30300a21-81ce-4770-8a05-1814637475e9-tls-certs\") pod \"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" Apr 22 19:06:02.159830 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.159779 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-model-cache\") pod \"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" Apr 22 19:06:02.159962 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.159900 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-kserve-provision-location\") pod \"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" Apr 22 19:06:02.160038 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.160026 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-home\") pod \"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" Apr 22 19:06:02.160137 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.160117 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-model-cache\") pod \"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" Apr 22 19:06:02.161871 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.161847 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-dshm\") pod \"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" Apr 22 19:06:02.162142 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.162124 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30300a21-81ce-4770-8a05-1814637475e9-tls-certs\") pod \"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" Apr 22 19:06:02.166343 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.166326 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhj7s\" (UniqueName: \"kubernetes.io/projected/30300a21-81ce-4770-8a05-1814637475e9-kube-api-access-bhj7s\") pod \"router-with-refs-test-kserve-589c656575-4xc5q\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" Apr 22 19:06:02.336557 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.336473 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" Apr 22 19:06:02.460932 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:02.460903 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q"] Apr 22 19:06:02.462281 ip-10-0-130-234 kubenswrapper[2566]: W0422 19:06:02.462254 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30300a21_81ce_4770_8a05_1814637475e9.slice/crio-44a8f13e9c57de60e3e4baaa1310b5dcac4590fc0a3f907cbd67729727039f75 WatchSource:0}: Error finding container 44a8f13e9c57de60e3e4baaa1310b5dcac4590fc0a3f907cbd67729727039f75: Status 404 returned error can't find the container with id 44a8f13e9c57de60e3e4baaa1310b5dcac4590fc0a3f907cbd67729727039f75 Apr 22 19:06:03.123427 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:03.123392 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" event={"ID":"30300a21-81ce-4770-8a05-1814637475e9","Type":"ContainerStarted","Data":"702355b1a4225333d22324058d5b3af5e6d9c566936f0d28dc88982996d96e20"} Apr 22 19:06:03.123427 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:03.123428 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" event={"ID":"30300a21-81ce-4770-8a05-1814637475e9","Type":"ContainerStarted","Data":"44a8f13e9c57de60e3e4baaa1310b5dcac4590fc0a3f907cbd67729727039f75"} Apr 22 19:06:07.141575 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:07.141548 2566 generic.go:358] "Generic (PLEG): container finished" podID="30300a21-81ce-4770-8a05-1814637475e9" containerID="702355b1a4225333d22324058d5b3af5e6d9c566936f0d28dc88982996d96e20" exitCode=0 Apr 22 19:06:07.141874 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:07.141616 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" event={"ID":"30300a21-81ce-4770-8a05-1814637475e9","Type":"ContainerDied","Data":"702355b1a4225333d22324058d5b3af5e6d9c566936f0d28dc88982996d96e20"} Apr 22 19:06:08.149005 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:08.148967 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" event={"ID":"30300a21-81ce-4770-8a05-1814637475e9","Type":"ContainerStarted","Data":"7cc43a7013ce10e9fd23c3cc84076bd8f1e8c7101ae786a01139c19745a89a86"} Apr 22 19:06:08.170210 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:08.170154 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" podStartSLOduration=6.170135747 podStartE2EDuration="6.170135747s" podCreationTimestamp="2026-04-22 19:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:06:08.167368825 +0000 UTC m=+1169.946202313" watchObservedRunningTime="2026-04-22 19:06:08.170135747 +0000 UTC m=+1169.948969234" Apr 22 19:06:11.161979 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.161953 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-59f568845f-rpmz9_fc4f9d14-6397-45c2-a3c1-f515114ba3c2/main/0.log" Apr 22 19:06:11.162368 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.162302 2566 generic.go:358] "Generic (PLEG): container finished" podID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" containerID="a3f3d8ada9d7ebb2a0637357502a08815165a2ac567c6fcd77dafa4bd7d4eb37" exitCode=137 Apr 22 19:06:11.162368 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.162343 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" 
event={"ID":"fc4f9d14-6397-45c2-a3c1-f515114ba3c2","Type":"ContainerDied","Data":"a3f3d8ada9d7ebb2a0637357502a08815165a2ac567c6fcd77dafa4bd7d4eb37"} Apr 22 19:06:11.286885 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.286859 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-59f568845f-rpmz9_fc4f9d14-6397-45c2-a3c1-f515114ba3c2/main/0.log" Apr 22 19:06:11.287250 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.287233 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" Apr 22 19:06:11.340787 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.340715 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-kserve-provision-location\") pod \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " Apr 22 19:06:11.340787 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.340748 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-model-cache\") pod \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " Apr 22 19:06:11.340787 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.340787 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-dshm\") pod \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " Apr 22 19:06:11.341069 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.340816 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-home\") pod \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " Apr 22 19:06:11.341069 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.340885 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg4dv\" (UniqueName: \"kubernetes.io/projected/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-kube-api-access-fg4dv\") pod \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " Apr 22 19:06:11.341069 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.340909 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-tls-certs\") pod \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\" (UID: \"fc4f9d14-6397-45c2-a3c1-f515114ba3c2\") " Apr 22 19:06:11.341069 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.341000 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-model-cache" (OuterVolumeSpecName: "model-cache") pod "fc4f9d14-6397-45c2-a3c1-f515114ba3c2" (UID: "fc4f9d14-6397-45c2-a3c1-f515114ba3c2"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:11.341314 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.341281 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-home" (OuterVolumeSpecName: "home") pod "fc4f9d14-6397-45c2-a3c1-f515114ba3c2" (UID: "fc4f9d14-6397-45c2-a3c1-f515114ba3c2"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:11.341373 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.341309 2566 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-model-cache\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:06:11.343052 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.343010 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "fc4f9d14-6397-45c2-a3c1-f515114ba3c2" (UID: "fc4f9d14-6397-45c2-a3c1-f515114ba3c2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:06:11.343384 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.343363 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-kube-api-access-fg4dv" (OuterVolumeSpecName: "kube-api-access-fg4dv") pod "fc4f9d14-6397-45c2-a3c1-f515114ba3c2" (UID: "fc4f9d14-6397-45c2-a3c1-f515114ba3c2"). InnerVolumeSpecName "kube-api-access-fg4dv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:06:11.343384 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.343367 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-dshm" (OuterVolumeSpecName: "dshm") pod "fc4f9d14-6397-45c2-a3c1-f515114ba3c2" (UID: "fc4f9d14-6397-45c2-a3c1-f515114ba3c2"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:11.402788 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.402744 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fc4f9d14-6397-45c2-a3c1-f515114ba3c2" (UID: "fc4f9d14-6397-45c2-a3c1-f515114ba3c2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:11.442253 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.442223 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-kserve-provision-location\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:06:11.442253 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.442253 2566 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-dshm\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:06:11.442390 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.442263 2566 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-home\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:06:11.442390 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.442272 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fg4dv\" (UniqueName: \"kubernetes.io/projected/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-kube-api-access-fg4dv\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:06:11.442390 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:11.442284 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fc4f9d14-6397-45c2-a3c1-f515114ba3c2-tls-certs\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:06:12.168148 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:12.168120 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-59f568845f-rpmz9_fc4f9d14-6397-45c2-a3c1-f515114ba3c2/main/0.log" Apr 22 19:06:12.168603 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:12.168538 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" event={"ID":"fc4f9d14-6397-45c2-a3c1-f515114ba3c2","Type":"ContainerDied","Data":"59b0cecd92f0aa9ee8d4c12757056522dda173ddd39ce166457ff8b4958b2f1f"} Apr 22 19:06:12.168603 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:12.168553 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9" Apr 22 19:06:12.168603 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:12.168588 2566 scope.go:117] "RemoveContainer" containerID="a3f3d8ada9d7ebb2a0637357502a08815165a2ac567c6fcd77dafa4bd7d4eb37" Apr 22 19:06:12.192932 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:12.192901 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"] Apr 22 19:06:12.196514 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:12.196491 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-59f568845f-rpmz9"] Apr 22 19:06:12.201346 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:12.201321 2566 scope.go:117] "RemoveContainer" containerID="fea553e60d90453285ab9bb8d92cecb316bf43e4735a1ad034189148f4e8843d" Apr 22 19:06:12.337501 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:12.337475 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" Apr 22 19:06:12.337632 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:12.337524 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" Apr 22 19:06:12.338850 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:12.338824 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" podUID="30300a21-81ce-4770-8a05-1814637475e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8000/health\": dial tcp 10.132.0.51:8000: connect: connection refused" Apr 22 19:06:12.830283 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:12.830251 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" path="/var/lib/kubelet/pods/fc4f9d14-6397-45c2-a3c1-f515114ba3c2/volumes" Apr 22 19:06:22.337269 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:22.337213 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" podUID="30300a21-81ce-4770-8a05-1814637475e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8000/health\": dial tcp 10.132.0.51:8000: connect: connection refused" Apr 22 19:06:32.337513 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:32.337466 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" podUID="30300a21-81ce-4770-8a05-1814637475e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8000/health\": dial tcp 10.132.0.51:8000: connect: connection refused" Apr 22 19:06:38.799288 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:38.799263 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4jzwx_4871a747-c041-430f-9c1a-3ab2ce6d0156/console-operator/1.log" Apr 22 19:06:38.809250 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:38.809229 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4jzwx_4871a747-c041-430f-9c1a-3ab2ce6d0156/console-operator/1.log" Apr 22 19:06:42.337606 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:42.337569 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" podUID="30300a21-81ce-4770-8a05-1814637475e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8000/health\": dial tcp 10.132.0.51:8000: connect: connection refused" Apr 22 19:06:52.337452 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:06:52.337398 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" podUID="30300a21-81ce-4770-8a05-1814637475e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8000/health\": dial tcp 10.132.0.51:8000: connect: connection refused" Apr 22 19:07:02.337713 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:02.337668 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" podUID="30300a21-81ce-4770-8a05-1814637475e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8000/health\": dial tcp 10.132.0.51:8000: connect: connection refused" Apr 22 19:07:12.337346 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:12.337301 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" podUID="30300a21-81ce-4770-8a05-1814637475e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8000/health\": dial tcp 10.132.0.51:8000: connect: 
connection refused" Apr 22 19:07:22.337959 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:22.337882 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" podUID="30300a21-81ce-4770-8a05-1814637475e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8000/health\": dial tcp 10.132.0.51:8000: connect: connection refused" Apr 22 19:07:31.029541 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.029449 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"] Apr 22 19:07:31.030160 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.029903 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" containerName="storage-initializer" Apr 22 19:07:31.030160 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.029918 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" containerName="storage-initializer" Apr 22 19:07:31.030160 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.029961 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" containerName="main" Apr 22 19:07:31.030160 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.029969 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" containerName="main" Apr 22 19:07:31.030160 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.030035 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc4f9d14-6397-45c2-a3c1-f515114ba3c2" containerName="main" Apr 22 19:07:31.032647 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.032627 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.035293 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.035273 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 22 19:07:31.044607 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.044585 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"]
Apr 22 19:07:31.133419 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.133371 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.133592 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.133441 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.133592 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.133475 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-home\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.133592 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.133503 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.133592 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.133554 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/64d8c721-309b-4698-af35-7f48fa97d5af-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.133592 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.133586 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr6fg\" (UniqueName: \"kubernetes.io/projected/64d8c721-309b-4698-af35-7f48fa97d5af-kube-api-access-zr6fg\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.234976 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.234902 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.235121 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.235017 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.235121 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.235044 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-home\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.235121 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.235076 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.235121 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.235114 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/64d8c721-309b-4698-af35-7f48fa97d5af-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.235286 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.235218 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zr6fg\" (UniqueName: \"kubernetes.io/projected/64d8c721-309b-4698-af35-7f48fa97d5af-kube-api-access-zr6fg\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.235422 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.235395 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.235504 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.235406 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-home\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.235504 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.235478 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.237374 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.237351 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.237609 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.237591 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/64d8c721-309b-4698-af35-7f48fa97d5af-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.242824 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.242799 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr6fg\" (UniqueName: \"kubernetes.io/projected/64d8c721-309b-4698-af35-7f48fa97d5af-kube-api-access-zr6fg\") pod \"router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.345112 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.345028 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:31.476776 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.476746 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"]
Apr 22 19:07:31.480000 ip-10-0-130-234 kubenswrapper[2566]: W0422 19:07:31.479973 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64d8c721_309b_4698_af35_7f48fa97d5af.slice/crio-0ad08025c0617eba0ebc91d2d57c0af301b61e8969e007072a23d7ae96a68080 WatchSource:0}: Error finding container 0ad08025c0617eba0ebc91d2d57c0af301b61e8969e007072a23d7ae96a68080: Status 404 returned error can't find the container with id 0ad08025c0617eba0ebc91d2d57c0af301b61e8969e007072a23d7ae96a68080
Apr 22 19:07:31.489392 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:31.489353 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" event={"ID":"64d8c721-309b-4698-af35-7f48fa97d5af","Type":"ContainerStarted","Data":"0ad08025c0617eba0ebc91d2d57c0af301b61e8969e007072a23d7ae96a68080"}
Apr 22 19:07:32.337194 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:32.337149 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" podUID="30300a21-81ce-4770-8a05-1814637475e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8000/health\": dial tcp 10.132.0.51:8000: connect: connection refused"
Apr 22 19:07:32.495184 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:32.495144 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" event={"ID":"64d8c721-309b-4698-af35-7f48fa97d5af","Type":"ContainerStarted","Data":"22cc78b3294c4458a8a08ab045e68ed903720930ccfc74c231afeb1198199cae"}
Apr 22 19:07:37.517195 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:37.517157 2566 generic.go:358] "Generic (PLEG): container finished" podID="64d8c721-309b-4698-af35-7f48fa97d5af" containerID="22cc78b3294c4458a8a08ab045e68ed903720930ccfc74c231afeb1198199cae" exitCode=0
Apr 22 19:07:37.517595 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:37.517230 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" event={"ID":"64d8c721-309b-4698-af35-7f48fa97d5af","Type":"ContainerDied","Data":"22cc78b3294c4458a8a08ab045e68ed903720930ccfc74c231afeb1198199cae"}
Apr 22 19:07:38.522635 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:38.522594 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" event={"ID":"64d8c721-309b-4698-af35-7f48fa97d5af","Type":"ContainerStarted","Data":"37020e862beecafed3e45716455de875832e4316afd361ab1d99605922d8c91c"}
Apr 22 19:07:38.545698 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:38.545639 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" podStartSLOduration=7.545618472 podStartE2EDuration="7.545618472s" podCreationTimestamp="2026-04-22 19:07:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:07:38.540807245 +0000 UTC m=+1260.319640732" watchObservedRunningTime="2026-04-22 19:07:38.545618472 +0000 UTC m=+1260.324451959"
Apr 22 19:07:41.345342 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:41.345305 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:41.345791 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:41.345352 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:07:41.346986 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:41.346959 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" podUID="64d8c721-309b-4698-af35-7f48fa97d5af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused"
Apr 22 19:07:42.347042 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:42.347012 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q"
Apr 22 19:07:42.355264 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:42.355227 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q"
Apr 22 19:07:48.588043 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:48.588004 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q"]
Apr 22 19:07:48.588694 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:48.588668 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" podUID="30300a21-81ce-4770-8a05-1814637475e9" containerName="main" containerID="cri-o://7cc43a7013ce10e9fd23c3cc84076bd8f1e8c7101ae786a01139c19745a89a86" gracePeriod=30
Apr 22 19:07:51.346253 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:07:51.346208 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" podUID="64d8c721-309b-4698-af35-7f48fa97d5af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused"
Apr 22 19:08:01.345750 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:01.345700 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" podUID="64d8c721-309b-4698-af35-7f48fa97d5af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused"
Apr 22 19:08:04.141856 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.141816 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"]
Apr 22 19:08:04.146056 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.146029 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.148441 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.148411 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\""
Apr 22 19:08:04.153621 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.153600 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"]
Apr 22 19:08:04.243324 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.243283 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.243492 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.243334 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.243492 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.243361 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.243492 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.243460 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcl25\" (UniqueName: \"kubernetes.io/projected/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-kube-api-access-dcl25\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.243599 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.243551 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.243634 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.243605 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.344354 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.344316 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcl25\" (UniqueName: \"kubernetes.io/projected/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-kube-api-access-dcl25\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.344525 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.344426 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.344525 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.344455 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.344525 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.344492 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.344525 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.344519 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.344733 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.344651 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.344965 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.344927 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.345058 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.345030 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.345112 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.345034 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.346849 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.346825 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.347295 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.347274 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.353004 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.352977 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcl25\" (UniqueName: \"kubernetes.io/projected/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-kube-api-access-dcl25\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.459608 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.459524 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:04.596742 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.596709 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"]
Apr 22 19:08:04.597923 ip-10-0-130-234 kubenswrapper[2566]: W0422 19:08:04.597894 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9da9adaf_bb15_4393_95fc_ff8bd0dbee6e.slice/crio-0caa17b39d21e78c05c117690698e307ab1e4f9ab56e52b7c86cd95c5a1d5a80 WatchSource:0}: Error finding container 0caa17b39d21e78c05c117690698e307ab1e4f9ab56e52b7c86cd95c5a1d5a80: Status 404 returned error can't find the container with id 0caa17b39d21e78c05c117690698e307ab1e4f9ab56e52b7c86cd95c5a1d5a80
Apr 22 19:08:04.628990 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:04.628965 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" event={"ID":"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e","Type":"ContainerStarted","Data":"0caa17b39d21e78c05c117690698e307ab1e4f9ab56e52b7c86cd95c5a1d5a80"}
Apr 22 19:08:05.634093 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:05.634051 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" event={"ID":"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e","Type":"ContainerStarted","Data":"b3271b5df8ab3232f992ebf45ae19a2a5dea11d8b417fcd067de9db7cecec783"}
Apr 22 19:08:09.653179 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:09.653145 2566 generic.go:358] "Generic (PLEG): container finished" podID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" containerID="b3271b5df8ab3232f992ebf45ae19a2a5dea11d8b417fcd067de9db7cecec783" exitCode=0
Apr 22 19:08:09.653561 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:09.653224 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" event={"ID":"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e","Type":"ContainerDied","Data":"b3271b5df8ab3232f992ebf45ae19a2a5dea11d8b417fcd067de9db7cecec783"}
Apr 22 19:08:10.659153 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:10.659114 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" event={"ID":"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e","Type":"ContainerStarted","Data":"d811e4999ff1cb267164589f5c137425bfc50009015c46542299bcba6ee236ac"}
Apr 22 19:08:10.678737 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:10.678683 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" podStartSLOduration=6.6786696150000004 podStartE2EDuration="6.678669615s" podCreationTimestamp="2026-04-22 19:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:08:10.675345108 +0000 UTC m=+1292.454178593" watchObservedRunningTime="2026-04-22 19:08:10.678669615 +0000 UTC m=+1292.457503131"
Apr 22 19:08:11.345776 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:11.345734 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" podUID="64d8c721-309b-4698-af35-7f48fa97d5af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused"
Apr 22 19:08:14.460598 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:14.460562 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:14.461005 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:14.460615 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:08:14.462402 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:14.462371 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" podUID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.53:8000/health\": dial tcp 10.132.0.53:8000: connect: connection refused"
Apr 22 19:08:18.883518 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:18.883490 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-589c656575-4xc5q_30300a21-81ce-4770-8a05-1814637475e9/main/0.log"
Apr 22 19:08:18.883901 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:18.883852 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q"
Apr 22 19:08:18.979502 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:18.979474 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30300a21-81ce-4770-8a05-1814637475e9-tls-certs\") pod \"30300a21-81ce-4770-8a05-1814637475e9\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") "
Apr 22 19:08:18.979685 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:18.979522 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-kserve-provision-location\") pod \"30300a21-81ce-4770-8a05-1814637475e9\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") "
Apr 22 19:08:18.979685 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:18.979561 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-model-cache\") pod \"30300a21-81ce-4770-8a05-1814637475e9\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") "
Apr 22 19:08:18.979685 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:18.979619 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-home\") pod \"30300a21-81ce-4770-8a05-1814637475e9\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") "
Apr 22 19:08:18.979685 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:18.979679 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhj7s\" (UniqueName: \"kubernetes.io/projected/30300a21-81ce-4770-8a05-1814637475e9-kube-api-access-bhj7s\") pod \"30300a21-81ce-4770-8a05-1814637475e9\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") "
Apr 22 19:08:18.979923 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:18.979722 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-dshm\") pod \"30300a21-81ce-4770-8a05-1814637475e9\" (UID: \"30300a21-81ce-4770-8a05-1814637475e9\") "
Apr 22 19:08:18.979923 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:18.979863 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-model-cache" (OuterVolumeSpecName: "model-cache") pod "30300a21-81ce-4770-8a05-1814637475e9" (UID: "30300a21-81ce-4770-8a05-1814637475e9"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:08:18.980083 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:18.980043 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-home" (OuterVolumeSpecName: "home") pod "30300a21-81ce-4770-8a05-1814637475e9" (UID: "30300a21-81ce-4770-8a05-1814637475e9"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:08:18.980083 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:18.980065 2566 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-model-cache\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:08:18.982182 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:18.982156 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-dshm" (OuterVolumeSpecName: "dshm") pod "30300a21-81ce-4770-8a05-1814637475e9" (UID: "30300a21-81ce-4770-8a05-1814637475e9"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:08:18.982657 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:18.982632 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30300a21-81ce-4770-8a05-1814637475e9-kube-api-access-bhj7s" (OuterVolumeSpecName: "kube-api-access-bhj7s") pod "30300a21-81ce-4770-8a05-1814637475e9" (UID: "30300a21-81ce-4770-8a05-1814637475e9"). InnerVolumeSpecName "kube-api-access-bhj7s". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:08:18.982749 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:18.982652 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30300a21-81ce-4770-8a05-1814637475e9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "30300a21-81ce-4770-8a05-1814637475e9" (UID: "30300a21-81ce-4770-8a05-1814637475e9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:08:19.038067 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.038025 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "30300a21-81ce-4770-8a05-1814637475e9" (UID: "30300a21-81ce-4770-8a05-1814637475e9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:08:19.080665 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.080631 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bhj7s\" (UniqueName: \"kubernetes.io/projected/30300a21-81ce-4770-8a05-1814637475e9-kube-api-access-bhj7s\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:08:19.080665 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.080661 2566 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-dshm\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:08:19.080665 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.080670 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30300a21-81ce-4770-8a05-1814637475e9-tls-certs\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:08:19.080899 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.080682 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-kserve-provision-location\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:08:19.080899 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.080690 2566 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/30300a21-81ce-4770-8a05-1814637475e9-home\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:08:19.706979 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.706924 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-589c656575-4xc5q_30300a21-81ce-4770-8a05-1814637475e9/main/0.log"
Apr 22 19:08:19.707381 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.707351 2566 generic.go:358] "Generic (PLEG): container finished" podID="30300a21-81ce-4770-8a05-1814637475e9" containerID="7cc43a7013ce10e9fd23c3cc84076bd8f1e8c7101ae786a01139c19745a89a86" exitCode=137
Apr 22 19:08:19.707518 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.707422 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" event={"ID":"30300a21-81ce-4770-8a05-1814637475e9","Type":"ContainerDied","Data":"7cc43a7013ce10e9fd23c3cc84076bd8f1e8c7101ae786a01139c19745a89a86"}
Apr 22 19:08:19.707518 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.707472 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q" event={"ID":"30300a21-81ce-4770-8a05-1814637475e9","Type":"ContainerDied","Data":"44a8f13e9c57de60e3e4baaa1310b5dcac4590fc0a3f907cbd67729727039f75"}
Apr 22 19:08:19.707518 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.707472 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q"
Apr 22 19:08:19.707518 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.707498 2566 scope.go:117] "RemoveContainer" containerID="7cc43a7013ce10e9fd23c3cc84076bd8f1e8c7101ae786a01139c19745a89a86"
Apr 22 19:08:19.729217 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.729178 2566 scope.go:117] "RemoveContainer" containerID="702355b1a4225333d22324058d5b3af5e6d9c566936f0d28dc88982996d96e20"
Apr 22 19:08:19.732656 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.732633 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q"]
Apr 22 19:08:19.736429 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.736406 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-589c656575-4xc5q"]
Apr 22 19:08:19.792230 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.792207 2566 scope.go:117] "RemoveContainer" containerID="7cc43a7013ce10e9fd23c3cc84076bd8f1e8c7101ae786a01139c19745a89a86"
Apr 22 19:08:19.792623 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:08:19.792596 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc43a7013ce10e9fd23c3cc84076bd8f1e8c7101ae786a01139c19745a89a86\": container with ID starting with 7cc43a7013ce10e9fd23c3cc84076bd8f1e8c7101ae786a01139c19745a89a86 not found: ID does not exist" containerID="7cc43a7013ce10e9fd23c3cc84076bd8f1e8c7101ae786a01139c19745a89a86"
Apr 22 19:08:19.792694 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.792638 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc43a7013ce10e9fd23c3cc84076bd8f1e8c7101ae786a01139c19745a89a86"} err="failed to get container status \"7cc43a7013ce10e9fd23c3cc84076bd8f1e8c7101ae786a01139c19745a89a86\": rpc error: code = NotFound desc = could not find container \"7cc43a7013ce10e9fd23c3cc84076bd8f1e8c7101ae786a01139c19745a89a86\": container with ID starting with 7cc43a7013ce10e9fd23c3cc84076bd8f1e8c7101ae786a01139c19745a89a86 not found: ID does not exist"
Apr 22 19:08:19.792694 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.792666 2566 scope.go:117] "RemoveContainer" containerID="702355b1a4225333d22324058d5b3af5e6d9c566936f0d28dc88982996d96e20"
Apr 22 19:08:19.792978 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:08:19.792953 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"702355b1a4225333d22324058d5b3af5e6d9c566936f0d28dc88982996d96e20\": container with ID starting with 702355b1a4225333d22324058d5b3af5e6d9c566936f0d28dc88982996d96e20 not found: ID does not exist" containerID="702355b1a4225333d22324058d5b3af5e6d9c566936f0d28dc88982996d96e20"
Apr 22 19:08:19.793080 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:19.792984 2566
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"702355b1a4225333d22324058d5b3af5e6d9c566936f0d28dc88982996d96e20"} err="failed to get container status \"702355b1a4225333d22324058d5b3af5e6d9c566936f0d28dc88982996d96e20\": rpc error: code = NotFound desc = could not find container \"702355b1a4225333d22324058d5b3af5e6d9c566936f0d28dc88982996d96e20\": container with ID starting with 702355b1a4225333d22324058d5b3af5e6d9c566936f0d28dc88982996d96e20 not found: ID does not exist" Apr 22 19:08:20.830684 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:20.830648 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30300a21-81ce-4770-8a05-1814637475e9" path="/var/lib/kubelet/pods/30300a21-81ce-4770-8a05-1814637475e9/volumes" Apr 22 19:08:21.346264 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:21.346224 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" podUID="64d8c721-309b-4698-af35-7f48fa97d5af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused" Apr 22 19:08:24.460002 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:24.459956 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" podUID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.53:8000/health\": dial tcp 10.132.0.53:8000: connect: connection refused" Apr 22 19:08:31.346391 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:31.346345 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" podUID="64d8c721-309b-4698-af35-7f48fa97d5af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 
10.132.0.52:8000: connect: connection refused" Apr 22 19:08:34.460596 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:34.460547 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" podUID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.53:8000/health\": dial tcp 10.132.0.53:8000: connect: connection refused" Apr 22 19:08:41.346228 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:41.346176 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" podUID="64d8c721-309b-4698-af35-7f48fa97d5af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused" Apr 22 19:08:44.460193 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:44.460146 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" podUID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.53:8000/health\": dial tcp 10.132.0.53:8000: connect: connection refused" Apr 22 19:08:51.345848 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:51.345801 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" podUID="64d8c721-309b-4698-af35-7f48fa97d5af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused" Apr 22 19:08:54.460915 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:08:54.460853 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" podUID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" 
containerName="main" probeResult="failure" output="Get \"https://10.132.0.53:8000/health\": dial tcp 10.132.0.53:8000: connect: connection refused" Apr 22 19:09:01.345958 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:09:01.345833 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" podUID="64d8c721-309b-4698-af35-7f48fa97d5af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused" Apr 22 19:09:04.460220 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:09:04.460180 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" podUID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.53:8000/health\": dial tcp 10.132.0.53:8000: connect: connection refused" Apr 22 19:09:11.345824 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:09:11.345780 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" podUID="64d8c721-309b-4698-af35-7f48fa97d5af" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused" Apr 22 19:09:14.460328 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:09:14.460273 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" podUID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.53:8000/health\": dial tcp 10.132.0.53:8000: connect: connection refused" Apr 22 19:09:21.355127 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:09:21.355087 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" Apr 22 19:09:21.363137 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:09:21.363112 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" Apr 22 19:09:24.460615 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:09:24.460568 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" podUID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.53:8000/health\": dial tcp 10.132.0.53:8000: connect: connection refused" Apr 22 19:09:34.460267 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:09:34.460224 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" podUID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.53:8000/health\": dial tcp 10.132.0.53:8000: connect: connection refused" Apr 22 19:09:44.460395 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:09:44.460350 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" podUID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.53:8000/health\": dial tcp 10.132.0.53:8000: connect: connection refused" Apr 22 19:09:49.309822 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:09:49.309783 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"] Apr 22 19:09:49.310314 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:09:49.310198 2566 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" podUID="64d8c721-309b-4698-af35-7f48fa97d5af" containerName="main" containerID="cri-o://37020e862beecafed3e45716455de875832e4316afd361ab1d99605922d8c91c" gracePeriod=30 Apr 22 19:09:54.470005 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:09:54.469970 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" Apr 22 19:09:54.477777 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:09:54.477755 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" Apr 22 19:10:11.587613 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.587572 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9"] Apr 22 19:10:11.588080 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.587980 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30300a21-81ce-4770-8a05-1814637475e9" containerName="storage-initializer" Apr 22 19:10:11.588080 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.587992 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="30300a21-81ce-4770-8a05-1814637475e9" containerName="storage-initializer" Apr 22 19:10:11.588080 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.588012 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30300a21-81ce-4770-8a05-1814637475e9" containerName="main" Apr 22 19:10:11.588080 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.588018 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="30300a21-81ce-4770-8a05-1814637475e9" containerName="main" Apr 22 19:10:11.588080 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.588078 2566 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="30300a21-81ce-4770-8a05-1814637475e9" containerName="main" Apr 22 19:10:11.590585 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.590561 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.592806 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.592783 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 22 19:10:11.600006 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.599983 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9"] Apr 22 19:10:11.695868 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.695834 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.695868 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.695868 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhtfz\" (UniqueName: \"kubernetes.io/projected/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-kube-api-access-bhtfz\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.696101 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.695976 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.696101 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.696008 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.696101 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.696029 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.696101 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.696057 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.796667 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.796631 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.796667 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.796668 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhtfz\" (UniqueName: \"kubernetes.io/projected/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-kube-api-access-bhtfz\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.796906 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.796728 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.796906 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.796749 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.796906 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.796766 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-tls-certs\") pod 
\"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.796906 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.796790 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.797142 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.797075 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.797185 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.797156 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.797247 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.797212 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-model-cache\") pod 
\"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.798889 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.798859 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.799247 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.799226 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.803688 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.803665 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhtfz\" (UniqueName: \"kubernetes.io/projected/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-kube-api-access-bhtfz\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:11.902207 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:11.902177 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:10:12.027462 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:12.027431 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9"] Apr 22 19:10:12.029085 ip-10-0-130-234 kubenswrapper[2566]: W0422 19:10:12.029057 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf92a5388_51c0_4ef5_bd5a_6bd09cb1aed5.slice/crio-8dd72860a1f5a9346b9aeb001bd6c503175ca4197caf89705595adb88d44025b WatchSource:0}: Error finding container 8dd72860a1f5a9346b9aeb001bd6c503175ca4197caf89705595adb88d44025b: Status 404 returned error can't find the container with id 8dd72860a1f5a9346b9aeb001bd6c503175ca4197caf89705595adb88d44025b Apr 22 19:10:12.031262 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:12.031243 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:10:12.151705 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:12.151672 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" event={"ID":"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5","Type":"ContainerStarted","Data":"94096d14d65477fa1ca452f60bcfd12e579eb6aea268d89a2846538456a88a3b"} Apr 22 19:10:12.151705 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:12.151711 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" event={"ID":"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5","Type":"ContainerStarted","Data":"8dd72860a1f5a9346b9aeb001bd6c503175ca4197caf89705595adb88d44025b"} Apr 22 19:10:17.175507 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:17.175474 2566 generic.go:358] "Generic (PLEG): container finished" 
podID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerID="94096d14d65477fa1ca452f60bcfd12e579eb6aea268d89a2846538456a88a3b" exitCode=0 Apr 22 19:10:17.175955 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:17.175524 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" event={"ID":"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5","Type":"ContainerDied","Data":"94096d14d65477fa1ca452f60bcfd12e579eb6aea268d89a2846538456a88a3b"} Apr 22 19:10:18.180642 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:18.180608 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" event={"ID":"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5","Type":"ContainerStarted","Data":"024b4a266431cf49106a452f7898e7fccee4adaab8c79198a97b8a691181ea2e"} Apr 22 19:10:18.200000 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:18.199952 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" podStartSLOduration=7.199921277 podStartE2EDuration="7.199921277s" podCreationTimestamp="2026-04-22 19:10:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:10:18.196781445 +0000 UTC m=+1419.975614931" watchObservedRunningTime="2026-04-22 19:10:18.199921277 +0000 UTC m=+1419.978754762" Apr 22 19:10:19.563494 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.563431 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" Apr 22 19:10:19.673406 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.673370 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-model-cache\") pod \"64d8c721-309b-4698-af35-7f48fa97d5af\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " Apr 22 19:10:19.673576 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.673430 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-dshm\") pod \"64d8c721-309b-4698-af35-7f48fa97d5af\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " Apr 22 19:10:19.673576 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.673450 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/64d8c721-309b-4698-af35-7f48fa97d5af-tls-certs\") pod \"64d8c721-309b-4698-af35-7f48fa97d5af\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " Apr 22 19:10:19.673576 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.673479 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr6fg\" (UniqueName: \"kubernetes.io/projected/64d8c721-309b-4698-af35-7f48fa97d5af-kube-api-access-zr6fg\") pod \"64d8c721-309b-4698-af35-7f48fa97d5af\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " Apr 22 19:10:19.673576 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.673537 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-home\") pod \"64d8c721-309b-4698-af35-7f48fa97d5af\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") " Apr 22 19:10:19.673919 ip-10-0-130-234 kubenswrapper[2566]: I0422 
19:10:19.673599 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-kserve-provision-location\") pod \"64d8c721-309b-4698-af35-7f48fa97d5af\" (UID: \"64d8c721-309b-4698-af35-7f48fa97d5af\") "
Apr 22 19:10:19.673919 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.673722 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-model-cache" (OuterVolumeSpecName: "model-cache") pod "64d8c721-309b-4698-af35-7f48fa97d5af" (UID: "64d8c721-309b-4698-af35-7f48fa97d5af"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:10:19.674070 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.673918 2566 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-model-cache\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:10:19.674070 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.674026 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-home" (OuterVolumeSpecName: "home") pod "64d8c721-309b-4698-af35-7f48fa97d5af" (UID: "64d8c721-309b-4698-af35-7f48fa97d5af"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:10:19.676160 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.676136 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64d8c721-309b-4698-af35-7f48fa97d5af-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "64d8c721-309b-4698-af35-7f48fa97d5af" (UID: "64d8c721-309b-4698-af35-7f48fa97d5af"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:10:19.676508 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.676485 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64d8c721-309b-4698-af35-7f48fa97d5af-kube-api-access-zr6fg" (OuterVolumeSpecName: "kube-api-access-zr6fg") pod "64d8c721-309b-4698-af35-7f48fa97d5af" (UID: "64d8c721-309b-4698-af35-7f48fa97d5af"). InnerVolumeSpecName "kube-api-access-zr6fg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:10:19.676581 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.676566 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-dshm" (OuterVolumeSpecName: "dshm") pod "64d8c721-309b-4698-af35-7f48fa97d5af" (UID: "64d8c721-309b-4698-af35-7f48fa97d5af"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:10:19.731107 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.731065 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "64d8c721-309b-4698-af35-7f48fa97d5af" (UID: "64d8c721-309b-4698-af35-7f48fa97d5af"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:10:19.774352 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.774319 2566 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-dshm\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:10:19.774352 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.774349 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/64d8c721-309b-4698-af35-7f48fa97d5af-tls-certs\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:10:19.774534 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.774361 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zr6fg\" (UniqueName: \"kubernetes.io/projected/64d8c721-309b-4698-af35-7f48fa97d5af-kube-api-access-zr6fg\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:10:19.774534 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.774370 2566 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-home\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:10:19.774534 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:19.774381 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/64d8c721-309b-4698-af35-7f48fa97d5af-kserve-provision-location\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:10:20.193400 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:20.193367 2566 generic.go:358] "Generic (PLEG): container finished" podID="64d8c721-309b-4698-af35-7f48fa97d5af" containerID="37020e862beecafed3e45716455de875832e4316afd361ab1d99605922d8c91c" exitCode=137
Apr 22 19:10:20.193573 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:20.193439 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"
Apr 22 19:10:20.193573 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:20.193465 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" event={"ID":"64d8c721-309b-4698-af35-7f48fa97d5af","Type":"ContainerDied","Data":"37020e862beecafed3e45716455de875832e4316afd361ab1d99605922d8c91c"}
Apr 22 19:10:20.193573 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:20.193514 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm" event={"ID":"64d8c721-309b-4698-af35-7f48fa97d5af","Type":"ContainerDied","Data":"0ad08025c0617eba0ebc91d2d57c0af301b61e8969e007072a23d7ae96a68080"}
Apr 22 19:10:20.193573 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:20.193538 2566 scope.go:117] "RemoveContainer" containerID="37020e862beecafed3e45716455de875832e4316afd361ab1d99605922d8c91c"
Apr 22 19:10:20.215763 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:20.215738 2566 scope.go:117] "RemoveContainer" containerID="22cc78b3294c4458a8a08ab045e68ed903720930ccfc74c231afeb1198199cae"
Apr 22 19:10:20.219916 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:20.219892 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"]
Apr 22 19:10:20.223810 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:20.223786 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-9d864484d-ht2gm"]
Apr 22 19:10:20.282225 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:20.282194 2566 scope.go:117] "RemoveContainer" containerID="37020e862beecafed3e45716455de875832e4316afd361ab1d99605922d8c91c"
Apr 22 19:10:20.282551 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:10:20.282527 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37020e862beecafed3e45716455de875832e4316afd361ab1d99605922d8c91c\": container with ID starting with 37020e862beecafed3e45716455de875832e4316afd361ab1d99605922d8c91c not found: ID does not exist" containerID="37020e862beecafed3e45716455de875832e4316afd361ab1d99605922d8c91c"
Apr 22 19:10:20.282648 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:20.282552 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37020e862beecafed3e45716455de875832e4316afd361ab1d99605922d8c91c"} err="failed to get container status \"37020e862beecafed3e45716455de875832e4316afd361ab1d99605922d8c91c\": rpc error: code = NotFound desc = could not find container \"37020e862beecafed3e45716455de875832e4316afd361ab1d99605922d8c91c\": container with ID starting with 37020e862beecafed3e45716455de875832e4316afd361ab1d99605922d8c91c not found: ID does not exist"
Apr 22 19:10:20.282648 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:20.282570 2566 scope.go:117] "RemoveContainer" containerID="22cc78b3294c4458a8a08ab045e68ed903720930ccfc74c231afeb1198199cae"
Apr 22 19:10:20.282846 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:10:20.282826 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22cc78b3294c4458a8a08ab045e68ed903720930ccfc74c231afeb1198199cae\": container with ID starting with 22cc78b3294c4458a8a08ab045e68ed903720930ccfc74c231afeb1198199cae not found: ID does not exist" containerID="22cc78b3294c4458a8a08ab045e68ed903720930ccfc74c231afeb1198199cae"
Apr 22 19:10:20.282886 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:20.282850 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22cc78b3294c4458a8a08ab045e68ed903720930ccfc74c231afeb1198199cae"} err="failed to get container status \"22cc78b3294c4458a8a08ab045e68ed903720930ccfc74c231afeb1198199cae\": rpc error: code = NotFound desc = could not find container \"22cc78b3294c4458a8a08ab045e68ed903720930ccfc74c231afeb1198199cae\": container with ID starting with 22cc78b3294c4458a8a08ab045e68ed903720930ccfc74c231afeb1198199cae not found: ID does not exist"
Apr 22 19:10:20.830213 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:20.830178 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64d8c721-309b-4698-af35-7f48fa97d5af" path="/var/lib/kubelet/pods/64d8c721-309b-4698-af35-7f48fa97d5af/volumes"
Apr 22 19:10:21.902964 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:21.902917 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9"
Apr 22 19:10:21.903334 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:21.902979 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9"
Apr 22 19:10:21.904553 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:21.904516 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused"
Apr 22 19:10:25.249220 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:25.249188 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"]
Apr 22 19:10:25.249632 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:25.249453 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" podUID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" containerName="main" containerID="cri-o://d811e4999ff1cb267164589f5c137425bfc50009015c46542299bcba6ee236ac" gracePeriod=30
Apr 22 19:10:29.428789 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.428708 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"]
Apr 22 19:10:29.429355 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.429333 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="64d8c721-309b-4698-af35-7f48fa97d5af" containerName="storage-initializer"
Apr 22 19:10:29.429414 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.429360 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="64d8c721-309b-4698-af35-7f48fa97d5af" containerName="storage-initializer"
Apr 22 19:10:29.429414 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.429392 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="64d8c721-309b-4698-af35-7f48fa97d5af" containerName="main"
Apr 22 19:10:29.429414 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.429400 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="64d8c721-309b-4698-af35-7f48fa97d5af" containerName="main"
Apr 22 19:10:29.429527 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.429515 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="64d8c721-309b-4698-af35-7f48fa97d5af" containerName="main"
Apr 22 19:10:29.473281 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.473245 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"]
Apr 22 19:10:29.473453 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.473375 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.475880 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.475853 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\""
Apr 22 19:10:29.559160 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.559127 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.559319 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.559168 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42da7359-3343-43cb-bc34-341f479dcb96-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.559319 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.559193 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.559319 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.559213 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf586\" (UniqueName: \"kubernetes.io/projected/42da7359-3343-43cb-bc34-341f479dcb96-kube-api-access-pf586\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.559319 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.559291 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.559480 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.559333 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.660212 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.660176 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.660408 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.660234 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42da7359-3343-43cb-bc34-341f479dcb96-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.660408 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.660267 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.660408 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.660296 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pf586\" (UniqueName: \"kubernetes.io/projected/42da7359-3343-43cb-bc34-341f479dcb96-kube-api-access-pf586\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.660408 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.660339 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.660408 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.660364 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.660727 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.660699 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.660795 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.660724 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.660795 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.660777 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.662578 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.662553 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.662713 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.662697 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42da7359-3343-43cb-bc34-341f479dcb96-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.669030 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.669002 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf586\" (UniqueName: \"kubernetes.io/projected/42da7359-3343-43cb-bc34-341f479dcb96-kube-api-access-pf586\") pod \"custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.783912 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.783821 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:29.916489 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:29.916460 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"]
Apr 22 19:10:29.918416 ip-10-0-130-234 kubenswrapper[2566]: W0422 19:10:29.918386 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42da7359_3343_43cb_bc34_341f479dcb96.slice/crio-e63d5e8622902418e59a4732febfda0ee4207c1880b14887bd9834ad8e4fd40c WatchSource:0}: Error finding container e63d5e8622902418e59a4732febfda0ee4207c1880b14887bd9834ad8e4fd40c: Status 404 returned error can't find the container with id e63d5e8622902418e59a4732febfda0ee4207c1880b14887bd9834ad8e4fd40c
Apr 22 19:10:30.237711 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:30.237671 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" event={"ID":"42da7359-3343-43cb-bc34-341f479dcb96","Type":"ContainerStarted","Data":"ef2fa9680d46b64206ed8352b534dcedbc36521c20fa229c8cf6ceca039ffc6a"}
Apr 22 19:10:30.237711 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:30.237712 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" event={"ID":"42da7359-3343-43cb-bc34-341f479dcb96","Type":"ContainerStarted","Data":"e63d5e8622902418e59a4732febfda0ee4207c1880b14887bd9834ad8e4fd40c"}
Apr 22 19:10:31.903501 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:31.903455 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused"
Apr 22 19:10:35.260239 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:35.260208 2566 generic.go:358] "Generic (PLEG): container finished" podID="42da7359-3343-43cb-bc34-341f479dcb96" containerID="ef2fa9680d46b64206ed8352b534dcedbc36521c20fa229c8cf6ceca039ffc6a" exitCode=0
Apr 22 19:10:35.260661 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:35.260277 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" event={"ID":"42da7359-3343-43cb-bc34-341f479dcb96","Type":"ContainerDied","Data":"ef2fa9680d46b64206ed8352b534dcedbc36521c20fa229c8cf6ceca039ffc6a"}
Apr 22 19:10:36.266586 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:36.266544 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" event={"ID":"42da7359-3343-43cb-bc34-341f479dcb96","Type":"ContainerStarted","Data":"ec65d3af44dc3fc307a3ad47fe5129c9e9d78a7580183315cde3e076a7253bba"}
Apr 22 19:10:36.287988 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:36.287899 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" podStartSLOduration=7.287884583 podStartE2EDuration="7.287884583s" podCreationTimestamp="2026-04-22 19:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:10:36.284527812 +0000 UTC m=+1438.063361297" watchObservedRunningTime="2026-04-22 19:10:36.287884583 +0000 UTC m=+1438.066718069"
Apr 22 19:10:39.784812 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:39.784762 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:39.784812 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:39.784823 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"
Apr 22 19:10:39.786329 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:39.786299 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" podUID="42da7359-3343-43cb-bc34-341f479dcb96" containerName="main" probeResult="failure" output="Get \"https://10.132.0.55:8000/health\": dial tcp 10.132.0.55:8000: connect: connection refused"
Apr 22 19:10:41.903461 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:41.903418 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused"
Apr 22 19:10:49.785307 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:49.785235 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" podUID="42da7359-3343-43cb-bc34-341f479dcb96" containerName="main" probeResult="failure" output="Get \"https://10.132.0.55:8000/health\": dial tcp 10.132.0.55:8000: connect: connection refused"
Apr 22 19:10:51.903560 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:51.903509 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused"
Apr 22 19:10:55.514886 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.514831 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:10:55.598072 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.598033 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-kserve-provision-location\") pod \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") "
Apr 22 19:10:55.598072 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.598081 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-home\") pod \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") "
Apr 22 19:10:55.598350 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.598101 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcl25\" (UniqueName: \"kubernetes.io/projected/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-kube-api-access-dcl25\") pod \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") "
Apr 22 19:10:55.598350 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.598128 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-dshm\") pod \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") "
Apr 22 19:10:55.598350 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.598157 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-model-cache\") pod \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") "
Apr 22 19:10:55.598350 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.598236 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-tls-certs\") pod \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\" (UID: \"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e\") "
Apr 22 19:10:55.598568 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.598465 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-home" (OuterVolumeSpecName: "home") pod "9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" (UID: "9da9adaf-bb15-4393-95fc-ff8bd0dbee6e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:10:55.598629 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.598562 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-model-cache" (OuterVolumeSpecName: "model-cache") pod "9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" (UID: "9da9adaf-bb15-4393-95fc-ff8bd0dbee6e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:10:55.600853 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.600823 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-kube-api-access-dcl25" (OuterVolumeSpecName: "kube-api-access-dcl25") pod "9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" (UID: "9da9adaf-bb15-4393-95fc-ff8bd0dbee6e"). InnerVolumeSpecName "kube-api-access-dcl25". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:10:55.600853 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.600833 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-dshm" (OuterVolumeSpecName: "dshm") pod "9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" (UID: "9da9adaf-bb15-4393-95fc-ff8bd0dbee6e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:10:55.601048 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.600918 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" (UID: "9da9adaf-bb15-4393-95fc-ff8bd0dbee6e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:10:55.652199 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.652151 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" (UID: "9da9adaf-bb15-4393-95fc-ff8bd0dbee6e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:10:55.699798 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.699761 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-tls-certs\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:10:55.699798 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.699795 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-kserve-provision-location\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:10:55.699798 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.699806 2566 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-home\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:10:55.700046 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.699815 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dcl25\" (UniqueName: \"kubernetes.io/projected/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-kube-api-access-dcl25\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:10:55.700046 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.699825 2566 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-dshm\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:10:55.700046 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:55.699833 2566 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e-model-cache\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\""
Apr 22 19:10:56.355922 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:56.355886 2566 generic.go:358] "Generic (PLEG): container finished" podID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" containerID="d811e4999ff1cb267164589f5c137425bfc50009015c46542299bcba6ee236ac" exitCode=137
Apr 22 19:10:56.356138 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:56.355989 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"
Apr 22 19:10:56.356138 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:56.356026 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" event={"ID":"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e","Type":"ContainerDied","Data":"d811e4999ff1cb267164589f5c137425bfc50009015c46542299bcba6ee236ac"}
Apr 22 19:10:56.356138 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:56.356064 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9" event={"ID":"9da9adaf-bb15-4393-95fc-ff8bd0dbee6e","Type":"ContainerDied","Data":"0caa17b39d21e78c05c117690698e307ab1e4f9ab56e52b7c86cd95c5a1d5a80"}
Apr 22 19:10:56.356138 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:56.356083 2566 scope.go:117] "RemoveContainer" containerID="d811e4999ff1cb267164589f5c137425bfc50009015c46542299bcba6ee236ac"
Apr 22 19:10:56.376134 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:56.376109 2566 scope.go:117] "RemoveContainer" containerID="b3271b5df8ab3232f992ebf45ae19a2a5dea11d8b417fcd067de9db7cecec783"
Apr 22 19:10:56.382988 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:56.382963 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"]
Apr 22 19:10:56.385811 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:56.385788 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86dxxv9"]
Apr 22 19:10:56.388025 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:56.388001 2566 scope.go:117] "RemoveContainer" containerID="d811e4999ff1cb267164589f5c137425bfc50009015c46542299bcba6ee236ac"
Apr 22 19:10:56.388308 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:10:56.388283 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d811e4999ff1cb267164589f5c137425bfc50009015c46542299bcba6ee236ac\": container with ID starting with d811e4999ff1cb267164589f5c137425bfc50009015c46542299bcba6ee236ac not found: ID does not exist" containerID="d811e4999ff1cb267164589f5c137425bfc50009015c46542299bcba6ee236ac"
Apr 22 19:10:56.388376 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:56.388322 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d811e4999ff1cb267164589f5c137425bfc50009015c46542299bcba6ee236ac"} err="failed to get container status \"d811e4999ff1cb267164589f5c137425bfc50009015c46542299bcba6ee236ac\": rpc error: code = NotFound desc = could not find container \"d811e4999ff1cb267164589f5c137425bfc50009015c46542299bcba6ee236ac\": container with ID starting with d811e4999ff1cb267164589f5c137425bfc50009015c46542299bcba6ee236ac not found: ID does not exist"
Apr 22 19:10:56.388376 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:56.388353 2566 scope.go:117] "RemoveContainer" containerID="b3271b5df8ab3232f992ebf45ae19a2a5dea11d8b417fcd067de9db7cecec783"
Apr 22 19:10:56.388650 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:10:56.388624 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3271b5df8ab3232f992ebf45ae19a2a5dea11d8b417fcd067de9db7cecec783\": container with ID starting with b3271b5df8ab3232f992ebf45ae19a2a5dea11d8b417fcd067de9db7cecec783 not found: ID does not exist" containerID="b3271b5df8ab3232f992ebf45ae19a2a5dea11d8b417fcd067de9db7cecec783"
Apr 22 19:10:56.388708 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:56.388662 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3271b5df8ab3232f992ebf45ae19a2a5dea11d8b417fcd067de9db7cecec783"} err="failed to get container status \"b3271b5df8ab3232f992ebf45ae19a2a5dea11d8b417fcd067de9db7cecec783\": rpc error: code = NotFound desc = could not find container \"b3271b5df8ab3232f992ebf45ae19a2a5dea11d8b417fcd067de9db7cecec783\": container with ID starting with b3271b5df8ab3232f992ebf45ae19a2a5dea11d8b417fcd067de9db7cecec783 not found: ID does not exist"
Apr 22 19:10:56.832276 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:56.832241 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" path="/var/lib/kubelet/pods/9da9adaf-bb15-4393-95fc-ff8bd0dbee6e/volumes"
Apr 22 19:10:59.785218 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:10:59.785162 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" podUID="42da7359-3343-43cb-bc34-341f479dcb96" containerName="main" probeResult="failure" output="Get \"https://10.132.0.55:8000/health\": dial tcp 10.132.0.55:8000: connect: connection refused"
Apr 22 19:11:01.903186 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:11:01.903135 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused"
Apr 22 19:11:09.784933 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:11:09.784876 2566 prober.go:120] "Probe failed" probeType="Startup"
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" podUID="42da7359-3343-43cb-bc34-341f479dcb96" containerName="main" probeResult="failure" output="Get \"https://10.132.0.55:8000/health\": dial tcp 10.132.0.55:8000: connect: connection refused" Apr 22 19:11:11.903020 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:11:11.902969 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused" Apr 22 19:11:19.785193 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:11:19.785144 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" podUID="42da7359-3343-43cb-bc34-341f479dcb96" containerName="main" probeResult="failure" output="Get \"https://10.132.0.55:8000/health\": dial tcp 10.132.0.55:8000: connect: connection refused" Apr 22 19:11:21.903200 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:11:21.903156 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused" Apr 22 19:11:29.784726 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:11:29.784682 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" podUID="42da7359-3343-43cb-bc34-341f479dcb96" containerName="main" probeResult="failure" output="Get \"https://10.132.0.55:8000/health\": dial tcp 10.132.0.55:8000: connect: connection refused" Apr 22 19:11:31.902529 ip-10-0-130-234 
kubenswrapper[2566]: I0422 19:11:31.902488 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused" Apr 22 19:11:38.839836 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:11:38.839806 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4jzwx_4871a747-c041-430f-9c1a-3ab2ce6d0156/console-operator/1.log" Apr 22 19:11:38.848210 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:11:38.848189 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4jzwx_4871a747-c041-430f-9c1a-3ab2ce6d0156/console-operator/1.log" Apr 22 19:11:39.785158 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:11:39.785114 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" podUID="42da7359-3343-43cb-bc34-341f479dcb96" containerName="main" probeResult="failure" output="Get \"https://10.132.0.55:8000/health\": dial tcp 10.132.0.55:8000: connect: connection refused" Apr 22 19:11:41.903048 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:11:41.903001 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused" Apr 22 19:11:49.784638 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:11:49.784591 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" 
podUID="42da7359-3343-43cb-bc34-341f479dcb96" containerName="main" probeResult="failure" output="Get \"https://10.132.0.55:8000/health\": dial tcp 10.132.0.55:8000: connect: connection refused" Apr 22 19:11:51.902836 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:11:51.902790 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused" Apr 22 19:11:59.785141 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:11:59.785049 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" podUID="42da7359-3343-43cb-bc34-341f479dcb96" containerName="main" probeResult="failure" output="Get \"https://10.132.0.55:8000/health\": dial tcp 10.132.0.55:8000: connect: connection refused" Apr 22 19:12:01.903368 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:01.903320 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.54:8000/health\": dial tcp 10.132.0.54:8000: connect: connection refused" Apr 22 19:12:09.785270 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:09.785229 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" podUID="42da7359-3343-43cb-bc34-341f479dcb96" containerName="main" probeResult="failure" output="Get \"https://10.132.0.55:8000/health\": dial tcp 10.132.0.55:8000: connect: connection refused" Apr 22 19:12:11.913122 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:11.913090 2566 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:12:11.920968 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:11.920921 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:12:19.370236 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:19.370204 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9"] Apr 22 19:12:19.370616 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:19.370497 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerName="main" containerID="cri-o://024b4a266431cf49106a452f7898e7fccee4adaab8c79198a97b8a691181ea2e" gracePeriod=30 Apr 22 19:12:19.785123 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:19.785013 2566 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" podUID="42da7359-3343-43cb-bc34-341f479dcb96" containerName="main" probeResult="failure" output="Get \"https://10.132.0.55:8000/health\": dial tcp 10.132.0.55:8000: connect: connection refused" Apr 22 19:12:28.420004 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.419964 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 19:12:28.422778 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.420328 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" containerName="storage-initializer" Apr 22 19:12:28.422778 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.420339 2566 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" containerName="storage-initializer" Apr 22 19:12:28.422778 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.420361 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" containerName="main" Apr 22 19:12:28.422778 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.420367 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" containerName="main" Apr 22 19:12:28.422778 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.420432 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="9da9adaf-bb15-4393-95fc-ff8bd0dbee6e" containerName="main" Apr 22 19:12:28.423563 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.423546 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.426873 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.426847 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 22 19:12:28.427011 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.426888 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-4wcjg\"" Apr 22 19:12:28.434607 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.434582 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 19:12:28.547051 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.547013 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.547236 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.547088 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.547236 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.547117 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.547236 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.547187 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.547236 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.547215 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-tls-certs\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.547395 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.547240 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2dsb\" (UniqueName: \"kubernetes.io/projected/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-kube-api-access-m2dsb\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.648254 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.648216 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.648444 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.648279 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.648444 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.648298 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.648444 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.648332 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.648444 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.648354 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.648444 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.648379 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2dsb\" (UniqueName: \"kubernetes.io/projected/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-kube-api-access-m2dsb\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.648719 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.648625 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.648719 
ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.648707 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.648719 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.648718 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.650729 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.650697 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.650974 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.650924 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.655908 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.655886 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2dsb\" (UniqueName: 
\"kubernetes.io/projected/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-kube-api-access-m2dsb\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.735980 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.735873 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:12:28.868432 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:28.868398 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 19:12:28.869910 ip-10-0-130-234 kubenswrapper[2566]: W0422 19:12:28.869871 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6107282_dbd2_4ceb_a9e3_67ed5aa11633.slice/crio-5c5a9cdef7b52809d820c2bde565cf0cc0601e4a013062a87290a1f48071adc4 WatchSource:0}: Error finding container 5c5a9cdef7b52809d820c2bde565cf0cc0601e4a013062a87290a1f48071adc4: Status 404 returned error can't find the container with id 5c5a9cdef7b52809d820c2bde565cf0cc0601e4a013062a87290a1f48071adc4 Apr 22 19:12:29.731182 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:29.731141 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"d6107282-dbd2-4ceb-a9e3-67ed5aa11633","Type":"ContainerStarted","Data":"d909d83e41d9697429d55865d5635a34b40290e312c010d60d7625512241af41"} Apr 22 19:12:29.731182 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:29.731187 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" 
event={"ID":"d6107282-dbd2-4ceb-a9e3-67ed5aa11633","Type":"ContainerStarted","Data":"5c5a9cdef7b52809d820c2bde565cf0cc0601e4a013062a87290a1f48071adc4"} Apr 22 19:12:29.794300 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:29.794274 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" Apr 22 19:12:29.801855 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:29.801834 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" Apr 22 19:12:33.754144 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:33.754102 2566 generic.go:358] "Generic (PLEG): container finished" podID="d6107282-dbd2-4ceb-a9e3-67ed5aa11633" containerID="d909d83e41d9697429d55865d5635a34b40290e312c010d60d7625512241af41" exitCode=0 Apr 22 19:12:33.754655 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:33.754182 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"d6107282-dbd2-4ceb-a9e3-67ed5aa11633","Type":"ContainerDied","Data":"d909d83e41d9697429d55865d5635a34b40290e312c010d60d7625512241af41"} Apr 22 19:12:34.760279 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:34.760240 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"d6107282-dbd2-4ceb-a9e3-67ed5aa11633","Type":"ContainerStarted","Data":"b6264481bd65cc636d906c3a1b97c68d24450421f73e94c23d1943719ede1869"} Apr 22 19:12:34.776469 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:34.776411 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.7763981189999996 podStartE2EDuration="6.776398119s" podCreationTimestamp="2026-04-22 19:12:28 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:12:34.774807535 +0000 UTC m=+1556.553641020" watchObservedRunningTime="2026-04-22 19:12:34.776398119 +0000 UTC m=+1556.555231605" Apr 22 19:12:49.636320 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.636262 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9_f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5/main/0.log" Apr 22 19:12:49.636682 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.636605 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:12:49.757732 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.757697 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-home\") pod \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " Apr 22 19:12:49.757967 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.757777 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-model-cache\") pod \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " Apr 22 19:12:49.757967 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.757795 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-tls-certs\") pod \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " Apr 22 19:12:49.757967 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.757814 2566 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-kserve-provision-location\") pod \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " Apr 22 19:12:49.757967 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.757864 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhtfz\" (UniqueName: \"kubernetes.io/projected/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-kube-api-access-bhtfz\") pod \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " Apr 22 19:12:49.757967 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.757888 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-dshm\") pod \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\" (UID: \"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5\") " Apr 22 19:12:49.758245 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.758137 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-model-cache" (OuterVolumeSpecName: "model-cache") pod "f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" (UID: "f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:12:49.758245 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.758171 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-home" (OuterVolumeSpecName: "home") pod "f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" (UID: "f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:12:49.758364 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.758252 2566 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-home\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:12:49.758364 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.758270 2566 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-model-cache\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:12:49.760187 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.760159 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" (UID: "f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:12:49.760295 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.760182 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-kube-api-access-bhtfz" (OuterVolumeSpecName: "kube-api-access-bhtfz") pod "f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" (UID: "f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5"). InnerVolumeSpecName "kube-api-access-bhtfz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:12:49.760505 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.760478 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-dshm" (OuterVolumeSpecName: "dshm") pod "f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" (UID: "f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:12:49.815085 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.815046 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" (UID: "f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:12:49.824669 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.824648 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9_f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5/main/0.log" Apr 22 19:12:49.825062 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.825035 2566 generic.go:358] "Generic (PLEG): container finished" podID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerID="024b4a266431cf49106a452f7898e7fccee4adaab8c79198a97b8a691181ea2e" exitCode=137 Apr 22 19:12:49.825166 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.825148 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" event={"ID":"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5","Type":"ContainerDied","Data":"024b4a266431cf49106a452f7898e7fccee4adaab8c79198a97b8a691181ea2e"} Apr 22 19:12:49.825228 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.825170 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" Apr 22 19:12:49.825228 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.825187 2566 scope.go:117] "RemoveContainer" containerID="024b4a266431cf49106a452f7898e7fccee4adaab8c79198a97b8a691181ea2e" Apr 22 19:12:49.825322 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.825175 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9" event={"ID":"f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5","Type":"ContainerDied","Data":"8dd72860a1f5a9346b9aeb001bd6c503175ca4197caf89705595adb88d44025b"} Apr 22 19:12:49.846043 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.846022 2566 scope.go:117] "RemoveContainer" containerID="94096d14d65477fa1ca452f60bcfd12e579eb6aea268d89a2846538456a88a3b" Apr 22 19:12:49.854420 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.854365 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9"] Apr 22 19:12:49.856693 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.856673 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6fdcd5545cngkd9"] Apr 22 19:12:49.857642 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.857625 2566 scope.go:117] "RemoveContainer" containerID="024b4a266431cf49106a452f7898e7fccee4adaab8c79198a97b8a691181ea2e" Apr 22 19:12:49.857891 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:12:49.857874 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"024b4a266431cf49106a452f7898e7fccee4adaab8c79198a97b8a691181ea2e\": container with ID starting with 024b4a266431cf49106a452f7898e7fccee4adaab8c79198a97b8a691181ea2e not found: ID does not exist" 
containerID="024b4a266431cf49106a452f7898e7fccee4adaab8c79198a97b8a691181ea2e" Apr 22 19:12:49.857974 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.857900 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"024b4a266431cf49106a452f7898e7fccee4adaab8c79198a97b8a691181ea2e"} err="failed to get container status \"024b4a266431cf49106a452f7898e7fccee4adaab8c79198a97b8a691181ea2e\": rpc error: code = NotFound desc = could not find container \"024b4a266431cf49106a452f7898e7fccee4adaab8c79198a97b8a691181ea2e\": container with ID starting with 024b4a266431cf49106a452f7898e7fccee4adaab8c79198a97b8a691181ea2e not found: ID does not exist" Apr 22 19:12:49.857974 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.857919 2566 scope.go:117] "RemoveContainer" containerID="94096d14d65477fa1ca452f60bcfd12e579eb6aea268d89a2846538456a88a3b" Apr 22 19:12:49.858199 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:12:49.858179 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94096d14d65477fa1ca452f60bcfd12e579eb6aea268d89a2846538456a88a3b\": container with ID starting with 94096d14d65477fa1ca452f60bcfd12e579eb6aea268d89a2846538456a88a3b not found: ID does not exist" containerID="94096d14d65477fa1ca452f60bcfd12e579eb6aea268d89a2846538456a88a3b" Apr 22 19:12:49.858244 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.858206 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94096d14d65477fa1ca452f60bcfd12e579eb6aea268d89a2846538456a88a3b"} err="failed to get container status \"94096d14d65477fa1ca452f60bcfd12e579eb6aea268d89a2846538456a88a3b\": rpc error: code = NotFound desc = could not find container \"94096d14d65477fa1ca452f60bcfd12e579eb6aea268d89a2846538456a88a3b\": container with ID starting with 94096d14d65477fa1ca452f60bcfd12e579eb6aea268d89a2846538456a88a3b not found: ID does not exist" Apr 22 
19:12:49.859579 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.859562 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bhtfz\" (UniqueName: \"kubernetes.io/projected/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-kube-api-access-bhtfz\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:12:49.859629 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.859581 2566 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-dshm\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:12:49.859629 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.859591 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-tls-certs\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:12:49.859629 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:49.859600 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5-kserve-provision-location\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:12:50.830298 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:50.830261 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" path="/var/lib/kubelet/pods/f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5/volumes" Apr 22 19:12:50.982581 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:50.982542 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886"] Apr 22 19:12:50.983117 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:50.983097 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerName="storage-initializer" Apr 22 19:12:50.983213 
ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:50.983120 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerName="storage-initializer" Apr 22 19:12:50.983213 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:50.983153 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerName="main" Apr 22 19:12:50.983213 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:50.983162 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerName="main" Apr 22 19:12:50.983383 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:50.983267 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="f92a5388-51c0-4ef5-bd5a-6bd09cb1aed5" containerName="main" Apr 22 19:12:50.989476 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:50.989450 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:50.992663 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:50.992637 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 22 19:12:50.992930 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:50.992909 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886"] Apr 22 19:12:51.172632 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.172592 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-dshm\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 
19:12:51.172816 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.172658 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-home\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.172816 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.172681 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d9kh\" (UniqueName: \"kubernetes.io/projected/51f86925-502a-4320-b025-4ac8e3aa2b02-kube-api-access-4d9kh\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.172816 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.172707 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-model-cache\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.172816 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.172788 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51f86925-502a-4320-b025-4ac8e3aa2b02-tls-certs\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.173002 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.172826 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.274154 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.274115 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-dshm\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.274326 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.274197 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-home\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.274326 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.274216 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4d9kh\" (UniqueName: \"kubernetes.io/projected/51f86925-502a-4320-b025-4ac8e3aa2b02-kube-api-access-4d9kh\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.274326 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.274232 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-model-cache\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.274326 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.274278 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51f86925-502a-4320-b025-4ac8e3aa2b02-tls-certs\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.274326 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.274307 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.274651 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.274629 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-home\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.274738 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.274702 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-model-cache\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: 
\"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.274738 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.274720 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.276368 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.276343 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-dshm\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.276727 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.276709 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51f86925-502a-4320-b025-4ac8e3aa2b02-tls-certs\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.281313 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.281295 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d9kh\" (UniqueName: \"kubernetes.io/projected/51f86925-502a-4320-b025-4ac8e3aa2b02-kube-api-access-4d9kh\") pod \"scheduler-inline-config-test-kserve-85dc67dfdd-bt886\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.303344 ip-10-0-130-234 
kubenswrapper[2566]: I0422 19:12:51.303312 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:12:51.430890 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.430861 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886"] Apr 22 19:12:51.433348 ip-10-0-130-234 kubenswrapper[2566]: W0422 19:12:51.433311 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51f86925_502a_4320_b025_4ac8e3aa2b02.slice/crio-5a2fef478c2cf514db8fd39501539a624e6cb9ede301e66ba3e579ede7d1a25a WatchSource:0}: Error finding container 5a2fef478c2cf514db8fd39501539a624e6cb9ede301e66ba3e579ede7d1a25a: Status 404 returned error can't find the container with id 5a2fef478c2cf514db8fd39501539a624e6cb9ede301e66ba3e579ede7d1a25a Apr 22 19:12:51.837336 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.837244 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" event={"ID":"51f86925-502a-4320-b025-4ac8e3aa2b02","Type":"ContainerStarted","Data":"3100ffa6e398de6ef19a5da1b5ff91323b249838e0508b9cefba0938f4f36cde"} Apr 22 19:12:51.837336 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:51.837280 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" event={"ID":"51f86925-502a-4320-b025-4ac8e3aa2b02","Type":"ContainerStarted","Data":"5a2fef478c2cf514db8fd39501539a624e6cb9ede301e66ba3e579ede7d1a25a"} Apr 22 19:12:53.208879 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:53.208842 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"] Apr 22 19:12:53.209410 ip-10-0-130-234 kubenswrapper[2566]: I0422 
19:12:53.209262 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" podUID="42da7359-3343-43cb-bc34-341f479dcb96" containerName="main" containerID="cri-o://ec65d3af44dc3fc307a3ad47fe5129c9e9d78a7580183315cde3e076a7253bba" gracePeriod=30 Apr 22 19:12:56.868969 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:56.868919 2566 generic.go:358] "Generic (PLEG): container finished" podID="51f86925-502a-4320-b025-4ac8e3aa2b02" containerID="3100ffa6e398de6ef19a5da1b5ff91323b249838e0508b9cefba0938f4f36cde" exitCode=0 Apr 22 19:12:56.869381 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:56.868992 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" event={"ID":"51f86925-502a-4320-b025-4ac8e3aa2b02","Type":"ContainerDied","Data":"3100ffa6e398de6ef19a5da1b5ff91323b249838e0508b9cefba0938f4f36cde"} Apr 22 19:12:57.874882 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:57.874847 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" event={"ID":"51f86925-502a-4320-b025-4ac8e3aa2b02","Type":"ContainerStarted","Data":"19e2e2964599f15e4d5fcb847f15b14d66d6cf6f15cf1d657293edf9c9f33750"} Apr 22 19:12:57.892998 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:12:57.892923 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" podStartSLOduration=7.892905186 podStartE2EDuration="7.892905186s" podCreationTimestamp="2026-04-22 19:12:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:12:57.890350625 +0000 UTC m=+1579.669184123" watchObservedRunningTime="2026-04-22 19:12:57.892905186 +0000 UTC m=+1579.671738675" Apr 22 19:13:01.304316 
ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:01.304275 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:13:01.304316 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:01.304323 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:13:01.317128 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:01.317104 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:13:01.902247 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:01.902217 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:13:23.457530 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.457503 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" Apr 22 19:13:23.567405 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.567316 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-dshm\") pod \"42da7359-3343-43cb-bc34-341f479dcb96\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " Apr 22 19:13:23.567405 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.567358 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-home\") pod \"42da7359-3343-43cb-bc34-341f479dcb96\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " Apr 22 19:13:23.567405 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.567379 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-kserve-provision-location\") pod \"42da7359-3343-43cb-bc34-341f479dcb96\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " Apr 22 19:13:23.567701 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.567416 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42da7359-3343-43cb-bc34-341f479dcb96-tls-certs\") pod \"42da7359-3343-43cb-bc34-341f479dcb96\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " Apr 22 19:13:23.567701 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.567443 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-model-cache\") pod \"42da7359-3343-43cb-bc34-341f479dcb96\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " Apr 22 19:13:23.567701 ip-10-0-130-234 kubenswrapper[2566]: 
I0422 19:13:23.567507 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf586\" (UniqueName: \"kubernetes.io/projected/42da7359-3343-43cb-bc34-341f479dcb96-kube-api-access-pf586\") pod \"42da7359-3343-43cb-bc34-341f479dcb96\" (UID: \"42da7359-3343-43cb-bc34-341f479dcb96\") " Apr 22 19:13:23.567867 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.567765 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-model-cache" (OuterVolumeSpecName: "model-cache") pod "42da7359-3343-43cb-bc34-341f479dcb96" (UID: "42da7359-3343-43cb-bc34-341f479dcb96"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:13:23.567867 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.567833 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-home" (OuterVolumeSpecName: "home") pod "42da7359-3343-43cb-bc34-341f479dcb96" (UID: "42da7359-3343-43cb-bc34-341f479dcb96"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:13:23.569736 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.569704 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42da7359-3343-43cb-bc34-341f479dcb96-kube-api-access-pf586" (OuterVolumeSpecName: "kube-api-access-pf586") pod "42da7359-3343-43cb-bc34-341f479dcb96" (UID: "42da7359-3343-43cb-bc34-341f479dcb96"). InnerVolumeSpecName "kube-api-access-pf586". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:13:23.569851 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.569760 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42da7359-3343-43cb-bc34-341f479dcb96-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "42da7359-3343-43cb-bc34-341f479dcb96" (UID: "42da7359-3343-43cb-bc34-341f479dcb96"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:13:23.569851 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.569794 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-dshm" (OuterVolumeSpecName: "dshm") pod "42da7359-3343-43cb-bc34-341f479dcb96" (UID: "42da7359-3343-43cb-bc34-341f479dcb96"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:13:23.638409 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.638369 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "42da7359-3343-43cb-bc34-341f479dcb96" (UID: "42da7359-3343-43cb-bc34-341f479dcb96"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:13:23.668901 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.668874 2566 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-dshm\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:13:23.668901 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.668897 2566 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-home\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:13:23.669090 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.668906 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-kserve-provision-location\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:13:23.669090 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.668918 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42da7359-3343-43cb-bc34-341f479dcb96-tls-certs\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:13:23.669090 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.668928 2566 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/42da7359-3343-43cb-bc34-341f479dcb96-model-cache\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:13:23.669090 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.668957 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pf586\" (UniqueName: \"kubernetes.io/projected/42da7359-3343-43cb-bc34-341f479dcb96-kube-api-access-pf586\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:13:23.973657 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.973608 2566 
generic.go:358] "Generic (PLEG): container finished" podID="42da7359-3343-43cb-bc34-341f479dcb96" containerID="ec65d3af44dc3fc307a3ad47fe5129c9e9d78a7580183315cde3e076a7253bba" exitCode=137 Apr 22 19:13:23.973919 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.973719 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" event={"ID":"42da7359-3343-43cb-bc34-341f479dcb96","Type":"ContainerDied","Data":"ec65d3af44dc3fc307a3ad47fe5129c9e9d78a7580183315cde3e076a7253bba"} Apr 22 19:13:23.974089 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.974069 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" event={"ID":"42da7359-3343-43cb-bc34-341f479dcb96","Type":"ContainerDied","Data":"e63d5e8622902418e59a4732febfda0ee4207c1880b14887bd9834ad8e4fd40c"} Apr 22 19:13:23.974193 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.974180 2566 scope.go:117] "RemoveContainer" containerID="ec65d3af44dc3fc307a3ad47fe5129c9e9d78a7580183315cde3e076a7253bba" Apr 22 19:13:23.974375 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:23.973812 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp" Apr 22 19:13:24.000138 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.000112 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886"] Apr 22 19:13:24.000396 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.000374 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" podUID="51f86925-502a-4320-b025-4ac8e3aa2b02" containerName="main" containerID="cri-o://19e2e2964599f15e4d5fcb847f15b14d66d6cf6f15cf1d657293edf9c9f33750" gracePeriod=30 Apr 22 19:13:24.008285 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.008239 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"] Apr 22 19:13:24.009221 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.009176 2566 scope.go:117] "RemoveContainer" containerID="ef2fa9680d46b64206ed8352b534dcedbc36521c20fa229c8cf6ceca039ffc6a" Apr 22 19:13:24.010587 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.010569 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-756f5b6848-7k4jp"] Apr 22 19:13:24.074881 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.074856 2566 scope.go:117] "RemoveContainer" containerID="ec65d3af44dc3fc307a3ad47fe5129c9e9d78a7580183315cde3e076a7253bba" Apr 22 19:13:24.075225 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:13:24.075200 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec65d3af44dc3fc307a3ad47fe5129c9e9d78a7580183315cde3e076a7253bba\": container with ID starting with ec65d3af44dc3fc307a3ad47fe5129c9e9d78a7580183315cde3e076a7253bba not found: ID does not exist" 
containerID="ec65d3af44dc3fc307a3ad47fe5129c9e9d78a7580183315cde3e076a7253bba" Apr 22 19:13:24.075323 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.075234 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec65d3af44dc3fc307a3ad47fe5129c9e9d78a7580183315cde3e076a7253bba"} err="failed to get container status \"ec65d3af44dc3fc307a3ad47fe5129c9e9d78a7580183315cde3e076a7253bba\": rpc error: code = NotFound desc = could not find container \"ec65d3af44dc3fc307a3ad47fe5129c9e9d78a7580183315cde3e076a7253bba\": container with ID starting with ec65d3af44dc3fc307a3ad47fe5129c9e9d78a7580183315cde3e076a7253bba not found: ID does not exist" Apr 22 19:13:24.075323 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.075252 2566 scope.go:117] "RemoveContainer" containerID="ef2fa9680d46b64206ed8352b534dcedbc36521c20fa229c8cf6ceca039ffc6a" Apr 22 19:13:24.075511 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:13:24.075493 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef2fa9680d46b64206ed8352b534dcedbc36521c20fa229c8cf6ceca039ffc6a\": container with ID starting with ef2fa9680d46b64206ed8352b534dcedbc36521c20fa229c8cf6ceca039ffc6a not found: ID does not exist" containerID="ef2fa9680d46b64206ed8352b534dcedbc36521c20fa229c8cf6ceca039ffc6a" Apr 22 19:13:24.075551 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.075519 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef2fa9680d46b64206ed8352b534dcedbc36521c20fa229c8cf6ceca039ffc6a"} err="failed to get container status \"ef2fa9680d46b64206ed8352b534dcedbc36521c20fa229c8cf6ceca039ffc6a\": rpc error: code = NotFound desc = could not find container \"ef2fa9680d46b64206ed8352b534dcedbc36521c20fa229c8cf6ceca039ffc6a\": container with ID starting with ef2fa9680d46b64206ed8352b534dcedbc36521c20fa229c8cf6ceca039ffc6a not found: ID does not exist" Apr 22 
19:13:24.239089 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.239066 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:13:24.374315 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.374279 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d9kh\" (UniqueName: \"kubernetes.io/projected/51f86925-502a-4320-b025-4ac8e3aa2b02-kube-api-access-4d9kh\") pod \"51f86925-502a-4320-b025-4ac8e3aa2b02\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " Apr 22 19:13:24.374315 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.374322 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-home\") pod \"51f86925-502a-4320-b025-4ac8e3aa2b02\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " Apr 22 19:13:24.374541 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.374340 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-kserve-provision-location\") pod \"51f86925-502a-4320-b025-4ac8e3aa2b02\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " Apr 22 19:13:24.374541 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.374373 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-dshm\") pod \"51f86925-502a-4320-b025-4ac8e3aa2b02\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " Apr 22 19:13:24.374541 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.374513 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-model-cache\") pod 
\"51f86925-502a-4320-b025-4ac8e3aa2b02\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " Apr 22 19:13:24.374703 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.374542 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51f86925-502a-4320-b025-4ac8e3aa2b02-tls-certs\") pod \"51f86925-502a-4320-b025-4ac8e3aa2b02\" (UID: \"51f86925-502a-4320-b025-4ac8e3aa2b02\") " Apr 22 19:13:24.374703 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.374589 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-home" (OuterVolumeSpecName: "home") pod "51f86925-502a-4320-b025-4ac8e3aa2b02" (UID: "51f86925-502a-4320-b025-4ac8e3aa2b02"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:13:24.374820 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.374794 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-model-cache" (OuterVolumeSpecName: "model-cache") pod "51f86925-502a-4320-b025-4ac8e3aa2b02" (UID: "51f86925-502a-4320-b025-4ac8e3aa2b02"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:13:24.374876 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.374805 2566 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-home\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:13:24.376610 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.376579 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-dshm" (OuterVolumeSpecName: "dshm") pod "51f86925-502a-4320-b025-4ac8e3aa2b02" (UID: "51f86925-502a-4320-b025-4ac8e3aa2b02"). 
InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:13:24.376735 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.376684 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f86925-502a-4320-b025-4ac8e3aa2b02-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "51f86925-502a-4320-b025-4ac8e3aa2b02" (UID: "51f86925-502a-4320-b025-4ac8e3aa2b02"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:13:24.376735 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.376726 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f86925-502a-4320-b025-4ac8e3aa2b02-kube-api-access-4d9kh" (OuterVolumeSpecName: "kube-api-access-4d9kh") pod "51f86925-502a-4320-b025-4ac8e3aa2b02" (UID: "51f86925-502a-4320-b025-4ac8e3aa2b02"). InnerVolumeSpecName "kube-api-access-4d9kh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:13:24.446148 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.446108 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "51f86925-502a-4320-b025-4ac8e3aa2b02" (UID: "51f86925-502a-4320-b025-4ac8e3aa2b02"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:13:24.475931 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.475895 2566 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-model-cache\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:13:24.475931 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.475925 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51f86925-502a-4320-b025-4ac8e3aa2b02-tls-certs\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:13:24.475931 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.475956 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4d9kh\" (UniqueName: \"kubernetes.io/projected/51f86925-502a-4320-b025-4ac8e3aa2b02-kube-api-access-4d9kh\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:13:24.476370 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.475967 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-kserve-provision-location\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:13:24.476370 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.475976 2566 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/51f86925-502a-4320-b025-4ac8e3aa2b02-dshm\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:13:24.830907 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.830874 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42da7359-3343-43cb-bc34-341f479dcb96" path="/var/lib/kubelet/pods/42da7359-3343-43cb-bc34-341f479dcb96/volumes" Apr 22 19:13:24.980080 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.980043 2566 generic.go:358] "Generic 
(PLEG): container finished" podID="51f86925-502a-4320-b025-4ac8e3aa2b02" containerID="19e2e2964599f15e4d5fcb847f15b14d66d6cf6f15cf1d657293edf9c9f33750" exitCode=0 Apr 22 19:13:24.980258 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.980106 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" Apr 22 19:13:24.980258 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.980128 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" event={"ID":"51f86925-502a-4320-b025-4ac8e3aa2b02","Type":"ContainerDied","Data":"19e2e2964599f15e4d5fcb847f15b14d66d6cf6f15cf1d657293edf9c9f33750"} Apr 22 19:13:24.980258 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.980164 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886" event={"ID":"51f86925-502a-4320-b025-4ac8e3aa2b02","Type":"ContainerDied","Data":"5a2fef478c2cf514db8fd39501539a624e6cb9ede301e66ba3e579ede7d1a25a"} Apr 22 19:13:24.980258 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.980180 2566 scope.go:117] "RemoveContainer" containerID="19e2e2964599f15e4d5fcb847f15b14d66d6cf6f15cf1d657293edf9c9f33750" Apr 22 19:13:24.989006 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.988968 2566 scope.go:117] "RemoveContainer" containerID="3100ffa6e398de6ef19a5da1b5ff91323b249838e0508b9cefba0938f4f36cde" Apr 22 19:13:24.998635 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:24.998613 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886"] Apr 22 19:13:25.002788 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:25.002768 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-85dc67dfdd-bt886"] Apr 22 19:13:25.056449 ip-10-0-130-234 
kubenswrapper[2566]: I0422 19:13:25.056424 2566 scope.go:117] "RemoveContainer" containerID="19e2e2964599f15e4d5fcb847f15b14d66d6cf6f15cf1d657293edf9c9f33750" Apr 22 19:13:25.056775 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:13:25.056746 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19e2e2964599f15e4d5fcb847f15b14d66d6cf6f15cf1d657293edf9c9f33750\": container with ID starting with 19e2e2964599f15e4d5fcb847f15b14d66d6cf6f15cf1d657293edf9c9f33750 not found: ID does not exist" containerID="19e2e2964599f15e4d5fcb847f15b14d66d6cf6f15cf1d657293edf9c9f33750" Apr 22 19:13:25.056859 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:25.056779 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19e2e2964599f15e4d5fcb847f15b14d66d6cf6f15cf1d657293edf9c9f33750"} err="failed to get container status \"19e2e2964599f15e4d5fcb847f15b14d66d6cf6f15cf1d657293edf9c9f33750\": rpc error: code = NotFound desc = could not find container \"19e2e2964599f15e4d5fcb847f15b14d66d6cf6f15cf1d657293edf9c9f33750\": container with ID starting with 19e2e2964599f15e4d5fcb847f15b14d66d6cf6f15cf1d657293edf9c9f33750 not found: ID does not exist" Apr 22 19:13:25.056859 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:25.056809 2566 scope.go:117] "RemoveContainer" containerID="3100ffa6e398de6ef19a5da1b5ff91323b249838e0508b9cefba0938f4f36cde" Apr 22 19:13:25.057129 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:13:25.057112 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3100ffa6e398de6ef19a5da1b5ff91323b249838e0508b9cefba0938f4f36cde\": container with ID starting with 3100ffa6e398de6ef19a5da1b5ff91323b249838e0508b9cefba0938f4f36cde not found: ID does not exist" containerID="3100ffa6e398de6ef19a5da1b5ff91323b249838e0508b9cefba0938f4f36cde" Apr 22 19:13:25.057185 ip-10-0-130-234 kubenswrapper[2566]: 
I0422 19:13:25.057134 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3100ffa6e398de6ef19a5da1b5ff91323b249838e0508b9cefba0938f4f36cde"} err="failed to get container status \"3100ffa6e398de6ef19a5da1b5ff91323b249838e0508b9cefba0938f4f36cde\": rpc error: code = NotFound desc = could not find container \"3100ffa6e398de6ef19a5da1b5ff91323b249838e0508b9cefba0938f4f36cde\": container with ID starting with 3100ffa6e398de6ef19a5da1b5ff91323b249838e0508b9cefba0938f4f36cde not found: ID does not exist" Apr 22 19:13:26.830913 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:13:26.830870 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51f86925-502a-4320-b025-4ac8e3aa2b02" path="/var/lib/kubelet/pods/51f86925-502a-4320-b025-4ac8e3aa2b02/volumes" Apr 22 19:14:28.100355 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:28.100323 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 19:14:28.100780 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:28.100594 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="d6107282-dbd2-4ceb-a9e3-67ed5aa11633" containerName="main" containerID="cri-o://b6264481bd65cc636d906c3a1b97c68d24450421f73e94c23d1943719ede1869" gracePeriod=30 Apr 22 19:14:28.879316 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:28.879298 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:14:29.014486 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.014404 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-dshm\") pod \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " Apr 22 19:14:29.014486 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.014440 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-model-cache\") pod \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " Apr 22 19:14:29.014486 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.014475 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-home\") pod \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " Apr 22 19:14:29.014753 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.014509 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-kserve-provision-location\") pod \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " Apr 22 19:14:29.014753 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.014539 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2dsb\" (UniqueName: \"kubernetes.io/projected/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-kube-api-access-m2dsb\") pod \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " Apr 22 19:14:29.014753 ip-10-0-130-234 
kubenswrapper[2566]: I0422 19:14:29.014619 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-tls-certs\") pod \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\" (UID: \"d6107282-dbd2-4ceb-a9e3-67ed5aa11633\") " Apr 22 19:14:29.015008 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.014745 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-model-cache" (OuterVolumeSpecName: "model-cache") pod "d6107282-dbd2-4ceb-a9e3-67ed5aa11633" (UID: "d6107282-dbd2-4ceb-a9e3-67ed5aa11633"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:14:29.015008 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.014897 2566 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-model-cache\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:14:29.015008 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.014892 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-home" (OuterVolumeSpecName: "home") pod "d6107282-dbd2-4ceb-a9e3-67ed5aa11633" (UID: "d6107282-dbd2-4ceb-a9e3-67ed5aa11633"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:14:29.016738 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.016706 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-kube-api-access-m2dsb" (OuterVolumeSpecName: "kube-api-access-m2dsb") pod "d6107282-dbd2-4ceb-a9e3-67ed5aa11633" (UID: "d6107282-dbd2-4ceb-a9e3-67ed5aa11633"). InnerVolumeSpecName "kube-api-access-m2dsb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:14:29.017125 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.017095 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-dshm" (OuterVolumeSpecName: "dshm") pod "d6107282-dbd2-4ceb-a9e3-67ed5aa11633" (UID: "d6107282-dbd2-4ceb-a9e3-67ed5aa11633"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:14:29.017234 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.017147 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d6107282-dbd2-4ceb-a9e3-67ed5aa11633" (UID: "d6107282-dbd2-4ceb-a9e3-67ed5aa11633"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:14:29.072012 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.071977 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d6107282-dbd2-4ceb-a9e3-67ed5aa11633" (UID: "d6107282-dbd2-4ceb-a9e3-67ed5aa11633"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:14:29.115455 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.115424 2566 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-dshm\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:14:29.115455 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.115449 2566 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-home\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:14:29.115455 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.115458 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-kserve-provision-location\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:14:29.115966 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.115468 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m2dsb\" (UniqueName: \"kubernetes.io/projected/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-kube-api-access-m2dsb\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:14:29.115966 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.115478 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6107282-dbd2-4ceb-a9e3-67ed5aa11633-tls-certs\") on node \"ip-10-0-130-234.ec2.internal\" DevicePath \"\"" Apr 22 19:14:29.228735 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.228704 2566 generic.go:358] "Generic (PLEG): container finished" podID="d6107282-dbd2-4ceb-a9e3-67ed5aa11633" containerID="b6264481bd65cc636d906c3a1b97c68d24450421f73e94c23d1943719ede1869" exitCode=0 Apr 22 19:14:29.228884 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.228758 2566 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"d6107282-dbd2-4ceb-a9e3-67ed5aa11633","Type":"ContainerDied","Data":"b6264481bd65cc636d906c3a1b97c68d24450421f73e94c23d1943719ede1869"} Apr 22 19:14:29.228884 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.228784 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"d6107282-dbd2-4ceb-a9e3-67ed5aa11633","Type":"ContainerDied","Data":"5c5a9cdef7b52809d820c2bde565cf0cc0601e4a013062a87290a1f48071adc4"} Apr 22 19:14:29.228884 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.228798 2566 scope.go:117] "RemoveContainer" containerID="b6264481bd65cc636d906c3a1b97c68d24450421f73e94c23d1943719ede1869" Apr 22 19:14:29.228884 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.228806 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:14:29.250238 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.250209 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 19:14:29.251014 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.250996 2566 scope.go:117] "RemoveContainer" containerID="d909d83e41d9697429d55865d5635a34b40290e312c010d60d7625512241af41" Apr 22 19:14:29.254807 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.254785 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 19:14:29.262776 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.262746 2566 scope.go:117] "RemoveContainer" containerID="b6264481bd65cc636d906c3a1b97c68d24450421f73e94c23d1943719ede1869" Apr 22 19:14:29.263066 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:14:29.263038 2566 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6264481bd65cc636d906c3a1b97c68d24450421f73e94c23d1943719ede1869\": container with ID starting with b6264481bd65cc636d906c3a1b97c68d24450421f73e94c23d1943719ede1869 not found: ID does not exist" containerID="b6264481bd65cc636d906c3a1b97c68d24450421f73e94c23d1943719ede1869" Apr 22 19:14:29.263166 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.263070 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6264481bd65cc636d906c3a1b97c68d24450421f73e94c23d1943719ede1869"} err="failed to get container status \"b6264481bd65cc636d906c3a1b97c68d24450421f73e94c23d1943719ede1869\": rpc error: code = NotFound desc = could not find container \"b6264481bd65cc636d906c3a1b97c68d24450421f73e94c23d1943719ede1869\": container with ID starting with b6264481bd65cc636d906c3a1b97c68d24450421f73e94c23d1943719ede1869 not found: ID does not exist" Apr 22 19:14:29.263166 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.263102 2566 scope.go:117] "RemoveContainer" containerID="d909d83e41d9697429d55865d5635a34b40290e312c010d60d7625512241af41" Apr 22 19:14:29.263362 ip-10-0-130-234 kubenswrapper[2566]: E0422 19:14:29.263340 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d909d83e41d9697429d55865d5635a34b40290e312c010d60d7625512241af41\": container with ID starting with d909d83e41d9697429d55865d5635a34b40290e312c010d60d7625512241af41 not found: ID does not exist" containerID="d909d83e41d9697429d55865d5635a34b40290e312c010d60d7625512241af41" Apr 22 19:14:29.263417 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:29.263369 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d909d83e41d9697429d55865d5635a34b40290e312c010d60d7625512241af41"} err="failed to get container status 
\"d909d83e41d9697429d55865d5635a34b40290e312c010d60d7625512241af41\": rpc error: code = NotFound desc = could not find container \"d909d83e41d9697429d55865d5635a34b40290e312c010d60d7625512241af41\": container with ID starting with d909d83e41d9697429d55865d5635a34b40290e312c010d60d7625512241af41 not found: ID does not exist" Apr 22 19:14:30.830951 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:30.830900 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6107282-dbd2-4ceb-a9e3-67ed5aa11633" path="/var/lib/kubelet/pods/d6107282-dbd2-4ceb-a9e3-67ed5aa11633/volumes" Apr 22 19:14:56.528521 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:56.528489 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-lzlzq_1c636de5-f994-42f6-9a67-90a640c701be/kuadrant-console-plugin/0.log" Apr 22 19:14:56.586562 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:56.586533 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-lbcsn_ab2a243c-2f6c-4c80-8d10-46b9fd67b189/limitador/0.log" Apr 22 19:14:58.844088 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.844012 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sn5zn/must-gather-ch7gv"] Apr 22 19:14:58.844475 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.844381 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6107282-dbd2-4ceb-a9e3-67ed5aa11633" containerName="main" Apr 22 19:14:58.844475 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.844394 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6107282-dbd2-4ceb-a9e3-67ed5aa11633" containerName="main" Apr 22 19:14:58.844475 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.844408 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42da7359-3343-43cb-bc34-341f479dcb96" containerName="storage-initializer" Apr 22 19:14:58.844475 
ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.844414 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="42da7359-3343-43cb-bc34-341f479dcb96" containerName="storage-initializer" Apr 22 19:14:58.844475 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.844426 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42da7359-3343-43cb-bc34-341f479dcb96" containerName="main" Apr 22 19:14:58.844475 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.844431 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="42da7359-3343-43cb-bc34-341f479dcb96" containerName="main" Apr 22 19:14:58.844475 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.844452 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51f86925-502a-4320-b025-4ac8e3aa2b02" containerName="storage-initializer" Apr 22 19:14:58.844475 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.844458 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f86925-502a-4320-b025-4ac8e3aa2b02" containerName="storage-initializer" Apr 22 19:14:58.844475 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.844467 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51f86925-502a-4320-b025-4ac8e3aa2b02" containerName="main" Apr 22 19:14:58.844475 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.844472 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f86925-502a-4320-b025-4ac8e3aa2b02" containerName="main" Apr 22 19:14:58.844475 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.844479 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6107282-dbd2-4ceb-a9e3-67ed5aa11633" containerName="storage-initializer" Apr 22 19:14:58.844856 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.844484 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6107282-dbd2-4ceb-a9e3-67ed5aa11633" containerName="storage-initializer" Apr 22 19:14:58.844856 ip-10-0-130-234 
kubenswrapper[2566]: I0422 19:14:58.844533 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="51f86925-502a-4320-b025-4ac8e3aa2b02" containerName="main"
Apr 22 19:14:58.844856 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.844541 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6107282-dbd2-4ceb-a9e3-67ed5aa11633" containerName="main"
Apr 22 19:14:58.844856 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.844549 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="42da7359-3343-43cb-bc34-341f479dcb96" containerName="main"
Apr 22 19:14:58.849315 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.849300 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sn5zn/must-gather-ch7gv"
Apr 22 19:14:58.851620 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.851595 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-sn5zn\"/\"default-dockercfg-dc758\""
Apr 22 19:14:58.852563 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.852541 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sn5zn\"/\"kube-root-ca.crt\""
Apr 22 19:14:58.852871 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.852656 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sn5zn\"/\"openshift-service-ca.crt\""
Apr 22 19:14:58.854779 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.854757 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sn5zn/must-gather-ch7gv"]
Apr 22 19:14:58.863800 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.863776 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8sxk\" (UniqueName: \"kubernetes.io/projected/aaa4db0b-be35-404f-86d3-7cc3d428dde4-kube-api-access-h8sxk\") pod \"must-gather-ch7gv\" (UID: \"aaa4db0b-be35-404f-86d3-7cc3d428dde4\") " pod="openshift-must-gather-sn5zn/must-gather-ch7gv"
Apr 22 19:14:58.863979 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.863919 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aaa4db0b-be35-404f-86d3-7cc3d428dde4-must-gather-output\") pod \"must-gather-ch7gv\" (UID: \"aaa4db0b-be35-404f-86d3-7cc3d428dde4\") " pod="openshift-must-gather-sn5zn/must-gather-ch7gv"
Apr 22 19:14:58.964807 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.964782 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aaa4db0b-be35-404f-86d3-7cc3d428dde4-must-gather-output\") pod \"must-gather-ch7gv\" (UID: \"aaa4db0b-be35-404f-86d3-7cc3d428dde4\") " pod="openshift-must-gather-sn5zn/must-gather-ch7gv"
Apr 22 19:14:58.964961 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.964854 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8sxk\" (UniqueName: \"kubernetes.io/projected/aaa4db0b-be35-404f-86d3-7cc3d428dde4-kube-api-access-h8sxk\") pod \"must-gather-ch7gv\" (UID: \"aaa4db0b-be35-404f-86d3-7cc3d428dde4\") " pod="openshift-must-gather-sn5zn/must-gather-ch7gv"
Apr 22 19:14:58.965189 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.965172 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aaa4db0b-be35-404f-86d3-7cc3d428dde4-must-gather-output\") pod \"must-gather-ch7gv\" (UID: \"aaa4db0b-be35-404f-86d3-7cc3d428dde4\") " pod="openshift-must-gather-sn5zn/must-gather-ch7gv"
Apr 22 19:14:58.972066 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:58.972044 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8sxk\" (UniqueName: \"kubernetes.io/projected/aaa4db0b-be35-404f-86d3-7cc3d428dde4-kube-api-access-h8sxk\") pod \"must-gather-ch7gv\" (UID: \"aaa4db0b-be35-404f-86d3-7cc3d428dde4\") " pod="openshift-must-gather-sn5zn/must-gather-ch7gv"
Apr 22 19:14:59.158797 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:59.158769 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sn5zn/must-gather-ch7gv"
Apr 22 19:14:59.275931 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:59.275907 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sn5zn/must-gather-ch7gv"]
Apr 22 19:14:59.278222 ip-10-0-130-234 kubenswrapper[2566]: W0422 19:14:59.278193 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaa4db0b_be35_404f_86d3_7cc3d428dde4.slice/crio-11ebb1fa3e2c555f2290aa6fe84e77654ce647e34c9d68306328139d62122be1 WatchSource:0}: Error finding container 11ebb1fa3e2c555f2290aa6fe84e77654ce647e34c9d68306328139d62122be1: Status 404 returned error can't find the container with id 11ebb1fa3e2c555f2290aa6fe84e77654ce647e34c9d68306328139d62122be1
Apr 22 19:14:59.337087 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:14:59.337057 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sn5zn/must-gather-ch7gv" event={"ID":"aaa4db0b-be35-404f-86d3-7cc3d428dde4","Type":"ContainerStarted","Data":"11ebb1fa3e2c555f2290aa6fe84e77654ce647e34c9d68306328139d62122be1"}
Apr 22 19:15:00.343218 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:00.343172 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sn5zn/must-gather-ch7gv" event={"ID":"aaa4db0b-be35-404f-86d3-7cc3d428dde4","Type":"ContainerStarted","Data":"d913adfcea452c6fa2b8b7edbc969b3619f344e5cad9dba499ea2be60fdb6991"}
Apr 22 19:15:01.349390 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:01.349355 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sn5zn/must-gather-ch7gv" event={"ID":"aaa4db0b-be35-404f-86d3-7cc3d428dde4","Type":"ContainerStarted","Data":"ef6eafef9e99468d2020cde3075f663ff43415d662c07bf4b56d3cf4c380613d"}
Apr 22 19:15:01.363841 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:01.363780 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sn5zn/must-gather-ch7gv" podStartSLOduration=2.427213385 podStartE2EDuration="3.363762989s" podCreationTimestamp="2026-04-22 19:14:58 +0000 UTC" firstStartedPulling="2026-04-22 19:14:59.280091076 +0000 UTC m=+1701.058924552" lastFinishedPulling="2026-04-22 19:15:00.216640693 +0000 UTC m=+1701.995474156" observedRunningTime="2026-04-22 19:15:01.361857755 +0000 UTC m=+1703.140691240" watchObservedRunningTime="2026-04-22 19:15:01.363762989 +0000 UTC m=+1703.142596476"
Apr 22 19:15:01.732708 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:01.732678 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xtfvj_55779771-0d58-4918-9769-95bf4525dc2d/global-pull-secret-syncer/0.log"
Apr 22 19:15:01.821261 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:01.821229 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-x8klk_284cb099-6d1a-47cf-883b-5ada5cd11fa0/konnectivity-agent/0.log"
Apr 22 19:15:01.841568 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:01.841540 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-234.ec2.internal_cd060d06ccaccd228d0dac1aa712346b/haproxy/0.log"
Apr 22 19:15:06.151468 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:06.151350 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-lzlzq_1c636de5-f994-42f6-9a67-90a640c701be/kuadrant-console-plugin/0.log"
Apr 22 19:15:06.215384 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:06.215353 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-lbcsn_ab2a243c-2f6c-4c80-8d10-46b9fd67b189/limitador/0.log"
Apr 22 19:15:07.163524 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:07.163493 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5833426b-676a-4d5e-8e8b-ead0f3e719f9/alertmanager/0.log"
Apr 22 19:15:07.188542 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:07.188515 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5833426b-676a-4d5e-8e8b-ead0f3e719f9/config-reloader/0.log"
Apr 22 19:15:07.210010 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:07.209901 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5833426b-676a-4d5e-8e8b-ead0f3e719f9/kube-rbac-proxy-web/0.log"
Apr 22 19:15:07.238311 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:07.238238 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5833426b-676a-4d5e-8e8b-ead0f3e719f9/kube-rbac-proxy/0.log"
Apr 22 19:15:07.258136 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:07.258108 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5833426b-676a-4d5e-8e8b-ead0f3e719f9/kube-rbac-proxy-metric/0.log"
Apr 22 19:15:07.278419 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:07.278387 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5833426b-676a-4d5e-8e8b-ead0f3e719f9/prom-label-proxy/0.log"
Apr 22 19:15:07.300022 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:07.299991 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5833426b-676a-4d5e-8e8b-ead0f3e719f9/init-config-reloader/0.log"
Apr 22 19:15:07.343000 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:07.342971 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-djlpr_574f89c9-2eec-4ad6-a1ef-c9b8109907a5/cluster-monitoring-operator/0.log"
Apr 22 19:15:07.457185 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:07.457093 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-vmtgp_f01825ba-3046-4f60-9037-77b19a52845f/monitoring-plugin/0.log"
Apr 22 19:15:07.673037 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:07.673011 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zlrvw_e91496df-a9d7-48a5-a22a-be7cd298ca98/node-exporter/0.log"
Apr 22 19:15:07.694861 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:07.694780 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zlrvw_e91496df-a9d7-48a5-a22a-be7cd298ca98/kube-rbac-proxy/0.log"
Apr 22 19:15:07.714843 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:07.714698 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zlrvw_e91496df-a9d7-48a5-a22a-be7cd298ca98/init-textfile/0.log"
Apr 22 19:15:07.745910 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:07.745881 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-g2fbc_5721f580-f8b3-4921-9cec-388fc89208c3/kube-rbac-proxy-main/0.log"
Apr 22 19:15:07.767637 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:07.767615 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-g2fbc_5721f580-f8b3-4921-9cec-388fc89208c3/kube-rbac-proxy-self/0.log"
Apr 22 19:15:07.792866 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:07.792839 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-g2fbc_5721f580-f8b3-4921-9cec-388fc89208c3/openshift-state-metrics/0.log"
Apr 22 19:15:08.047264 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:08.047113 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-htlw7_2dbd4d57-a75b-405a-bc03-338b3a2a721c/prometheus-operator/0.log"
Apr 22 19:15:08.068800 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:08.068772 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-htlw7_2dbd4d57-a75b-405a-bc03-338b3a2a721c/kube-rbac-proxy/0.log"
Apr 22 19:15:09.915765 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:09.915731 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4jzwx_4871a747-c041-430f-9c1a-3ab2ce6d0156/console-operator/1.log"
Apr 22 19:15:09.921837 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:09.921812 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4jzwx_4871a747-c041-430f-9c1a-3ab2ce6d0156/console-operator/2.log"
Apr 22 19:15:10.335982 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.335877 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66b5c98d57-hhw22_f5cadbec-20f2-4935-a4e1-eebd6962ef43/console/0.log"
Apr 22 19:15:10.628773 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.628736 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"]
Apr 22 19:15:10.634246 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.634222 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:10.639526 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.639501 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"]
Apr 22 19:15:10.676514 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.676477 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97kmz\" (UniqueName: \"kubernetes.io/projected/c9c880bb-dd2d-40e3-ab41-7cf16d3849d5-kube-api-access-97kmz\") pod \"perf-node-gather-daemonset-b4g2v\" (UID: \"c9c880bb-dd2d-40e3-ab41-7cf16d3849d5\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:10.676658 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.676571 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c9c880bb-dd2d-40e3-ab41-7cf16d3849d5-podres\") pod \"perf-node-gather-daemonset-b4g2v\" (UID: \"c9c880bb-dd2d-40e3-ab41-7cf16d3849d5\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:10.676658 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.676628 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c9c880bb-dd2d-40e3-ab41-7cf16d3849d5-lib-modules\") pod \"perf-node-gather-daemonset-b4g2v\" (UID: \"c9c880bb-dd2d-40e3-ab41-7cf16d3849d5\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:10.676743 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.676672 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c9c880bb-dd2d-40e3-ab41-7cf16d3849d5-sys\") pod \"perf-node-gather-daemonset-b4g2v\" (UID: \"c9c880bb-dd2d-40e3-ab41-7cf16d3849d5\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:10.676743 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.676701 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c9c880bb-dd2d-40e3-ab41-7cf16d3849d5-proc\") pod \"perf-node-gather-daemonset-b4g2v\" (UID: \"c9c880bb-dd2d-40e3-ab41-7cf16d3849d5\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:10.777468 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.777432 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97kmz\" (UniqueName: \"kubernetes.io/projected/c9c880bb-dd2d-40e3-ab41-7cf16d3849d5-kube-api-access-97kmz\") pod \"perf-node-gather-daemonset-b4g2v\" (UID: \"c9c880bb-dd2d-40e3-ab41-7cf16d3849d5\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:10.777468 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.777475 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c9c880bb-dd2d-40e3-ab41-7cf16d3849d5-podres\") pod \"perf-node-gather-daemonset-b4g2v\" (UID: \"c9c880bb-dd2d-40e3-ab41-7cf16d3849d5\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:10.777676 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.777572 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c9c880bb-dd2d-40e3-ab41-7cf16d3849d5-podres\") pod \"perf-node-gather-daemonset-b4g2v\" (UID: \"c9c880bb-dd2d-40e3-ab41-7cf16d3849d5\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:10.777676 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.777586 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c9c880bb-dd2d-40e3-ab41-7cf16d3849d5-lib-modules\") pod \"perf-node-gather-daemonset-b4g2v\" (UID: \"c9c880bb-dd2d-40e3-ab41-7cf16d3849d5\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:10.777676 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.777642 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c9c880bb-dd2d-40e3-ab41-7cf16d3849d5-lib-modules\") pod \"perf-node-gather-daemonset-b4g2v\" (UID: \"c9c880bb-dd2d-40e3-ab41-7cf16d3849d5\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:10.777676 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.777653 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c9c880bb-dd2d-40e3-ab41-7cf16d3849d5-sys\") pod \"perf-node-gather-daemonset-b4g2v\" (UID: \"c9c880bb-dd2d-40e3-ab41-7cf16d3849d5\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:10.777676 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.777675 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c9c880bb-dd2d-40e3-ab41-7cf16d3849d5-proc\") pod \"perf-node-gather-daemonset-b4g2v\" (UID: \"c9c880bb-dd2d-40e3-ab41-7cf16d3849d5\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:10.777876 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.777736 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c9c880bb-dd2d-40e3-ab41-7cf16d3849d5-sys\") pod \"perf-node-gather-daemonset-b4g2v\" (UID: \"c9c880bb-dd2d-40e3-ab41-7cf16d3849d5\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:10.777876 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.777778 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c9c880bb-dd2d-40e3-ab41-7cf16d3849d5-proc\") pod \"perf-node-gather-daemonset-b4g2v\" (UID: \"c9c880bb-dd2d-40e3-ab41-7cf16d3849d5\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:10.785040 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.785003 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97kmz\" (UniqueName: \"kubernetes.io/projected/c9c880bb-dd2d-40e3-ab41-7cf16d3849d5-kube-api-access-97kmz\") pod \"perf-node-gather-daemonset-b4g2v\" (UID: \"c9c880bb-dd2d-40e3-ab41-7cf16d3849d5\") " pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:10.949597 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:10.949506 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:11.095283 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:11.095251 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"]
Apr 22 19:15:11.096517 ip-10-0-130-234 kubenswrapper[2566]: W0422 19:15:11.096483 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc9c880bb_dd2d_40e3_ab41_7cf16d3849d5.slice/crio-d02d37330d0f256ef41f342bc310ff1162c2a2bf6ea78c1c5bbe7b51a75fa6a9 WatchSource:0}: Error finding container d02d37330d0f256ef41f342bc310ff1162c2a2bf6ea78c1c5bbe7b51a75fa6a9: Status 404 returned error can't find the container with id d02d37330d0f256ef41f342bc310ff1162c2a2bf6ea78c1c5bbe7b51a75fa6a9
Apr 22 19:15:11.401320 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:11.401240 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v" event={"ID":"c9c880bb-dd2d-40e3-ab41-7cf16d3849d5","Type":"ContainerStarted","Data":"f82781523ff70d5e13a0499fba3ae5affd401bce72d4fb5f91549ee1eb1352e0"}
Apr 22 19:15:11.401320 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:11.401287 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v" event={"ID":"c9c880bb-dd2d-40e3-ab41-7cf16d3849d5","Type":"ContainerStarted","Data":"d02d37330d0f256ef41f342bc310ff1162c2a2bf6ea78c1c5bbe7b51a75fa6a9"}
Apr 22 19:15:11.401667 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:11.401630 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:11.415728 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:11.415679 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v" podStartSLOduration=1.4156649940000001 podStartE2EDuration="1.415664994s" podCreationTimestamp="2026-04-22 19:15:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:15:11.41422837 +0000 UTC m=+1713.193061855" watchObservedRunningTime="2026-04-22 19:15:11.415664994 +0000 UTC m=+1713.194498480"
Apr 22 19:15:11.521896 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:11.521872 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6czt4_8d6d09bf-c484-4e3a-8f6f-43ddb670c369/dns/0.log"
Apr 22 19:15:11.541891 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:11.541868 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6czt4_8d6d09bf-c484-4e3a-8f6f-43ddb670c369/kube-rbac-proxy/0.log"
Apr 22 19:15:11.644494 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:11.644468 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6xjch_e92d5c0b-3e38-4729-bb74-4ca59dfb7f0f/dns-node-resolver/0.log"
Apr 22 19:15:12.167447 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:12.167406 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wj7nl_462b5fca-7ee5-444f-af12-38942635aa2f/node-ca/0.log"
Apr 22 19:15:13.412060 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:13.412009 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zkddm_9a00536f-bef4-4ca5-bc76-1956a1edb1d1/serve-healthcheck-canary/0.log"
Apr 22 19:15:13.799659 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:13.799584 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-zvgcc_81a18479-876e-4704-9f10-9b3d459cce34/insights-operator/1.log"
Apr 22 19:15:13.799804 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:13.799792 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-zvgcc_81a18479-876e-4704-9f10-9b3d459cce34/insights-operator/0.log"
Apr 22 19:15:13.944349 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:13.944321 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rpswj_de824459-5c69-45bc-b7b1-9320eba8e257/kube-rbac-proxy/0.log"
Apr 22 19:15:13.968315 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:13.968282 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rpswj_de824459-5c69-45bc-b7b1-9320eba8e257/exporter/0.log"
Apr 22 19:15:13.989395 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:13.989355 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rpswj_de824459-5c69-45bc-b7b1-9320eba8e257/extractor/0.log"
Apr 22 19:15:16.495289 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:16.495254 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6869d4659b-qm862_e13e10c4-2946-45e5-827f-7aea7b38e767/manager/0.log"
Apr 22 19:15:17.416543 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:17.416513 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-sn5zn/perf-node-gather-daemonset-b4g2v"
Apr 22 19:15:21.659689 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:21.659632 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-fkllm_59a77f39-9d49-4aba-99e0-0e3df93fec7e/migrator/0.log"
Apr 22 19:15:21.679078 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:21.679051 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-fkllm_59a77f39-9d49-4aba-99e0-0e3df93fec7e/graceful-termination/0.log"
Apr 22 19:15:23.227321 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:23.227289 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bt425_ffe6e393-6faa-47fe-a818-c1ffc6dda35c/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:15:23.247747 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:23.247715 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bt425_ffe6e393-6faa-47fe-a818-c1ffc6dda35c/egress-router-binary-copy/0.log"
Apr 22 19:15:23.266674 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:23.266648 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bt425_ffe6e393-6faa-47fe-a818-c1ffc6dda35c/cni-plugins/0.log"
Apr 22 19:15:23.290303 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:23.290281 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bt425_ffe6e393-6faa-47fe-a818-c1ffc6dda35c/bond-cni-plugin/0.log"
Apr 22 19:15:23.319105 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:23.319080 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bt425_ffe6e393-6faa-47fe-a818-c1ffc6dda35c/routeoverride-cni/0.log"
Apr 22 19:15:23.339636 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:23.339617 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bt425_ffe6e393-6faa-47fe-a818-c1ffc6dda35c/whereabouts-cni-bincopy/0.log"
Apr 22 19:15:23.361279 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:23.361259 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bt425_ffe6e393-6faa-47fe-a818-c1ffc6dda35c/whereabouts-cni/0.log"
Apr 22 19:15:23.395142 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:23.395121 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gw5cp_3952e485-121f-46cf-820b-24e6a13ec3bc/kube-multus/0.log"
Apr 22 19:15:23.471295 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:23.471267 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7zj27_66e2806a-82a6-444a-a19a-13641a426bed/network-metrics-daemon/0.log"
Apr 22 19:15:23.535289 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:23.535221 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7zj27_66e2806a-82a6-444a-a19a-13641a426bed/kube-rbac-proxy/0.log"
Apr 22 19:15:24.739143 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:24.739118 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8qfk_3c183a68-147a-4cc9-8e01-6e6df3a46a7b/ovn-controller/0.log"
Apr 22 19:15:24.783604 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:24.783577 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8qfk_3c183a68-147a-4cc9-8e01-6e6df3a46a7b/ovn-acl-logging/0.log"
Apr 22 19:15:24.816134 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:24.816110 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8qfk_3c183a68-147a-4cc9-8e01-6e6df3a46a7b/kube-rbac-proxy-node/0.log"
Apr 22 19:15:24.857339 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:24.857314 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8qfk_3c183a68-147a-4cc9-8e01-6e6df3a46a7b/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:15:24.896623 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:24.896599 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8qfk_3c183a68-147a-4cc9-8e01-6e6df3a46a7b/northd/0.log"
Apr 22 19:15:24.936520 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:24.936497 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8qfk_3c183a68-147a-4cc9-8e01-6e6df3a46a7b/nbdb/0.log"
Apr 22 19:15:24.973416 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:24.973389 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8qfk_3c183a68-147a-4cc9-8e01-6e6df3a46a7b/sbdb/0.log"
Apr 22 19:15:25.099584 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:25.099517 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8qfk_3c183a68-147a-4cc9-8e01-6e6df3a46a7b/ovnkube-controller/0.log"
Apr 22 19:15:26.296528 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:26.296492 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-v9ml5_4a6e19f9-6606-4820-8fc6-604159ab398c/check-endpoints/0.log"
Apr 22 19:15:26.319271 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:26.319240 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-6cpjm_85d15dde-b7d7-4309-980e-49ce8977c219/network-check-target-container/0.log"
Apr 22 19:15:27.279859 ip-10-0-130-234 kubenswrapper[2566]: I0422 19:15:27.279783 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-94k7v_2780ff15-4023-4008-904e-d48820d8e1d9/iptables-alerter/0.log"