Apr 16 18:27:45.459180 ip-10-0-139-33 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 18:27:45.459194 ip-10-0-139-33 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 18:27:45.459203 ip-10-0-139-33 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 18:27:45.459517 ip-10-0-139-33 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 18:27:55.541213 ip-10-0-139-33 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 18:27:55.541231 ip-10-0-139-33 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 54ccb382b44a440ba8266f3e1574051b --
Apr 16 18:30:23.090478 ip-10-0-139-33 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:30:23.560411 ip-10-0-139-33 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:23.560411 ip-10-0-139-33 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:30:23.560411 ip-10-0-139-33 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:23.560411 ip-10-0-139-33 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:30:23.560411 ip-10-0-139-33 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:23.562584 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.562493 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:30:23.567272 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567243 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:23.567272 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567267 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:23.567272 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567271 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:23.567272 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567274 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:23.567272 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567278 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:23.567272 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567281 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567283 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567286 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567290 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567292 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567295 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567297 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567300 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567302 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567305 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567307 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567310 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567313 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567315 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567318 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567321 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567323 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567326 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567328 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567330 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:23.567478 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567336 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567338 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567341 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567343 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567346 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567348 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567351 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567354 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567356 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567361 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567364 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567367 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567370 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567373 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567376 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567379 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567382 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567384 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567387 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:23.567937 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567391 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567395 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567397 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567400 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567403 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567406 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567409 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567412 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567414 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567417 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567420 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567422 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567425 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567427 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567430 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567432 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567435 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567437 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567440 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567443 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:23.568409 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567446 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567448 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567451 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567453 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567456 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567459 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567462 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567465 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567467 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567470 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567472 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567475 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567478 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567480 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567489 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567492 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567495 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567497 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567500 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567502 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:23.568899 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567505 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.567507 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568782 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568789 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568792 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568794 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568797 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568800 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568803 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568806 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568810 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568813 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568816 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568818 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568822 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568825 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568828 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568831 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568833 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568836 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:23.569513 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568839 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568841 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568846 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568849 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568852 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568855 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568863 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568866 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568868 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568871 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568873 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568876 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568878 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568881 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568883 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568886 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568888 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568891 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568893 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:23.569983 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568896 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568898 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568901 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568903 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568906 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568909 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568912 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568915 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568918 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568920 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568923 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568925 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568929 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568932 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568934 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568937 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568940 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568944 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568948 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568950 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:23.570462 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568959 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568962 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568964 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568967 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568969 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568972 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568975 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568977 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568980 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568983 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568985 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568988 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568990 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568993 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568995 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.568998 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.569000 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.569003 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.569005 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:23.570934 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.569008 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.569011 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.569014 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.569017 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.569020 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.569022 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.569025 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.569028 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.569030 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.569032 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570827 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570840 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570850 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570855 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570865 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570868 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570872 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570877 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570880 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570883 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570886 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:30:23.571421 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570889 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570892 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570895 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570898 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570901 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570904 2578 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570906 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570909 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570916 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570918 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570921 2578 flags.go:64] FLAG: --config-dir=""
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570924 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570927 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570931 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570934 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570938 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570942 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570945 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570948 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570951 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570954 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570957 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570961 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570964 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570966 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:30:23.571909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570969 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570983 2578 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570986 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570993 2578 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570996 2578 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.570999 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571002 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571005 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571010 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571012 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571015 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571018 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571021 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571024 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571026 2578 flags.go:64] FLAG:
--experimental-allocatable-ignore-eviction="false" Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571029 2578 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571032 2578 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571035 2578 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571037 2578 flags.go:64] FLAG: --feature-gates="" Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571042 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571045 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571048 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571051 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571054 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571057 2578 flags.go:64] FLAG: --help="false" Apr 16 18:30:23.572535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571060 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-139-33.ec2.internal" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571063 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571066 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571068 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 
16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571072 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571075 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571077 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571080 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571083 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571092 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571095 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571098 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571101 2578 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571103 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571106 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571109 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571112 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571114 2578 flags.go:64] FLAG: --lock-file="" Apr 16 18:30:23.573131 
ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571117 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571119 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571122 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571128 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571130 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571133 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:30:23.573131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571136 2578 flags.go:64] FLAG: --logging-format="text" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571139 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571142 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571144 2578 flags.go:64] FLAG: --manifest-url="" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571147 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571151 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571155 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571158 2578 flags.go:64] FLAG: --max-pods="110" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571161 2578 flags.go:64] FLAG: 
--maximum-dead-containers="-1" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571164 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571167 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571170 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571183 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571187 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571189 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571197 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571200 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571203 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571214 2578 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571217 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571222 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571225 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: 
I0416 18:30:23.571228 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571231 2578 flags.go:64] FLAG: --port="10250" Apr 16 18:30:23.573722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571234 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571236 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-017e33092192ace27" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571240 2578 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571242 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571245 2578 flags.go:64] FLAG: --register-node="true" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571248 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571250 2578 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571254 2578 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571256 2578 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571259 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571262 2578 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571267 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571270 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571273 2578 
flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571278 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571280 2578 flags.go:64] FLAG: --runonce="false" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571283 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571286 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571289 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571291 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571294 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571297 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571300 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571303 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571305 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571308 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:30:23.574300 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571311 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571319 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 
18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571322 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571325 2578 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571328 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571333 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571336 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571338 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571344 2578 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571347 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571350 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571352 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571355 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571358 2578 flags.go:64] FLAG: --v="2" Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571362 2578 flags.go:64] FLAG: --version="false" Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571366 2578 flags.go:64] FLAG: --vmodule="" Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571370 2578 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571373 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571492 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571496 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571501 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571504 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571506 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:30:23.574941 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571509 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571511 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571514 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571516 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571519 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571521 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 
18:30:23.571524 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571526 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571529 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571531 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571534 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571542 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571545 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571547 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571550 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571552 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571555 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571557 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571576 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:30:23.575498 ip-10-0-139-33 
kubenswrapper[2578]: W0416 18:30:23.571580 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:30:23.575498 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571583 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571585 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571588 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571591 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571593 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571595 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571598 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571601 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571603 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571607 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571610 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571612 2578 feature_gate.go:328] unrecognized feature 
gate: NewOLMCatalogdAPIV1Metas Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571615 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571617 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571620 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571622 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571625 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571627 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571630 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571632 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:30:23.576024 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571637 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571640 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571643 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571645 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571649 2578 feature_gate.go:328] 
unrecognized feature gate: SignatureStores Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571651 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571654 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571656 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571659 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571661 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571664 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571666 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571669 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571672 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571674 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571678 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571682 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571685 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571688 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:30:23.576546 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571692 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571695 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571699 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571702 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571704 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571707 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571710 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571712 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571714 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: 
W0416 18:30:23.571717 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571720 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571722 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571725 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571729 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571731 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571734 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571736 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571739 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571742 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571744 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:23.577012 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571747 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:23.577550 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.571750 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:23.577550 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.571754 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:30:23.578729 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.578710 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:30:23.578762 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.578730 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:30:23.578796 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578786 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:23.578796 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578792 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:23.578796 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578795 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578810 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578813 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578816 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578819 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578822 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578824 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578827 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578830 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578832 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578836 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578838 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578841 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578843 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578846 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578848 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578851 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578854 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578856 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:23.578885 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578860 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578865 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578867 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578870 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578874 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578877 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578879 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578882 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578884 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578887 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578890 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578892 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578895 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578898 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578901 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578903 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578906 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578909 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578911 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:23.579345 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578913 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578916 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578918 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578921 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578923 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578926 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578928 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578931 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578933 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578936 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578938 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578940 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578943 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578945 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578948 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578951 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578954 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578956 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578959 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:23.579778 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578962 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578964 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578967 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578969 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578972 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578974 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578977 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578979 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578981 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578984 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578986 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578989 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578991 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578994 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578996 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.578998 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579001 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579003 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579006 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579008 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579011 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:23.580342 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579013 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:23.580827 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579016 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:23.580827 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579020 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:23.580827 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579024 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:23.580827 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579026 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:23.580827 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579029 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:23.580827 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.579034 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:30:23.580827 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579186 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:23.580827 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579193 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:23.580827 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579196 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:23.580827 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579199 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:23.580827 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579203 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:23.580827 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579205 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:23.580827 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579208 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:23.580827 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579211 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:23.580827 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579213 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579216 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579218 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579221 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579223 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579226 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579228 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579231 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579233 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579236 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579238 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579241 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579243 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579245 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579248 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579251 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579253 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579256 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579258 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579261 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:23.581248 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579263 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579266 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579268 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579270 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579274 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579276 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579279 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579282 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579284 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579287 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579289 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579292 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579295 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579297 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579300 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579302 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579305 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579307 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579309 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:23.581758 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579312 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579314 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579317 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579319 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579321 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579324 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579326 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579329 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579331 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579334 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579336 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579339 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579341 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579343 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579346 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579348 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579352 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579354 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579358 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579361 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:23.582234 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579363 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579365 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579368 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579376 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579379 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579383 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579386 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579389 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579392 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579395 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579398 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579401 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579404 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579407 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579409 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579412 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579414 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579418 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:23.582715 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:23.579421 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:23.583150 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.579425 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:30:23.583150 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.580205 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:30:23.583150 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.582647 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:30:23.584899 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.584887 2578 server.go:1019] "Starting client certificate rotation"
Apr 16 18:30:23.585030 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.585014 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:30:23.586149 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.586133 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:30:23.612411 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.612393 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:30:23.615132 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.615109 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:30:23.633129 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.633111 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:30:23.638604 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.638588 2578 log.go:25] "Validated CRI v1 image API"
Apr 16 18:30:23.641065 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.641043 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:30:23.644382 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.644361 2578 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7c190ccf-4801-47ab-a246-bb410633e4f1:/dev/nvme0n1p4 a40c464c-79b6-479f-bea2-702465fc9c64:/dev/nvme0n1p3]
Apr 16 18:30:23.644465 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.644381 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:30:23.645636 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.645619 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:30:23.649956 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.649845 2578 manager.go:217] Machine: {Timestamp:2026-04-16 18:30:23.647940537 +0000 UTC m=+0.435276809 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3199845 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec283d344e9a710278da710a0b79a6bb SystemUUID:ec283d34-4e9a-7102-78da-710a0b79a6bb BootID:54ccb382-b44a-440b-a826-6f3e1574051b Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e2:2d:5f:fb:03 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e2:2d:5f:fb:03 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1e:27:6f:0f:af:bd Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:30:23.649956 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.649952 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:30:23.650062 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.650029 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:30:23.652708 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.652684 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:30:23.652840 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.652710 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-33.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:30:23.652884 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.652849 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:30:23.652884 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.652857 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:30:23.652884 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.652870 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:30:23.653853 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.653842 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:30:23.655875 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.655865 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:30:23.655979 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.655970 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:30:23.658544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.658534 2578 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:30:23.658579 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.658554 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 18:30:23.658579 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.658566 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 18:30:23.658579 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.658575 2578 kubelet.go:397] "Adding apiserver pod source"
Apr 16 18:30:23.658691 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.658583 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 18:30:23.659885 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.659873 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:30:23.659934 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.659891 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:30:23.661445 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.661429 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9j7s4"
Apr 16 18:30:23.662922 ip-10-0-139-33
kubenswrapper[2578]: I0416 18:30:23.662902 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:30:23.665921 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.665907 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:30:23.667807 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.667793 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:30:23.667807 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.667810 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:30:23.667906 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.667816 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:30:23.667906 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.667822 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:30:23.667906 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.667827 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:30:23.667906 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.667841 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:30:23.667906 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.667847 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:30:23.667906 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.667852 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:30:23.667906 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.667858 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:30:23.667906 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.667864 2578 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/configmap" Apr 16 18:30:23.667906 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.667873 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:30:23.667906 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.667881 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:30:23.669129 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.669115 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9j7s4" Apr 16 18:30:23.669546 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:23.669524 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:30:23.669546 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:23.669524 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-33.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:30:23.669639 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.669628 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:30:23.669669 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.669640 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:30:23.673203 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.673189 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:30:23.673275 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.673222 2578 server.go:1295] "Started kubelet" Apr 16 18:30:23.673367 ip-10-0-139-33 kubenswrapper[2578]: I0416 
18:30:23.673314 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:30:23.673430 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.673354 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:30:23.673430 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.673421 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:30:23.677597 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.677563 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:30:23.677619 ip-10-0-139-33 systemd[1]: Started Kubernetes Kubelet. Apr 16 18:30:23.679388 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.679370 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:30:23.686588 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.686573 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:30:23.686688 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.686590 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:30:23.687190 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.687162 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-33.ec2.internal" not found Apr 16 18:30:23.687327 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.687310 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:30:23.687390 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.687332 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:30:23.687437 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.687433 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:30:23.687510 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.687493 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:30:23.687510 
ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.687504 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:30:23.687653 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:23.687576 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-33.ec2.internal\" not found" Apr 16 18:30:23.687852 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.687833 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:30:23.687852 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.687852 2578 factory.go:55] Registering systemd factory Apr 16 18:30:23.688025 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.687861 2578 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:30:23.688621 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.688602 2578 factory.go:153] Registering CRI-O factory Apr 16 18:30:23.688621 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.688625 2578 factory.go:223] Registration of the crio container factory successfully Apr 16 18:30:23.688768 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.688647 2578 factory.go:103] Registering Raw factory Apr 16 18:30:23.688768 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.688661 2578 manager.go:1196] Started watching for new ooms in manager Apr 16 18:30:23.688768 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.688676 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:23.689832 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.689812 2578 manager.go:319] Starting recovery of all containers Apr 16 18:30:23.689995 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:23.689971 2578 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:30:23.692314 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:23.692292 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-33.ec2.internal\" not found" node="ip-10-0-139-33.ec2.internal" Apr 16 18:30:23.700072 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.699974 2578 manager.go:324] Recovery completed Apr 16 18:30:23.702619 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.702605 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-33.ec2.internal" not found Apr 16 18:30:23.704340 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.704327 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:23.707101 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.707088 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:23.707169 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.707111 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:23.707169 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.707122 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:23.707565 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.707550 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:30:23.707634 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.707564 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:30:23.707634 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.707581 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:30:23.709907 ip-10-0-139-33 kubenswrapper[2578]: 
I0416 18:30:23.709896 2578 policy_none.go:49] "None policy: Start" Apr 16 18:30:23.709943 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.709912 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:30:23.709943 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.709921 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:30:23.752739 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.752720 2578 manager.go:341] "Starting Device Plugin manager" Apr 16 18:30:23.760419 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:23.752760 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:30:23.760419 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.752772 2578 server.go:85] "Starting device plugin registration server" Apr 16 18:30:23.760419 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.752980 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:30:23.760419 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.752992 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:30:23.760419 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.753085 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:30:23.760419 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.753157 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:30:23.760419 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.753168 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:30:23.760419 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:23.753654 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 18:30:23.760419 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:23.753692 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-33.ec2.internal\" not found" Apr 16 18:30:23.762087 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.762073 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-33.ec2.internal" not found Apr 16 18:30:23.835009 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.834930 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:30:23.836146 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.836128 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 18:30:23.836246 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.836161 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:30:23.836246 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.836213 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 18:30:23.836246 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.836223 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:30:23.836384 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:23.836265 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:30:23.839391 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.839374 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:23.853298 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.853287 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:23.854050 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.854034 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:23.854119 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.854062 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:23.854119 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.854071 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:23.854119 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.854094 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-33.ec2.internal" Apr 16 18:30:23.871706 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.871684 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-33.ec2.internal" Apr 16 18:30:23.871706 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:23.871706 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-33.ec2.internal\": node \"ip-10-0-139-33.ec2.internal\" not found" Apr 16 18:30:23.884861 
ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:23.884840 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-33.ec2.internal\" not found" Apr 16 18:30:23.937231 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.937194 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-33.ec2.internal"] Apr 16 18:30:23.937299 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.937270 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:23.938054 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.938039 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:23.938119 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.938064 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:23.938119 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.938074 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:23.939876 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.939864 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:23.940019 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.940004 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal" Apr 16 18:30:23.940067 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.940048 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:23.940546 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.940531 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:23.940546 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.940529 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:23.940664 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.940556 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:23.940664 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.940563 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:23.940664 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.940569 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:23.940664 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.940575 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:23.942445 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.942431 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-33.ec2.internal" Apr 16 18:30:23.942527 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.942453 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:23.943101 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.943083 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:23.943189 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.943117 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:23.943189 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.943134 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:23.965690 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:23.965670 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-33.ec2.internal\" not found" node="ip-10-0-139-33.ec2.internal" Apr 16 18:30:23.969955 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:23.969939 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-33.ec2.internal\" not found" node="ip-10-0-139-33.ec2.internal" Apr 16 18:30:23.985047 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:23.985028 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-33.ec2.internal\" not found" Apr 16 18:30:23.989528 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.989513 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c53f305fe96e7a58062f6a77dba1ae45-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal\" (UID: \"c53f305fe96e7a58062f6a77dba1ae45\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal" Apr 16 18:30:23.989604 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.989543 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c53f305fe96e7a58062f6a77dba1ae45-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal\" (UID: \"c53f305fe96e7a58062f6a77dba1ae45\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal" Apr 16 18:30:23.989604 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:23.989572 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/27e141a8b2c2991dacebb3be05ed01ab-config\") pod \"kube-apiserver-proxy-ip-10-0-139-33.ec2.internal\" (UID: \"27e141a8b2c2991dacebb3be05ed01ab\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-33.ec2.internal" Apr 16 18:30:24.086085 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:24.086027 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-33.ec2.internal\" not found" Apr 16 18:30:24.090394 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.090377 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c53f305fe96e7a58062f6a77dba1ae45-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal\" (UID: \"c53f305fe96e7a58062f6a77dba1ae45\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal" Apr 16 18:30:24.090444 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.090403 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/27e141a8b2c2991dacebb3be05ed01ab-config\") pod \"kube-apiserver-proxy-ip-10-0-139-33.ec2.internal\" (UID: \"27e141a8b2c2991dacebb3be05ed01ab\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-33.ec2.internal" Apr 16 18:30:24.090444 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.090419 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c53f305fe96e7a58062f6a77dba1ae45-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal\" (UID: \"c53f305fe96e7a58062f6a77dba1ae45\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal" Apr 16 18:30:24.090507 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.090461 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c53f305fe96e7a58062f6a77dba1ae45-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal\" (UID: \"c53f305fe96e7a58062f6a77dba1ae45\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal" Apr 16 18:30:24.090507 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.090477 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c53f305fe96e7a58062f6a77dba1ae45-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal\" (UID: \"c53f305fe96e7a58062f6a77dba1ae45\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal" Apr 16 18:30:24.090507 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.090486 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/27e141a8b2c2991dacebb3be05ed01ab-config\") pod \"kube-apiserver-proxy-ip-10-0-139-33.ec2.internal\" (UID: \"27e141a8b2c2991dacebb3be05ed01ab\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-139-33.ec2.internal" Apr 16 18:30:24.186721 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:24.186689 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-33.ec2.internal\" not found" Apr 16 18:30:24.270134 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.270100 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal" Apr 16 18:30:24.273475 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.273459 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-33.ec2.internal" Apr 16 18:30:24.287750 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:24.287732 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-33.ec2.internal\" not found" Apr 16 18:30:24.388200 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:24.388109 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-33.ec2.internal\" not found" Apr 16 18:30:24.488572 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:24.488549 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-33.ec2.internal\" not found" Apr 16 18:30:24.544317 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.544298 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:24.583780 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.583761 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 18:30:24.584274 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.583865 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" 
err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:30:24.584274 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.583876 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:30:24.584274 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.583877 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:30:24.588972 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:24.588957 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-33.ec2.internal\" not found" Apr 16 18:30:24.670704 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.670631 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:25:23 +0000 UTC" deadline="2027-10-16 03:44:29.192850713 +0000 UTC" Apr 16 18:30:24.670704 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.670657 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13137h14m4.522195456s" Apr 16 18:30:24.686724 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.686708 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 18:30:24.689824 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:24.689807 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-33.ec2.internal\" 
not found" Apr 16 18:30:24.704368 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.704344 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:30:24.733253 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.733237 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-h5lpm" Apr 16 18:30:24.739471 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.739450 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-h5lpm" Apr 16 18:30:24.790307 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:24.790277 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-33.ec2.internal\" not found" Apr 16 18:30:24.826882 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:24.826853 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27e141a8b2c2991dacebb3be05ed01ab.slice/crio-d9c88d9e89a77826bbed6865c6b799c873129f0be340e141c67c708c0a05fa73 WatchSource:0}: Error finding container d9c88d9e89a77826bbed6865c6b799c873129f0be340e141c67c708c0a05fa73: Status 404 returned error can't find the container with id d9c88d9e89a77826bbed6865c6b799c873129f0be340e141c67c708c0a05fa73 Apr 16 18:30:24.827276 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:24.827239 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc53f305fe96e7a58062f6a77dba1ae45.slice/crio-1b5b7c91fd19d5b17ceb7dbecae4fd69d61f7efb6ecc81b773b98d33097de5a6 WatchSource:0}: Error finding container 1b5b7c91fd19d5b17ceb7dbecae4fd69d61f7efb6ecc81b773b98d33097de5a6: Status 404 returned error can't find the container with id 
1b5b7c91fd19d5b17ceb7dbecae4fd69d61f7efb6ecc81b773b98d33097de5a6 Apr 16 18:30:24.831759 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.831744 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:30:24.838910 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.838873 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-33.ec2.internal" event={"ID":"27e141a8b2c2991dacebb3be05ed01ab","Type":"ContainerStarted","Data":"d9c88d9e89a77826bbed6865c6b799c873129f0be340e141c67c708c0a05fa73"} Apr 16 18:30:24.839769 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:24.839752 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal" event={"ID":"c53f305fe96e7a58062f6a77dba1ae45","Type":"ContainerStarted","Data":"1b5b7c91fd19d5b17ceb7dbecae4fd69d61f7efb6ecc81b773b98d33097de5a6"} Apr 16 18:30:24.890977 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:24.890952 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-33.ec2.internal\" not found" Apr 16 18:30:24.991472 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:24.991416 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-33.ec2.internal\" not found" Apr 16 18:30:25.091919 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:25.091891 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-33.ec2.internal\" not found" Apr 16 18:30:25.192700 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:25.192667 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-33.ec2.internal\" not found" Apr 16 18:30:25.260712 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.260646 2578 reflector.go:430] "Caches populated" type="*v1.Node" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:25.287960 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.287933 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-33.ec2.internal" Apr 16 18:30:25.299007 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.298983 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:30:25.300623 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.300441 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal" Apr 16 18:30:25.309672 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.309571 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:30:25.447814 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.447787 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:25.558970 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.558885 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:25.659748 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.659711 2578 apiserver.go:52] "Watching apiserver" Apr 16 18:30:25.669921 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.669889 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 18:30:25.670793 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.670757 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/iptables-alerter-npwmv","openshift-ovn-kubernetes/ovnkube-node-fdzzg","kube-system/konnectivity-agent-gcdwk","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8","openshift-cluster-node-tuning-operator/tuned-fctsh","openshift-image-registry/node-ca-fj55k","openshift-multus/multus-562j6","openshift-multus/multus-additional-cni-plugins-nv72w","openshift-multus/network-metrics-daemon-f4smb","kube-system/kube-apiserver-proxy-ip-10-0-139-33.ec2.internal","openshift-dns/node-resolver-hkqvd","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal","openshift-network-diagnostics/network-check-target-rp72n"] Apr 16 18:30:25.672318 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.672297 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-562j6" Apr 16 18:30:25.673821 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.673797 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.675311 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.675112 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:30:25.675311 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.675158 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:30:25.675311 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.675245 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:30:25.675499 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.675331 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-j78md\"" Apr 16 18:30:25.675542 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.675501 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:30:25.676820 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.676800 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:30:25.677013 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.676991 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-gcdwk" Apr 16 18:30:25.677765 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.677733 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-z7h8v\"" Apr 16 18:30:25.677765 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.677751 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:30:25.677913 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.677784 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:30:25.677913 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.677741 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:30:25.678058 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.677921 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:30:25.678058 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.677967 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:30:25.678687 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.678445 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8" Apr 16 18:30:25.678687 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.678580 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fctsh" Apr 16 18:30:25.679848 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.679312 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 18:30:25.679848 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.679403 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-sscph\"" Apr 16 18:30:25.679848 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.679460 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 18:30:25.680025 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.680006 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fj55k" Apr 16 18:30:25.680564 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.680543 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:30:25.681508 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.680995 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:30:25.681508 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.681032 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:30:25.681508 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.681039 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:30:25.681508 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.681045 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2q659\"" Apr 16 18:30:25.681508 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.681106 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mvzrq\"" Apr 16 18:30:25.681508 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.681046 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:30:25.681508 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.681509 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-npwmv" Apr 16 18:30:25.682551 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.682533 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:30:25.682751 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.682736 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:30:25.682842 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.682801 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8cksw\"" Apr 16 18:30:25.682943 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.682923 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:30:25.683166 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.683151 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nv72w" Apr 16 18:30:25.683723 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.683703 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dr2x5\"" Apr 16 18:30:25.683943 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.683926 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:30:25.684219 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.684200 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:30:25.684704 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.684686 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:30:25.684999 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.684982 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:25.685096 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:25.685076 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3" Apr 16 18:30:25.685496 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.685479 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:30:25.685760 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.685739 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-8cgh9\"" Apr 16 18:30:25.685851 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.685741 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:30:25.686543 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.686420 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hkqvd" Apr 16 18:30:25.689909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.689370 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:25.689909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.689570 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 18:30:25.689909 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:25.689575 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rp72n" podUID="dcf43e1d-4165-4661-a113-011616920ebe" Apr 16 18:30:25.689909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.689733 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 18:30:25.690503 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.690161 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-rhvdt\"" Apr 16 18:30:25.699346 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699326 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1168c01f-c07f-44f9-b56f-cc88b2028e0b-host\") pod \"node-ca-fj55k\" (UID: \"1168c01f-c07f-44f9-b56f-cc88b2028e0b\") " pod="openshift-image-registry/node-ca-fj55k" Apr 16 18:30:25.699451 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699360 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-run-k8s-cni-cncf-io\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.699451 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699382 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj56w\" (UniqueName: \"kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w\") pod \"network-check-target-rp72n\" (UID: \"dcf43e1d-4165-4661-a113-011616920ebe\") " pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:25.699451 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699407 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4a775d94-d89f-4059-894a-f78b252c1c3c-multus-daemon-config\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.699451 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699431 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-run-systemd\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.699451 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699450 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-cni-netd\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.699681 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699487 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1168c01f-c07f-44f9-b56f-cc88b2028e0b-serviceca\") pod \"node-ca-fj55k\" (UID: \"1168c01f-c07f-44f9-b56f-cc88b2028e0b\") " pod="openshift-image-registry/node-ca-fj55k" Apr 16 18:30:25.699681 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699519 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-run-netns\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.699681 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699559 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-run-netns\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.699681 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699583 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02533a21-4e1f-4bc0-a493-7ac7d35295b8-ovn-node-metrics-cert\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.699681 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699606 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0003db0c-dda0-4476-bd64-528082f53f33-tmp-dir\") pod \"node-resolver-hkqvd\" (UID: \"0003db0c-dda0-4476-bd64-528082f53f33\") " pod="openshift-dns/node-resolver-hkqvd" Apr 16 18:30:25.699681 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699628 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8" Apr 16 18:30:25.699681 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699653 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-system-cni-dir\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 
18:30:25.699681 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699675 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-var-lib-kubelet\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.700033 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699715 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-kubelet\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.700033 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699749 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs\") pod \"network-metrics-daemon-f4smb\" (UID: \"b62474b5-9999-4dd6-83ae-96e3bc355df3\") " pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:25.700033 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-sys-fs\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8" Apr 16 18:30:25.700033 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699795 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a775d94-d89f-4059-894a-f78b252c1c3c-cni-binary-copy\") pod \"multus-562j6\" 
(UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.700033 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699820 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-multus-conf-dir\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.700033 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699842 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-slash\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.700033 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699894 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02533a21-4e1f-4bc0-a493-7ac7d35295b8-env-overrides\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.700033 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699933 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-kubernetes\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh" Apr 16 18:30:25.700033 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699958 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-var-lib-cni-bin\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.700033 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.699979 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-hostroot\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.700033 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700010 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-sysconfig\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh" Apr 16 18:30:25.700033 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700033 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-var-lib-kubelet\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh" Apr 16 18:30:25.700544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700053 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-systemd\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh" Apr 16 18:30:25.700544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700088 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/68a69615-3320-4f7f-b763-6991f367c93d-host-slash\") pod \"iptables-alerter-npwmv\" (UID: \"68a69615-3320-4f7f-b763-6991f367c93d\") " pod="openshift-network-operator/iptables-alerter-npwmv"
Apr 16 18:30:25.700544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700118 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-etc-kubernetes\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.700544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700147 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-var-lib-openvswitch\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.700544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700172 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-run-openvswitch\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.700544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700213 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-log-socket\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.700544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700236 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7be083de-137d-4eb1-b371-dc0a37d2d527-tmp\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.700544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700267 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-sysctl-conf\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.700544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700293 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-tuned\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.700544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700317 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3bf46358-96b5-41b1-9b21-a398d5f87d6e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w"
Apr 16 18:30:25.700544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700341 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3bf46358-96b5-41b1-9b21-a398d5f87d6e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w"
Apr 16 18:30:25.700544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700366 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3bf46358-96b5-41b1-9b21-a398d5f87d6e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w"
Apr 16 18:30:25.700544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700391 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-device-dir\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8"
Apr 16 18:30:25.700544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700414 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/68a69615-3320-4f7f-b763-6991f367c93d-iptables-alerter-script\") pod \"iptables-alerter-npwmv\" (UID: \"68a69615-3320-4f7f-b763-6991f367c93d\") " pod="openshift-network-operator/iptables-alerter-npwmv"
Apr 16 18:30:25.700544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700437 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-run-multus-certs\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.700544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700459 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhxwk\" (UniqueName: \"kubernetes.io/projected/4a775d94-d89f-4059-894a-f78b252c1c3c-kube-api-access-xhxwk\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.701167 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700486 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02533a21-4e1f-4bc0-a493-7ac7d35295b8-ovnkube-config\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.701167 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700528 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbrfr\" (UniqueName: \"kubernetes.io/projected/b62474b5-9999-4dd6-83ae-96e3bc355df3-kube-api-access-lbrfr\") pod \"network-metrics-daemon-f4smb\" (UID: \"b62474b5-9999-4dd6-83ae-96e3bc355df3\") " pod="openshift-multus/network-metrics-daemon-f4smb"
Apr 16 18:30:25.701167 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700573 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-registration-dir\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8"
Apr 16 18:30:25.701167 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700631 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-systemd-units\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.701167 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700660 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9brn\" (UniqueName: \"kubernetes.io/projected/02533a21-4e1f-4bc0-a493-7ac7d35295b8-kube-api-access-l9brn\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.701167 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700690 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-sysctl-d\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.701167 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700705 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-lib-modules\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.701167 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700727 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-socket-dir\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8"
Apr 16 18:30:25.701167 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700753 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-multus-socket-dir-parent\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.701167 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700788 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.701167 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700817 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-modprobe-d\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.701167 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700845 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3bf46358-96b5-41b1-9b21-a398d5f87d6e-cnibin\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w"
Apr 16 18:30:25.701167 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700894 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3bf46358-96b5-41b1-9b21-a398d5f87d6e-cni-binary-copy\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w"
Apr 16 18:30:25.701167 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700924 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0003db0c-dda0-4476-bd64-528082f53f33-hosts-file\") pod \"node-resolver-hkqvd\" (UID: \"0003db0c-dda0-4476-bd64-528082f53f33\") " pod="openshift-dns/node-resolver-hkqvd"
Apr 16 18:30:25.701167 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700948 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjntw\" (UniqueName: \"kubernetes.io/projected/0003db0c-dda0-4476-bd64-528082f53f33-kube-api-access-sjntw\") pod \"node-resolver-hkqvd\" (UID: \"0003db0c-dda0-4476-bd64-528082f53f33\") " pod="openshift-dns/node-resolver-hkqvd"
Apr 16 18:30:25.701167 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700975 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-var-lib-cni-multus\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.701799 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.700997 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-node-log\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.701799 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701017 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-host\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.701799 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701041 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-multus-cni-dir\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.701799 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701062 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-etc-openvswitch\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.701799 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701081 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kpfw\" (UniqueName: \"kubernetes.io/projected/08946e7b-da1a-4169-a4aa-556c72f0074e-kube-api-access-5kpfw\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8"
Apr 16 18:30:25.701799 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701104 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-cni-bin\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.701799 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701132 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3bf46358-96b5-41b1-9b21-a398d5f87d6e-system-cni-dir\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w"
Apr 16 18:30:25.701799 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701162 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbl66\" (UniqueName: \"kubernetes.io/projected/3bf46358-96b5-41b1-9b21-a398d5f87d6e-kube-api-access-jbl66\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w"
Apr 16 18:30:25.701799 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701219 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvz25\" (UniqueName: \"kubernetes.io/projected/1168c01f-c07f-44f9-b56f-cc88b2028e0b-kube-api-access-tvz25\") pod \"node-ca-fj55k\" (UID: \"1168c01f-c07f-44f9-b56f-cc88b2028e0b\") " pod="openshift-image-registry/node-ca-fj55k"
Apr 16 18:30:25.701799 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701371 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f4d6c00d-a887-4de5-87f8-5b4449359aa4-konnectivity-ca\") pod \"konnectivity-agent-gcdwk\" (UID: \"f4d6c00d-a887-4de5-87f8-5b4449359aa4\") " pod="kube-system/konnectivity-agent-gcdwk"
Apr 16 18:30:25.701799 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701408 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-run\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.701799 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701431 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-sys\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.701799 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701455 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dh97\" (UniqueName: \"kubernetes.io/projected/68a69615-3320-4f7f-b763-6991f367c93d-kube-api-access-4dh97\") pod \"iptables-alerter-npwmv\" (UID: \"68a69615-3320-4f7f-b763-6991f367c93d\") " pod="openshift-network-operator/iptables-alerter-npwmv"
Apr 16 18:30:25.701799 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701470 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-run-ovn\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.701799 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701485 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f4d6c00d-a887-4de5-87f8-5b4449359aa4-agent-certs\") pod \"konnectivity-agent-gcdwk\" (UID: \"f4d6c00d-a887-4de5-87f8-5b4449359aa4\") " pod="kube-system/konnectivity-agent-gcdwk"
Apr 16 18:30:25.701799 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701506 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3bf46358-96b5-41b1-9b21-a398d5f87d6e-os-release\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w"
Apr 16 18:30:25.702364 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701524 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-cnibin\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.702364 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701550 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-os-release\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.702364 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701568 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.702364 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701589 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02533a21-4e1f-4bc0-a493-7ac7d35295b8-ovnkube-script-lib\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.702364 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701639 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5ssj\" (UniqueName: \"kubernetes.io/projected/7be083de-137d-4eb1-b371-dc0a37d2d527-kube-api-access-p5ssj\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.702364 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.701671 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-etc-selinux\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8"
Apr 16 18:30:25.740041 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.740013 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:25:24 +0000 UTC" deadline="2028-01-11 17:58:40.187643958 +0000 UTC"
Apr 16 18:30:25.740128 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.740042 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15239h28m14.447605612s"
Apr 16 18:30:25.788279 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.788256 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 18:30:25.801850 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.801823 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a775d94-d89f-4059-894a-f78b252c1c3c-cni-binary-copy\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.802005 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.801858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-multus-conf-dir\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.802005 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.801885 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-slash\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.802005 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.801909 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02533a21-4e1f-4bc0-a493-7ac7d35295b8-env-overrides\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.802005 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.801954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-slash\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.802005 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.801954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-multus-conf-dir\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.802005 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.801994 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-kubernetes\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.802316 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802022 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-var-lib-cni-bin\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.802316 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802048 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-hostroot\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.802316 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802070 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-sysconfig\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.802316 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802095 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-var-lib-cni-bin\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.802316 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802110 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-kubernetes\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.802316 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802119 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-hostroot\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.802316 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802127 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-sysconfig\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.802316 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802155 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-var-lib-kubelet\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.802316 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802202 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-systemd\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.802316 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802229 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/68a69615-3320-4f7f-b763-6991f367c93d-host-slash\") pod \"iptables-alerter-npwmv\" (UID: \"68a69615-3320-4f7f-b763-6991f367c93d\") " pod="openshift-network-operator/iptables-alerter-npwmv"
Apr 16 18:30:25.802316 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802255 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-etc-kubernetes\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.802316 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802279 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-var-lib-openvswitch\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.802316 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802287 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/68a69615-3320-4f7f-b763-6991f367c93d-host-slash\") pod \"iptables-alerter-npwmv\" (UID: \"68a69615-3320-4f7f-b763-6991f367c93d\") " pod="openshift-network-operator/iptables-alerter-npwmv"
Apr 16 18:30:25.802316 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802294 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-systemd\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.802316 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802303 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-run-openvswitch\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.802895 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802337 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-etc-kubernetes\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.802895 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802348 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-run-openvswitch\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.802895 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802446 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-log-socket\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.803113 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803088 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7be083de-137d-4eb1-b371-dc0a37d2d527-tmp\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.803230 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803145 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-sysctl-conf\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.803230 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803030 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02533a21-4e1f-4bc0-a493-7ac7d35295b8-env-overrides\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.803230 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802235 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-var-lib-kubelet\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.803230 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.802339 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-var-lib-openvswitch\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.803514 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803260 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-log-socket\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.803514 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803354 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a775d94-d89f-4059-894a-f78b252c1c3c-cni-binary-copy\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.803514 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803366 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-tuned\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.803514 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803418 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-sysctl-conf\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.803514 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803502 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3bf46358-96b5-41b1-9b21-a398d5f87d6e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w"
Apr 16 18:30:25.803737 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803601 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 18:30:25.803737 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803679 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3bf46358-96b5-41b1-9b21-a398d5f87d6e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w"
Apr 16 18:30:25.803737 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803723 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3bf46358-96b5-41b1-9b21-a398d5f87d6e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w"
Apr 16 18:30:25.803877 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-device-dir\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8"
Apr 16 18:30:25.803877 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803792 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/68a69615-3320-4f7f-b763-6991f367c93d-iptables-alerter-script\") pod \"iptables-alerter-npwmv\" (UID: \"68a69615-3320-4f7f-b763-6991f367c93d\") " pod="openshift-network-operator/iptables-alerter-npwmv"
Apr 16 18:30:25.803877 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-run-multus-certs\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.803877 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803850 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhxwk\" (UniqueName: \"kubernetes.io/projected/4a775d94-d89f-4059-894a-f78b252c1c3c-kube-api-access-xhxwk\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.804053 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803882 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02533a21-4e1f-4bc0-a493-7ac7d35295b8-ovnkube-config\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.804053 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803913 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbrfr\" (UniqueName: \"kubernetes.io/projected/b62474b5-9999-4dd6-83ae-96e3bc355df3-kube-api-access-lbrfr\") pod \"network-metrics-daemon-f4smb\" (UID: \"b62474b5-9999-4dd6-83ae-96e3bc355df3\") " pod="openshift-multus/network-metrics-daemon-f4smb"
Apr 16 18:30:25.804053 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803941 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-registration-dir\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8"
Apr 16 18:30:25.804053 ip-10-0-139-33
kubenswrapper[2578]: I0416 18:30:25.803971 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-systemd-units\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.804053 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.803996 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9brn\" (UniqueName: \"kubernetes.io/projected/02533a21-4e1f-4bc0-a493-7ac7d35295b8-kube-api-access-l9brn\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.804053 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804025 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-sysctl-d\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh" Apr 16 18:30:25.804053 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804053 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-lib-modules\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh" Apr 16 18:30:25.804388 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804082 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-socket-dir\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8" Apr 16 
18:30:25.804388 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804114 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-multus-socket-dir-parent\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.804388 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804162 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.804388 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804203 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-modprobe-d\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh" Apr 16 18:30:25.804388 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804231 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3bf46358-96b5-41b1-9b21-a398d5f87d6e-cnibin\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w" Apr 16 18:30:25.804388 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804259 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3bf46358-96b5-41b1-9b21-a398d5f87d6e-cni-binary-copy\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " 
pod="openshift-multus/multus-additional-cni-plugins-nv72w" Apr 16 18:30:25.804388 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804286 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0003db0c-dda0-4476-bd64-528082f53f33-hosts-file\") pod \"node-resolver-hkqvd\" (UID: \"0003db0c-dda0-4476-bd64-528082f53f33\") " pod="openshift-dns/node-resolver-hkqvd" Apr 16 18:30:25.804388 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804314 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjntw\" (UniqueName: \"kubernetes.io/projected/0003db0c-dda0-4476-bd64-528082f53f33-kube-api-access-sjntw\") pod \"node-resolver-hkqvd\" (UID: \"0003db0c-dda0-4476-bd64-528082f53f33\") " pod="openshift-dns/node-resolver-hkqvd" Apr 16 18:30:25.804388 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804336 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-var-lib-cni-multus\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.804388 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804365 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-node-log\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.804388 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804391 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-host\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " 
pod="openshift-cluster-node-tuning-operator/tuned-fctsh" Apr 16 18:30:25.804864 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804419 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-multus-cni-dir\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.804864 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804447 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-etc-openvswitch\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.804864 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804446 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3bf46358-96b5-41b1-9b21-a398d5f87d6e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w" Apr 16 18:30:25.804864 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804471 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kpfw\" (UniqueName: \"kubernetes.io/projected/08946e7b-da1a-4169-a4aa-556c72f0074e-kube-api-access-5kpfw\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8" Apr 16 18:30:25.804864 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804516 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-cni-bin\") pod 
\"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.804864 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804553 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3bf46358-96b5-41b1-9b21-a398d5f87d6e-system-cni-dir\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w" Apr 16 18:30:25.804864 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804586 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbl66\" (UniqueName: \"kubernetes.io/projected/3bf46358-96b5-41b1-9b21-a398d5f87d6e-kube-api-access-jbl66\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w" Apr 16 18:30:25.804864 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804594 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02533a21-4e1f-4bc0-a493-7ac7d35295b8-ovnkube-config\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.804864 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804619 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvz25\" (UniqueName: \"kubernetes.io/projected/1168c01f-c07f-44f9-b56f-cc88b2028e0b-kube-api-access-tvz25\") pod \"node-ca-fj55k\" (UID: \"1168c01f-c07f-44f9-b56f-cc88b2028e0b\") " pod="openshift-image-registry/node-ca-fj55k" Apr 16 18:30:25.804864 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804652 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/f4d6c00d-a887-4de5-87f8-5b4449359aa4-konnectivity-ca\") pod \"konnectivity-agent-gcdwk\" (UID: \"f4d6c00d-a887-4de5-87f8-5b4449359aa4\") " pod="kube-system/konnectivity-agent-gcdwk" Apr 16 18:30:25.804864 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804676 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-run\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh" Apr 16 18:30:25.804864 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-run-multus-certs\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.804864 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804747 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-cni-bin\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.804864 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804802 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3bf46358-96b5-41b1-9b21-a398d5f87d6e-system-cni-dir\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w" Apr 16 18:30:25.805548 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804925 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-registration-dir\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8" Apr 16 18:30:25.805548 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.804989 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-systemd-units\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.805548 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805028 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-sys\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh" Apr 16 18:30:25.805548 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805073 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dh97\" (UniqueName: \"kubernetes.io/projected/68a69615-3320-4f7f-b763-6991f367c93d-kube-api-access-4dh97\") pod \"iptables-alerter-npwmv\" (UID: \"68a69615-3320-4f7f-b763-6991f367c93d\") " pod="openshift-network-operator/iptables-alerter-npwmv" Apr 16 18:30:25.805548 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805108 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-run-ovn\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.805548 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805119 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" 
(UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-sysctl-d\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh" Apr 16 18:30:25.805548 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805138 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f4d6c00d-a887-4de5-87f8-5b4449359aa4-agent-certs\") pod \"konnectivity-agent-gcdwk\" (UID: \"f4d6c00d-a887-4de5-87f8-5b4449359aa4\") " pod="kube-system/konnectivity-agent-gcdwk" Apr 16 18:30:25.805548 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805195 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3bf46358-96b5-41b1-9b21-a398d5f87d6e-os-release\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w" Apr 16 18:30:25.805548 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805231 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-lib-modules\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh" Apr 16 18:30:25.805548 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805235 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-cnibin\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.805548 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805308 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-cnibin\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.805548 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805338 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-socket-dir\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8" Apr 16 18:30:25.805548 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805368 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-device-dir\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8" Apr 16 18:30:25.805548 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805445 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-sys\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh" Apr 16 18:30:25.806164 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805590 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-multus-socket-dir-parent\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.806164 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805638 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.806164 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805706 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-run-ovn\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.806164 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805722 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-modprobe-d\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh" Apr 16 18:30:25.806164 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805771 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/68a69615-3320-4f7f-b763-6991f367c93d-iptables-alerter-script\") pod \"iptables-alerter-npwmv\" (UID: \"68a69615-3320-4f7f-b763-6991f367c93d\") " pod="openshift-network-operator/iptables-alerter-npwmv" Apr 16 18:30:25.806164 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805778 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3bf46358-96b5-41b1-9b21-a398d5f87d6e-cnibin\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w" Apr 16 18:30:25.806164 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.805983 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/3bf46358-96b5-41b1-9b21-a398d5f87d6e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w" Apr 16 18:30:25.806164 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.806063 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3bf46358-96b5-41b1-9b21-a398d5f87d6e-os-release\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w" Apr 16 18:30:25.806164 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.806130 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-host\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh" Apr 16 18:30:25.806601 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.806205 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0003db0c-dda0-4476-bd64-528082f53f33-hosts-file\") pod \"node-resolver-hkqvd\" (UID: \"0003db0c-dda0-4476-bd64-528082f53f33\") " pod="openshift-dns/node-resolver-hkqvd" Apr 16 18:30:25.806674 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.806607 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-var-lib-cni-multus\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.806730 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.806703 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-node-log\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.806882 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.806864 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7be083de-137d-4eb1-b371-dc0a37d2d527-run\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh" Apr 16 18:30:25.806965 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.806945 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3bf46358-96b5-41b1-9b21-a398d5f87d6e-cni-binary-copy\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w" Apr 16 18:30:25.807095 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807076 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-multus-cni-dir\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.807187 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.806951 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-os-release\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.807243 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807184 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-etc-openvswitch\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.807243 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807225 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.807347 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807242 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3bf46358-96b5-41b1-9b21-a398d5f87d6e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w" Apr 16 18:30:25.807347 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807265 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02533a21-4e1f-4bc0-a493-7ac7d35295b8-ovnkube-script-lib\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:25.807347 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807272 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-os-release\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6" Apr 16 18:30:25.807347 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807316 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5ssj\" (UniqueName: \"kubernetes.io/projected/7be083de-137d-4eb1-b371-dc0a37d2d527-kube-api-access-p5ssj\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.807539 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807346 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.807539 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807357 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-etc-selinux\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8"
Apr 16 18:30:25.807539 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807463 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-etc-selinux\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8"
Apr 16 18:30:25.807687 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807513 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1168c01f-c07f-44f9-b56f-cc88b2028e0b-host\") pod \"node-ca-fj55k\" (UID: \"1168c01f-c07f-44f9-b56f-cc88b2028e0b\") " pod="openshift-image-registry/node-ca-fj55k"
Apr 16 18:30:25.807687 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807575 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1168c01f-c07f-44f9-b56f-cc88b2028e0b-host\") pod \"node-ca-fj55k\" (UID: \"1168c01f-c07f-44f9-b56f-cc88b2028e0b\") " pod="openshift-image-registry/node-ca-fj55k"
Apr 16 18:30:25.807781 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807727 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-run-k8s-cni-cncf-io\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.807878 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807856 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-run-k8s-cni-cncf-io\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.807933 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807905 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dj56w\" (UniqueName: \"kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w\") pod \"network-check-target-rp72n\" (UID: \"dcf43e1d-4165-4661-a113-011616920ebe\") " pod="openshift-network-diagnostics/network-check-target-rp72n"
Apr 16 18:30:25.807983 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807934 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4a775d94-d89f-4059-894a-f78b252c1c3c-multus-daemon-config\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.807983 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807965 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-run-systemd\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.808081 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.807995 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-cni-netd\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.808187 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808154 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1168c01f-c07f-44f9-b56f-cc88b2028e0b-serviceca\") pod \"node-ca-fj55k\" (UID: \"1168c01f-c07f-44f9-b56f-cc88b2028e0b\") " pod="openshift-image-registry/node-ca-fj55k"
Apr 16 18:30:25.808263 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808241 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-run-systemd\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.808317 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808291 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-cni-netd\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.808491 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-run-netns\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.808491 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808471 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f4d6c00d-a887-4de5-87f8-5b4449359aa4-konnectivity-ca\") pod \"konnectivity-agent-gcdwk\" (UID: \"f4d6c00d-a887-4de5-87f8-5b4449359aa4\") " pod="kube-system/konnectivity-agent-gcdwk"
Apr 16 18:30:25.808641 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808501 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-run-netns\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.808641 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808537 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02533a21-4e1f-4bc0-a493-7ac7d35295b8-ovn-node-metrics-cert\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.808641 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808560 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02533a21-4e1f-4bc0-a493-7ac7d35295b8-ovnkube-script-lib\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.808641 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0003db0c-dda0-4476-bd64-528082f53f33-tmp-dir\") pod \"node-resolver-hkqvd\" (UID: \"0003db0c-dda0-4476-bd64-528082f53f33\") " pod="openshift-dns/node-resolver-hkqvd"
Apr 16 18:30:25.808641 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808600 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8"
Apr 16 18:30:25.808641 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808624 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-run-netns\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.808641 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808633 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-system-cni-dir\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.808999 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808661 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-var-lib-kubelet\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.808999 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808680 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-run-netns\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.808999 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-kubelet\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.808999 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808727 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs\") pod \"network-metrics-daemon-f4smb\" (UID: \"b62474b5-9999-4dd6-83ae-96e3bc355df3\") " pod="openshift-multus/network-metrics-daemon-f4smb"
Apr 16 18:30:25.808999 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808736 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8"
Apr 16 18:30:25.808999 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808759 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-sys-fs\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8"
Apr 16 18:30:25.808999 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808848 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/08946e7b-da1a-4169-a4aa-556c72f0074e-sys-fs\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8"
Apr 16 18:30:25.808999 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808868 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1168c01f-c07f-44f9-b56f-cc88b2028e0b-serviceca\") pod \"node-ca-fj55k\" (UID: \"1168c01f-c07f-44f9-b56f-cc88b2028e0b\") " pod="openshift-image-registry/node-ca-fj55k"
Apr 16 18:30:25.808999 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808905 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-system-cni-dir\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.808999 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808949 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a775d94-d89f-4059-894a-f78b252c1c3c-host-var-lib-kubelet\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.808999 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.808991 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02533a21-4e1f-4bc0-a493-7ac7d35295b8-host-kubelet\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.809511 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:25.809079 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:25.809511 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:25.809150 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs podName:b62474b5-9999-4dd6-83ae-96e3bc355df3 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:26.309121668 +0000 UTC m=+3.096457919 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs") pod "network-metrics-daemon-f4smb" (UID: "b62474b5-9999-4dd6-83ae-96e3bc355df3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:25.810015 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.809954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4a775d94-d89f-4059-894a-f78b252c1c3c-multus-daemon-config\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.810165 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.810135 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7be083de-137d-4eb1-b371-dc0a37d2d527-etc-tuned\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.810251 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.810192 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7be083de-137d-4eb1-b371-dc0a37d2d527-tmp\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.810484 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.810349 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0003db0c-dda0-4476-bd64-528082f53f33-tmp-dir\") pod \"node-resolver-hkqvd\" (UID: \"0003db0c-dda0-4476-bd64-528082f53f33\") " pod="openshift-dns/node-resolver-hkqvd"
Apr 16 18:30:25.810833 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.810811 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f4d6c00d-a887-4de5-87f8-5b4449359aa4-agent-certs\") pod \"konnectivity-agent-gcdwk\" (UID: \"f4d6c00d-a887-4de5-87f8-5b4449359aa4\") " pod="kube-system/konnectivity-agent-gcdwk"
Apr 16 18:30:25.811835 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.811817 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02533a21-4e1f-4bc0-a493-7ac7d35295b8-ovn-node-metrics-cert\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.813729 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.813702 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvz25\" (UniqueName: \"kubernetes.io/projected/1168c01f-c07f-44f9-b56f-cc88b2028e0b-kube-api-access-tvz25\") pod \"node-ca-fj55k\" (UID: \"1168c01f-c07f-44f9-b56f-cc88b2028e0b\") " pod="openshift-image-registry/node-ca-fj55k"
Apr 16 18:30:25.813944 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:25.813922 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:30:25.814037 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:25.813951 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:30:25.814037 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:25.813965 2578 projected.go:194] Error preparing data for projected volume kube-api-access-dj56w for pod openshift-network-diagnostics/network-check-target-rp72n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:25.814037 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:25.814029 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w podName:dcf43e1d-4165-4661-a113-011616920ebe nodeName:}" failed. No retries permitted until 2026-04-16 18:30:26.314012206 +0000 UTC m=+3.101348458 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dj56w" (UniqueName: "kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w") pod "network-check-target-rp72n" (UID: "dcf43e1d-4165-4661-a113-011616920ebe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:25.814745 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.814718 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbl66\" (UniqueName: \"kubernetes.io/projected/3bf46358-96b5-41b1-9b21-a398d5f87d6e-kube-api-access-jbl66\") pod \"multus-additional-cni-plugins-nv72w\" (UID: \"3bf46358-96b5-41b1-9b21-a398d5f87d6e\") " pod="openshift-multus/multus-additional-cni-plugins-nv72w"
Apr 16 18:30:25.816569 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.816544 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhxwk\" (UniqueName: \"kubernetes.io/projected/4a775d94-d89f-4059-894a-f78b252c1c3c-kube-api-access-xhxwk\") pod \"multus-562j6\" (UID: \"4a775d94-d89f-4059-894a-f78b252c1c3c\") " pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.816569 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.816559 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9brn\" (UniqueName: \"kubernetes.io/projected/02533a21-4e1f-4bc0-a493-7ac7d35295b8-kube-api-access-l9brn\") pod \"ovnkube-node-fdzzg\" (UID: \"02533a21-4e1f-4bc0-a493-7ac7d35295b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:25.816706 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.816580 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kpfw\" (UniqueName: \"kubernetes.io/projected/08946e7b-da1a-4169-a4aa-556c72f0074e-kube-api-access-5kpfw\") pod \"aws-ebs-csi-driver-node-4hbb8\" (UID: \"08946e7b-da1a-4169-a4aa-556c72f0074e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8"
Apr 16 18:30:25.816706 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.816653 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbrfr\" (UniqueName: \"kubernetes.io/projected/b62474b5-9999-4dd6-83ae-96e3bc355df3-kube-api-access-lbrfr\") pod \"network-metrics-daemon-f4smb\" (UID: \"b62474b5-9999-4dd6-83ae-96e3bc355df3\") " pod="openshift-multus/network-metrics-daemon-f4smb"
Apr 16 18:30:25.816918 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.816902 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dh97\" (UniqueName: \"kubernetes.io/projected/68a69615-3320-4f7f-b763-6991f367c93d-kube-api-access-4dh97\") pod \"iptables-alerter-npwmv\" (UID: \"68a69615-3320-4f7f-b763-6991f367c93d\") " pod="openshift-network-operator/iptables-alerter-npwmv"
Apr 16 18:30:25.816977 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.816941 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjntw\" (UniqueName: \"kubernetes.io/projected/0003db0c-dda0-4476-bd64-528082f53f33-kube-api-access-sjntw\") pod \"node-resolver-hkqvd\" (UID: \"0003db0c-dda0-4476-bd64-528082f53f33\") " pod="openshift-dns/node-resolver-hkqvd"
Apr 16 18:30:25.817956 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.817937 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5ssj\" (UniqueName: \"kubernetes.io/projected/7be083de-137d-4eb1-b371-dc0a37d2d527-kube-api-access-p5ssj\") pod \"tuned-fctsh\" (UID: \"7be083de-137d-4eb1-b371-dc0a37d2d527\") " pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:25.985912 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.985873 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-562j6"
Apr 16 18:30:25.992979 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:25.992951 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg"
Apr 16 18:30:26.000591 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.000566 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gcdwk"
Apr 16 18:30:26.006111 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.006092 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8"
Apr 16 18:30:26.016823 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.016805 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fctsh"
Apr 16 18:30:26.023368 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.023347 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fj55k"
Apr 16 18:30:26.029811 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.029789 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-npwmv"
Apr 16 18:30:26.037338 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.037314 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nv72w"
Apr 16 18:30:26.042812 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.042796 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hkqvd"
Apr 16 18:30:26.312510 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.312484 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs\") pod \"network-metrics-daemon-f4smb\" (UID: \"b62474b5-9999-4dd6-83ae-96e3bc355df3\") " pod="openshift-multus/network-metrics-daemon-f4smb"
Apr 16 18:30:26.312683 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:26.312599 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:26.312683 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:26.312667 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs podName:b62474b5-9999-4dd6-83ae-96e3bc355df3 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:27.312648999 +0000 UTC m=+4.099985256 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs") pod "network-metrics-daemon-f4smb" (UID: "b62474b5-9999-4dd6-83ae-96e3bc355df3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:26.404334 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:26.404308 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08946e7b_da1a_4169_a4aa_556c72f0074e.slice/crio-5571f48557de65daa37110958ee1bb3b9286a0f8e1f63225e3f6407c158c4798 WatchSource:0}: Error finding container 5571f48557de65daa37110958ee1bb3b9286a0f8e1f63225e3f6407c158c4798: Status 404 returned error can't find the container with id 5571f48557de65daa37110958ee1bb3b9286a0f8e1f63225e3f6407c158c4798
Apr 16 18:30:26.413516 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.413497 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dj56w\" (UniqueName: \"kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w\") pod \"network-check-target-rp72n\" (UID: \"dcf43e1d-4165-4661-a113-011616920ebe\") " pod="openshift-network-diagnostics/network-check-target-rp72n"
Apr 16 18:30:26.413613 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:26.413602 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:30:26.413652 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:26.413617 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:30:26.413652 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:26.413625 2578 projected.go:194] Error preparing data for projected volume kube-api-access-dj56w for pod openshift-network-diagnostics/network-check-target-rp72n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:26.413716 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:26.413667 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w podName:dcf43e1d-4165-4661-a113-011616920ebe nodeName:}" failed. No retries permitted until 2026-04-16 18:30:27.413653878 +0000 UTC m=+4.200990117 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-dj56w" (UniqueName: "kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w") pod "network-check-target-rp72n" (UID: "dcf43e1d-4165-4661-a113-011616920ebe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:26.422358 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:26.422332 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0003db0c_dda0_4476_bd64_528082f53f33.slice/crio-20236d4fdd7e2815e5743a9e4407b84822264933e31c8f2c62bedbe074a7ef96 WatchSource:0}: Error finding container 20236d4fdd7e2815e5743a9e4407b84822264933e31c8f2c62bedbe074a7ef96: Status 404 returned error can't find the container with id 20236d4fdd7e2815e5743a9e4407b84822264933e31c8f2c62bedbe074a7ef96
Apr 16 18:30:26.423799 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:26.422957 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4d6c00d_a887_4de5_87f8_5b4449359aa4.slice/crio-17336f169d226af6f7695ce12dcc31e7d9933ad046498732d17c6533a673d780 WatchSource:0}: Error finding container 17336f169d226af6f7695ce12dcc31e7d9933ad046498732d17c6533a673d780: Status 404 returned error can't find the container with id 17336f169d226af6f7695ce12dcc31e7d9933ad046498732d17c6533a673d780
Apr 16 18:30:26.424334 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:26.424314 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a775d94_d89f_4059_894a_f78b252c1c3c.slice/crio-a92879e0a71e298b148bc2c2461fe304d3aead0ce008fe2b46471675f56f0ec7 WatchSource:0}: Error finding container a92879e0a71e298b148bc2c2461fe304d3aead0ce008fe2b46471675f56f0ec7: Status 404 returned error can't find the container with id a92879e0a71e298b148bc2c2461fe304d3aead0ce008fe2b46471675f56f0ec7
Apr 16 18:30:26.426700 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:26.426674 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7be083de_137d_4eb1_b371_dc0a37d2d527.slice/crio-89b6bdd368b8eeadb21577d960ede9a024f915e94f932241404c00ef9043bde0 WatchSource:0}: Error finding container 89b6bdd368b8eeadb21577d960ede9a024f915e94f932241404c00ef9043bde0: Status 404 returned error can't find the container with id 89b6bdd368b8eeadb21577d960ede9a024f915e94f932241404c00ef9043bde0
Apr 16 18:30:26.428814 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:26.428408 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bf46358_96b5_41b1_9b21_a398d5f87d6e.slice/crio-6a3bc6aa35dd0175f0f6c92ec32fe72cba33dc6bfa5b99086b9502261fa96aaa WatchSource:0}: Error finding container 6a3bc6aa35dd0175f0f6c92ec32fe72cba33dc6bfa5b99086b9502261fa96aaa: Status 404 returned error can't find the container with id 6a3bc6aa35dd0175f0f6c92ec32fe72cba33dc6bfa5b99086b9502261fa96aaa
Apr 16 18:30:26.429323 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:26.429300 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02533a21_4e1f_4bc0_a493_7ac7d35295b8.slice/crio-75f927649c24c20f6cb8cb96a903e7fdc91ddc34fa3acaff7217ca69c59bdb9d WatchSource:0}: Error finding container 75f927649c24c20f6cb8cb96a903e7fdc91ddc34fa3acaff7217ca69c59bdb9d: Status 404 returned error can't find the container with id 75f927649c24c20f6cb8cb96a903e7fdc91ddc34fa3acaff7217ca69c59bdb9d
Apr 16 18:30:26.430327 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:26.430304 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68a69615_3320_4f7f_b763_6991f367c93d.slice/crio-ef0f4881675d3d600f3ed478e1674b2a7e31b27386f289a21c94c1a8d77569b9 WatchSource:0}: Error finding container ef0f4881675d3d600f3ed478e1674b2a7e31b27386f289a21c94c1a8d77569b9: Status 404 returned error can't find the container with id ef0f4881675d3d600f3ed478e1674b2a7e31b27386f289a21c94c1a8d77569b9
Apr 16 18:30:26.431121 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:30:26.431045 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1168c01f_c07f_44f9_b56f_cc88b2028e0b.slice/crio-25a02a4818b9891c29a66d8497a6ed787d03c48a6aeddc79a8d7b8302e7e9bb6 WatchSource:0}: Error finding container 25a02a4818b9891c29a66d8497a6ed787d03c48a6aeddc79a8d7b8302e7e9bb6: Status 404 returned error can't find the container with id 25a02a4818b9891c29a66d8497a6ed787d03c48a6aeddc79a8d7b8302e7e9bb6
Apr 16 18:30:26.740631 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.740390 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:25:24 +0000 UTC" deadline="2027-12-14 08:21:37.677664186 +0000 UTC"
Apr 16 18:30:26.740631 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.740566 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14557h51m10.93710196s"
Apr 16 18:30:26.837354 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.837321 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb"
Apr 16 18:30:26.837592 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:26.837476 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3"
Apr 16 18:30:26.847488 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.847391 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fj55k" event={"ID":"1168c01f-c07f-44f9-b56f-cc88b2028e0b","Type":"ContainerStarted","Data":"25a02a4818b9891c29a66d8497a6ed787d03c48a6aeddc79a8d7b8302e7e9bb6"}
Apr 16 18:30:26.849188 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.849097 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8" event={"ID":"08946e7b-da1a-4169-a4aa-556c72f0074e","Type":"ContainerStarted","Data":"5571f48557de65daa37110958ee1bb3b9286a0f8e1f63225e3f6407c158c4798"}
Apr 16 18:30:26.851281 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.851240 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-npwmv" event={"ID":"68a69615-3320-4f7f-b763-6991f367c93d","Type":"ContainerStarted","Data":"ef0f4881675d3d600f3ed478e1674b2a7e31b27386f289a21c94c1a8d77569b9"}
Apr 16 18:30:26.854412 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.854383 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" event={"ID":"02533a21-4e1f-4bc0-a493-7ac7d35295b8","Type":"ContainerStarted","Data":"75f927649c24c20f6cb8cb96a903e7fdc91ddc34fa3acaff7217ca69c59bdb9d"}
Apr 16 18:30:26.855763 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.855724 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fctsh" event={"ID":"7be083de-137d-4eb1-b371-dc0a37d2d527","Type":"ContainerStarted","Data":"89b6bdd368b8eeadb21577d960ede9a024f915e94f932241404c00ef9043bde0"}
Apr 16 18:30:26.857024 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.856927 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nv72w" event={"ID":"3bf46358-96b5-41b1-9b21-a398d5f87d6e","Type":"ContainerStarted","Data":"6a3bc6aa35dd0175f0f6c92ec32fe72cba33dc6bfa5b99086b9502261fa96aaa"}
Apr 16 18:30:26.862759 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.859916 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gcdwk" event={"ID":"f4d6c00d-a887-4de5-87f8-5b4449359aa4","Type":"ContainerStarted","Data":"17336f169d226af6f7695ce12dcc31e7d9933ad046498732d17c6533a673d780"}
Apr 16 18:30:26.863383 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.863334 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-562j6" event={"ID":"4a775d94-d89f-4059-894a-f78b252c1c3c","Type":"ContainerStarted","Data":"a92879e0a71e298b148bc2c2461fe304d3aead0ce008fe2b46471675f56f0ec7"}
Apr 16 18:30:26.865731 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.865687 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hkqvd" event={"ID":"0003db0c-dda0-4476-bd64-528082f53f33","Type":"ContainerStarted","Data":"20236d4fdd7e2815e5743a9e4407b84822264933e31c8f2c62bedbe074a7ef96"}
Apr 16 18:30:26.870250 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.870204 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-33.ec2.internal" event={"ID":"27e141a8b2c2991dacebb3be05ed01ab","Type":"ContainerStarted","Data":"4ed1b6038415f64dd42c5dfb2c8dd5051c4da42149ab6f481287c8a290bdd803"}
Apr 16 18:30:26.887198 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:26.887133 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-33.ec2.internal" podStartSLOduration=1.887117055 podStartE2EDuration="1.887117055s" podCreationTimestamp="2026-04-16 18:30:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:26.886322752 +0000 UTC m=+3.673659009" watchObservedRunningTime="2026-04-16 18:30:26.887117055 +0000 UTC m=+3.674453311"
Apr 16 18:30:27.203278 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:27.203248 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:30:27.321363 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:27.320747 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs\") pod \"network-metrics-daemon-f4smb\" (UID: \"b62474b5-9999-4dd6-83ae-96e3bc355df3\") " pod="openshift-multus/network-metrics-daemon-f4smb"
Apr 16 18:30:27.321363 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:27.320918 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:27.321363 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:27.320982 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs podName:b62474b5-9999-4dd6-83ae-96e3bc355df3 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:29.320963103 +0000 UTC m=+6.108299354 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs") pod "network-metrics-daemon-f4smb" (UID: "b62474b5-9999-4dd6-83ae-96e3bc355df3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:27.422131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:27.421815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dj56w\" (UniqueName: \"kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w\") pod \"network-check-target-rp72n\" (UID: \"dcf43e1d-4165-4661-a113-011616920ebe\") " pod="openshift-network-diagnostics/network-check-target-rp72n"
Apr 16 18:30:27.422131 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:27.421984 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:30:27.422131 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:27.422005 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:30:27.422131 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:27.422018 2578 projected.go:194] Error preparing data for projected volume kube-api-access-dj56w for pod openshift-network-diagnostics/network-check-target-rp72n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:27.422131 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:27.422071 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w
podName:dcf43e1d-4165-4661-a113-011616920ebe nodeName:}" failed. No retries permitted until 2026-04-16 18:30:29.422052894 +0000 UTC m=+6.209389135 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dj56w" (UniqueName: "kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w") pod "network-check-target-rp72n" (UID: "dcf43e1d-4165-4661-a113-011616920ebe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:27.841685 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:27.841612 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:27.842094 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:27.841732 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rp72n" podUID="dcf43e1d-4165-4661-a113-011616920ebe" Apr 16 18:30:27.892829 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:27.892715 2578 generic.go:358] "Generic (PLEG): container finished" podID="c53f305fe96e7a58062f6a77dba1ae45" containerID="b96360f05fa5ab6305474d6537aa2d25f9adc33bec01e059019f23a6cb878069" exitCode=0 Apr 16 18:30:27.893275 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:27.893249 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal" event={"ID":"c53f305fe96e7a58062f6a77dba1ae45","Type":"ContainerDied","Data":"b96360f05fa5ab6305474d6537aa2d25f9adc33bec01e059019f23a6cb878069"} Apr 16 18:30:28.837935 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:28.837464 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:28.837935 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:28.837604 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3" Apr 16 18:30:28.904359 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:28.904318 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal" event={"ID":"c53f305fe96e7a58062f6a77dba1ae45","Type":"ContainerStarted","Data":"ecd5afd3f8ccc9fb1df0b78095ea9d3dd1fb23532d4671639368e69d98f6282a"} Apr 16 18:30:29.340417 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:29.340385 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs\") pod \"network-metrics-daemon-f4smb\" (UID: \"b62474b5-9999-4dd6-83ae-96e3bc355df3\") " pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:29.340575 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:29.340532 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:29.340650 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:29.340585 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs podName:b62474b5-9999-4dd6-83ae-96e3bc355df3 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:33.340569517 +0000 UTC m=+10.127905769 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs") pod "network-metrics-daemon-f4smb" (UID: "b62474b5-9999-4dd6-83ae-96e3bc355df3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:29.441464 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:29.441129 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dj56w\" (UniqueName: \"kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w\") pod \"network-check-target-rp72n\" (UID: \"dcf43e1d-4165-4661-a113-011616920ebe\") " pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:29.441464 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:29.441290 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:29.441464 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:29.441311 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:29.441464 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:29.441323 2578 projected.go:194] Error preparing data for projected volume kube-api-access-dj56w for pod openshift-network-diagnostics/network-check-target-rp72n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:29.441464 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:29.441385 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w podName:dcf43e1d-4165-4661-a113-011616920ebe nodeName:}" failed. 
No retries permitted until 2026-04-16 18:30:33.441361872 +0000 UTC m=+10.228698111 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dj56w" (UniqueName: "kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w") pod "network-check-target-rp72n" (UID: "dcf43e1d-4165-4661-a113-011616920ebe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:29.836733 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:29.836652 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:29.836886 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:29.836790 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rp72n" podUID="dcf43e1d-4165-4661-a113-011616920ebe" Apr 16 18:30:30.836555 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:30.836518 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:30.836980 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:30.836673 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3" Apr 16 18:30:31.837452 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:31.837394 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:31.837913 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:31.837551 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rp72n" podUID="dcf43e1d-4165-4661-a113-011616920ebe" Apr 16 18:30:32.836974 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:32.836936 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:32.837202 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:32.837112 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3" Apr 16 18:30:33.372678 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:33.372635 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs\") pod \"network-metrics-daemon-f4smb\" (UID: \"b62474b5-9999-4dd6-83ae-96e3bc355df3\") " pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:33.373140 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:33.372845 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:33.373140 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:33.372927 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs podName:b62474b5-9999-4dd6-83ae-96e3bc355df3 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:41.372907134 +0000 UTC m=+18.160243392 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs") pod "network-metrics-daemon-f4smb" (UID: "b62474b5-9999-4dd6-83ae-96e3bc355df3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:33.473441 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:33.473402 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dj56w\" (UniqueName: \"kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w\") pod \"network-check-target-rp72n\" (UID: \"dcf43e1d-4165-4661-a113-011616920ebe\") " pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:33.473595 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:33.473573 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:33.473595 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:33.473594 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:33.473709 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:33.473607 2578 projected.go:194] Error preparing data for projected volume kube-api-access-dj56w for pod openshift-network-diagnostics/network-check-target-rp72n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:33.473709 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:33.473672 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w podName:dcf43e1d-4165-4661-a113-011616920ebe nodeName:}" failed. 
No retries permitted until 2026-04-16 18:30:41.473651904 +0000 UTC m=+18.260988145 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-dj56w" (UniqueName: "kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w") pod "network-check-target-rp72n" (UID: "dcf43e1d-4165-4661-a113-011616920ebe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:33.837422 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:33.837316 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:33.837587 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:33.837450 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rp72n" podUID="dcf43e1d-4165-4661-a113-011616920ebe" Apr 16 18:30:34.837232 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:34.837194 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:34.837725 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:34.837356 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3" Apr 16 18:30:35.836804 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:35.836766 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:35.836953 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:35.836907 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rp72n" podUID="dcf43e1d-4165-4661-a113-011616920ebe" Apr 16 18:30:36.836588 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:36.836555 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:36.837061 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:36.836686 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3" Apr 16 18:30:37.839544 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:37.839513 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:37.839916 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:37.839613 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rp72n" podUID="dcf43e1d-4165-4661-a113-011616920ebe" Apr 16 18:30:38.837345 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:38.837307 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:38.837574 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:38.837471 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3" Apr 16 18:30:39.837386 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:39.837342 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:39.837822 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:39.837498 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rp72n" podUID="dcf43e1d-4165-4661-a113-011616920ebe" Apr 16 18:30:40.836829 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:40.836795 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:40.836976 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:40.836930 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3" Apr 16 18:30:41.429903 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:41.429868 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs\") pod \"network-metrics-daemon-f4smb\" (UID: \"b62474b5-9999-4dd6-83ae-96e3bc355df3\") " pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:41.430318 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:41.429997 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:41.430318 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:41.430072 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs podName:b62474b5-9999-4dd6-83ae-96e3bc355df3 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:57.430052271 +0000 UTC m=+34.217388512 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs") pod "network-metrics-daemon-f4smb" (UID: "b62474b5-9999-4dd6-83ae-96e3bc355df3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:41.530384 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:41.530343 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dj56w\" (UniqueName: \"kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w\") pod \"network-check-target-rp72n\" (UID: \"dcf43e1d-4165-4661-a113-011616920ebe\") " pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:41.530566 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:41.530540 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:41.530617 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:41.530566 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:41.530617 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:41.530591 2578 projected.go:194] Error preparing data for projected volume kube-api-access-dj56w for pod openshift-network-diagnostics/network-check-target-rp72n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:41.530695 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:41.530656 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w podName:dcf43e1d-4165-4661-a113-011616920ebe nodeName:}" failed. 
No retries permitted until 2026-04-16 18:30:57.530637701 +0000 UTC m=+34.317973935 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dj56w" (UniqueName: "kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w") pod "network-check-target-rp72n" (UID: "dcf43e1d-4165-4661-a113-011616920ebe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:41.836519 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:41.836478 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:41.836771 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:41.836610 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rp72n" podUID="dcf43e1d-4165-4661-a113-011616920ebe" Apr 16 18:30:42.837080 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:42.837045 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:42.837482 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:42.837161 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3" Apr 16 18:30:43.837485 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.837456 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:43.838313 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:43.837540 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rp72n" podUID="dcf43e1d-4165-4661-a113-011616920ebe" Apr 16 18:30:43.931534 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.931505 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8" event={"ID":"08946e7b-da1a-4169-a4aa-556c72f0074e","Type":"ContainerStarted","Data":"41efb2c492ea978c8851c2a519c48922fdd70dd51bdb1838749c5bec15e7529c"} Apr 16 18:30:43.933191 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.933158 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 18:30:43.933515 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.933492 2578 generic.go:358] "Generic (PLEG): container finished" podID="02533a21-4e1f-4bc0-a493-7ac7d35295b8" containerID="1f3772d356dffffed2479640cf3ccfcb0f172e90c7650fba5481a5363aa7371e" exitCode=1 Apr 16 18:30:43.933579 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.933555 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" 
event={"ID":"02533a21-4e1f-4bc0-a493-7ac7d35295b8","Type":"ContainerStarted","Data":"ae0fcbe6493f5661d9837f58df2dad57756db479db174bc50229298ceeb19da4"} Apr 16 18:30:43.933621 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.933583 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" event={"ID":"02533a21-4e1f-4bc0-a493-7ac7d35295b8","Type":"ContainerStarted","Data":"b10c31e14068f67318ad366ce173ccc9f76233a225ff967cccc9c786338f6e73"} Apr 16 18:30:43.933621 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.933593 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" event={"ID":"02533a21-4e1f-4bc0-a493-7ac7d35295b8","Type":"ContainerDied","Data":"1f3772d356dffffed2479640cf3ccfcb0f172e90c7650fba5481a5363aa7371e"} Apr 16 18:30:43.933621 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.933604 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" event={"ID":"02533a21-4e1f-4bc0-a493-7ac7d35295b8","Type":"ContainerStarted","Data":"75da853bb62aa888a545e3782eccb05444b673152531577bc8b7a1b2330665f3"} Apr 16 18:30:43.934923 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.934890 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fctsh" event={"ID":"7be083de-137d-4eb1-b371-dc0a37d2d527","Type":"ContainerStarted","Data":"3379c25126658fb8fdb3576fc6ce109fd58cc9ab614bd375324fb8fbb386a42f"} Apr 16 18:30:43.936485 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.936464 2578 generic.go:358] "Generic (PLEG): container finished" podID="3bf46358-96b5-41b1-9b21-a398d5f87d6e" containerID="8967dadb0908b3a29d60a94118da4f1cfd8beb3eab28fc9875e72a7c2f245ae2" exitCode=0 Apr 16 18:30:43.936575 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.936526 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nv72w" 
event={"ID":"3bf46358-96b5-41b1-9b21-a398d5f87d6e","Type":"ContainerDied","Data":"8967dadb0908b3a29d60a94118da4f1cfd8beb3eab28fc9875e72a7c2f245ae2"} Apr 16 18:30:43.938123 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.938098 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gcdwk" event={"ID":"f4d6c00d-a887-4de5-87f8-5b4449359aa4","Type":"ContainerStarted","Data":"b23ac84cd04085601b865decb5b7551f827ddc7965dbb7888901c26b68c9da05"} Apr 16 18:30:43.939587 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.939562 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-562j6" event={"ID":"4a775d94-d89f-4059-894a-f78b252c1c3c","Type":"ContainerStarted","Data":"35c1b0ad88ae3e0fbe18e4c11f4abbb40e4a1106081febf4e58eee78353e03d4"} Apr 16 18:30:43.940861 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.940836 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hkqvd" event={"ID":"0003db0c-dda0-4476-bd64-528082f53f33","Type":"ContainerStarted","Data":"54120a8160e2f7f1ab95d230a99be7d3d331cb94da851485b4605c2c2b9a6d80"} Apr 16 18:30:43.942228 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.942208 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fj55k" event={"ID":"1168c01f-c07f-44f9-b56f-cc88b2028e0b","Type":"ContainerStarted","Data":"aafd193867a1eba9c1908474e099b4dadaaa01e55b95bd43ccbf6021bbf0e6ad"} Apr 16 18:30:43.957107 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.954805 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-33.ec2.internal" podStartSLOduration=18.954789873 podStartE2EDuration="18.954789873s" podCreationTimestamp="2026-04-16 18:30:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:28.918936842 +0000 UTC 
m=+5.706273104" watchObservedRunningTime="2026-04-16 18:30:43.954789873 +0000 UTC m=+20.742126140" Apr 16 18:30:43.957107 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.955504 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-fctsh" podStartSLOduration=4.104445674 podStartE2EDuration="20.955493107s" podCreationTimestamp="2026-04-16 18:30:23 +0000 UTC" firstStartedPulling="2026-04-16 18:30:26.428341429 +0000 UTC m=+3.215677667" lastFinishedPulling="2026-04-16 18:30:43.279388859 +0000 UTC m=+20.066725100" observedRunningTime="2026-04-16 18:30:43.954663511 +0000 UTC m=+20.741999766" watchObservedRunningTime="2026-04-16 18:30:43.955493107 +0000 UTC m=+20.742829386" Apr 16 18:30:43.974550 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:43.974498 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-562j6" podStartSLOduration=4.104818716 podStartE2EDuration="20.974481813s" podCreationTimestamp="2026-04-16 18:30:23 +0000 UTC" firstStartedPulling="2026-04-16 18:30:26.426105482 +0000 UTC m=+3.213441730" lastFinishedPulling="2026-04-16 18:30:43.295768579 +0000 UTC m=+20.083104827" observedRunningTime="2026-04-16 18:30:43.974332478 +0000 UTC m=+20.761668734" watchObservedRunningTime="2026-04-16 18:30:43.974481813 +0000 UTC m=+20.761818070" Apr 16 18:30:44.003914 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:44.003871 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-gcdwk" podStartSLOduration=12.280098962 podStartE2EDuration="21.003856781s" podCreationTimestamp="2026-04-16 18:30:23 +0000 UTC" firstStartedPulling="2026-04-16 18:30:26.424784252 +0000 UTC m=+3.212120490" lastFinishedPulling="2026-04-16 18:30:35.148542068 +0000 UTC m=+11.935878309" observedRunningTime="2026-04-16 18:30:43.989008517 +0000 UTC m=+20.776344811" watchObservedRunningTime="2026-04-16 18:30:44.003856781 +0000 UTC 
m=+20.791193036" Apr 16 18:30:44.004143 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:44.004114 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fj55k" podStartSLOduration=4.159727179 podStartE2EDuration="21.004106484s" podCreationTimestamp="2026-04-16 18:30:23 +0000 UTC" firstStartedPulling="2026-04-16 18:30:26.432928515 +0000 UTC m=+3.220264748" lastFinishedPulling="2026-04-16 18:30:43.277307813 +0000 UTC m=+20.064644053" observedRunningTime="2026-04-16 18:30:44.00363434 +0000 UTC m=+20.790970594" watchObservedRunningTime="2026-04-16 18:30:44.004106484 +0000 UTC m=+20.791442738" Apr 16 18:30:44.039935 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:44.039841 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hkqvd" podStartSLOduration=4.527611021 podStartE2EDuration="21.03982299s" podCreationTimestamp="2026-04-16 18:30:23 +0000 UTC" firstStartedPulling="2026-04-16 18:30:26.424471615 +0000 UTC m=+3.211807865" lastFinishedPulling="2026-04-16 18:30:42.936683587 +0000 UTC m=+19.724019834" observedRunningTime="2026-04-16 18:30:44.039302989 +0000 UTC m=+20.826639244" watchObservedRunningTime="2026-04-16 18:30:44.03982299 +0000 UTC m=+20.827159244" Apr 16 18:30:44.837085 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:44.837060 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:44.837298 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:44.837266 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3" Apr 16 18:30:44.945663 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:44.945599 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-npwmv" event={"ID":"68a69615-3320-4f7f-b763-6991f367c93d","Type":"ContainerStarted","Data":"f3a519d3c47425d51e2330c30e2e2cbce3ad40a98389be081ebb53724a9e74f4"} Apr 16 18:30:44.948833 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:44.948811 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 18:30:44.949274 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:44.949243 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" event={"ID":"02533a21-4e1f-4bc0-a493-7ac7d35295b8","Type":"ContainerStarted","Data":"b4a87c995497befdcb8c52d95fa761c6328d0fcd768daad66c202d7da1313c32"} Apr 16 18:30:44.949384 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:44.949286 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" event={"ID":"02533a21-4e1f-4bc0-a493-7ac7d35295b8","Type":"ContainerStarted","Data":"c9dc706a1ab52141c8db69c2262c62bb232a003b851895cfcceb4aadf58ed307"} Apr 16 18:30:44.959403 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:44.959362 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-npwmv" podStartSLOduration=5.114472127 podStartE2EDuration="21.959345566s" podCreationTimestamp="2026-04-16 18:30:23 +0000 UTC" firstStartedPulling="2026-04-16 18:30:26.432390811 +0000 UTC m=+3.219727058" lastFinishedPulling="2026-04-16 18:30:43.277264249 +0000 UTC m=+20.064600497" observedRunningTime="2026-04-16 18:30:44.959142315 +0000 UTC m=+21.746478572" watchObservedRunningTime="2026-04-16 18:30:44.959345566 +0000 UTC 
m=+21.746681823" Apr 16 18:30:44.979313 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:44.979270 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:30:45.766207 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:45.766064 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:30:44.979289772Z","UUID":"8dc6b93c-3d7b-4114-bbaf-31c7fd1b5c86","Handler":null,"Name":"","Endpoint":""} Apr 16 18:30:45.768255 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:45.768232 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:30:45.768255 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:45.768261 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:30:45.836925 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:45.836893 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:45.837124 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:45.837019 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rp72n" podUID="dcf43e1d-4165-4661-a113-011616920ebe" Apr 16 18:30:45.953249 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:45.953206 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8" event={"ID":"08946e7b-da1a-4169-a4aa-556c72f0074e","Type":"ContainerStarted","Data":"d8301b773e50f916e6a685f37c0c5973785fbe2040d07a89798d4f71b09ca6e8"} Apr 16 18:30:46.837109 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:46.837079 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:46.837306 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:46.837223 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3" Apr 16 18:30:46.957554 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:46.957279 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8" event={"ID":"08946e7b-da1a-4169-a4aa-556c72f0074e","Type":"ContainerStarted","Data":"ff763ab827ce46f9d518a0e55d15ccc0dac4f2ef138d5adce62c0d55f352e5ec"} Apr 16 18:30:46.960452 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:46.960427 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 18:30:46.960828 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:46.960797 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" event={"ID":"02533a21-4e1f-4bc0-a493-7ac7d35295b8","Type":"ContainerStarted","Data":"37d434ddc72c124872bcbdd784e2db0bc680488254ea9d0cd10ecbce615e8458"} Apr 16 18:30:46.974606 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:46.974568 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4hbb8" podStartSLOduration=4.403460791 podStartE2EDuration="23.974556605s" podCreationTimestamp="2026-04-16 18:30:23 +0000 UTC" firstStartedPulling="2026-04-16 18:30:26.415755826 +0000 UTC m=+3.203092060" lastFinishedPulling="2026-04-16 18:30:45.98685164 +0000 UTC m=+22.774187874" observedRunningTime="2026-04-16 18:30:46.974010333 +0000 UTC m=+23.761346592" watchObservedRunningTime="2026-04-16 18:30:46.974556605 +0000 UTC m=+23.761892860" Apr 16 18:30:47.529458 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:47.529370 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-gcdwk" Apr 16 18:30:47.530096 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:47.530069 2578 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-gcdwk" Apr 16 18:30:47.836505 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:47.836426 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:47.836658 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:47.836550 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rp72n" podUID="dcf43e1d-4165-4661-a113-011616920ebe" Apr 16 18:30:48.837027 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:48.836849 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:48.837738 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:48.837134 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3" Apr 16 18:30:48.966993 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:48.966967 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 18:30:48.967303 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:48.967281 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" event={"ID":"02533a21-4e1f-4bc0-a493-7ac7d35295b8","Type":"ContainerStarted","Data":"01f1fb08cdb2df8c384665a11dca2290d78b454814cbcc4127ff4931117cc570"} Apr 16 18:30:48.967686 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:48.967666 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:48.967782 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:48.967695 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:48.967857 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:48.967838 2578 scope.go:117] "RemoveContainer" containerID="1f3772d356dffffed2479640cf3ccfcb0f172e90c7650fba5481a5363aa7371e" Apr 16 18:30:48.969044 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:48.969021 2578 generic.go:358] "Generic (PLEG): container finished" podID="3bf46358-96b5-41b1-9b21-a398d5f87d6e" containerID="54855c07140a3058e75a691e6371072d25ad382aadcda64a27676143e6d0454c" exitCode=0 Apr 16 18:30:48.969161 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:48.969056 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nv72w" event={"ID":"3bf46358-96b5-41b1-9b21-a398d5f87d6e","Type":"ContainerDied","Data":"54855c07140a3058e75a691e6371072d25ad382aadcda64a27676143e6d0454c"} Apr 16 18:30:48.984209 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:48.984170 
2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:48.984281 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:48.984271 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:49.837127 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:49.837094 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:49.837490 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:49.837222 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rp72n" podUID="dcf43e1d-4165-4661-a113-011616920ebe" Apr 16 18:30:49.972411 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:49.972382 2578 generic.go:358] "Generic (PLEG): container finished" podID="3bf46358-96b5-41b1-9b21-a398d5f87d6e" containerID="b903060d50cdc743f45e13b1fe3ea9def9a625d0d79a8828e23a96c60acfe682" exitCode=0 Apr 16 18:30:49.972572 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:49.972473 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nv72w" event={"ID":"3bf46358-96b5-41b1-9b21-a398d5f87d6e","Type":"ContainerDied","Data":"b903060d50cdc743f45e13b1fe3ea9def9a625d0d79a8828e23a96c60acfe682"} Apr 16 18:30:49.975898 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:49.975880 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 18:30:49.976259 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:49.976234 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" event={"ID":"02533a21-4e1f-4bc0-a493-7ac7d35295b8","Type":"ContainerStarted","Data":"fe869d7227b83c9db871f7867db5c823221239e7f27a2f9f7f187ed19426892a"} Apr 16 18:30:49.976530 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:49.976514 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:30:50.023578 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:50.023538 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" podStartSLOduration=10.120897875 podStartE2EDuration="27.023524771s" podCreationTimestamp="2026-04-16 18:30:23 +0000 UTC" firstStartedPulling="2026-04-16 18:30:26.43159567 +0000 UTC m=+3.218931915" lastFinishedPulling="2026-04-16 18:30:43.334222572 +0000 UTC m=+20.121558811" observedRunningTime="2026-04-16 18:30:50.023094181 +0000 UTC m=+26.810430461" watchObservedRunningTime="2026-04-16 18:30:50.023524771 +0000 UTC m=+26.810861025" Apr 16 18:30:50.359473 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:50.359390 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f4smb"] Apr 16 18:30:50.359632 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:50.359599 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:50.359781 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:50.359737 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3" Apr 16 18:30:50.360679 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:50.360654 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rp72n"] Apr 16 18:30:50.360793 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:50.360756 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:50.360846 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:50.360824 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rp72n" podUID="dcf43e1d-4165-4661-a113-011616920ebe" Apr 16 18:30:50.979737 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:50.979707 2578 generic.go:358] "Generic (PLEG): container finished" podID="3bf46358-96b5-41b1-9b21-a398d5f87d6e" containerID="10dc30ccb6653256e89a0e5176679026c742abe6f0730e21a65766a525c45239" exitCode=0 Apr 16 18:30:50.980115 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:50.979792 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nv72w" event={"ID":"3bf46358-96b5-41b1-9b21-a398d5f87d6e","Type":"ContainerDied","Data":"10dc30ccb6653256e89a0e5176679026c742abe6f0730e21a65766a525c45239"} Apr 16 18:30:51.837497 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:51.837249 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:51.837670 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:51.837288 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:51.837670 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:51.837595 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rp72n" podUID="dcf43e1d-4165-4661-a113-011616920ebe" Apr 16 18:30:51.837787 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:51.837706 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3" Apr 16 18:30:53.837985 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:53.837949 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:53.838459 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:53.838108 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rp72n" podUID="dcf43e1d-4165-4661-a113-011616920ebe" Apr 16 18:30:53.838459 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:53.838155 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:53.838459 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:53.838292 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3" Apr 16 18:30:55.634051 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:55.634011 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-gcdwk" Apr 16 18:30:55.634623 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:55.634193 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 18:30:55.634841 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:55.634797 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-gcdwk" Apr 16 18:30:55.836693 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:55.836662 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:55.836863 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:55.836705 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:55.836863 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:55.836792 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rp72n" podUID="dcf43e1d-4165-4661-a113-011616920ebe" Apr 16 18:30:55.836970 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:55.836924 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3" Apr 16 18:30:56.570137 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.570112 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-33.ec2.internal" event="NodeReady" Apr 16 18:30:56.570299 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.570245 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:30:56.628060 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.628029 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dlnfz"] Apr 16 18:30:56.642258 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.642234 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pb8c8"] Apr 16 18:30:56.642775 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.642391 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-dlnfz" Apr 16 18:30:56.645935 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.645910 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:30:56.646047 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.645988 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tdfb9\"" Apr 16 18:30:56.646291 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.646271 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:30:56.651012 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.650873 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dlnfz"] Apr 16 18:30:56.651131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.651024 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pb8c8" Apr 16 18:30:56.653559 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.653539 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pb8c8"] Apr 16 18:30:56.654016 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.653997 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:30:56.654102 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.654039 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:30:56.654646 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.654631 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6c796\"" Apr 16 18:30:56.654877 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.654862 2578 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:30:56.736188 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.736152 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz" Apr 16 18:30:56.736309 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.736202 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88fjx\" (UniqueName: \"kubernetes.io/projected/714c4beb-40ac-4478-80ff-d058fb5fd1a3-kube-api-access-88fjx\") pod \"ingress-canary-pb8c8\" (UID: \"714c4beb-40ac-4478-80ff-d058fb5fd1a3\") " pod="openshift-ingress-canary/ingress-canary-pb8c8" Apr 16 18:30:56.736309 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.736229 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn4vb\" (UniqueName: \"kubernetes.io/projected/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-kube-api-access-zn4vb\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz" Apr 16 18:30:56.736309 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.736291 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-config-volume\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz" Apr 16 18:30:56.736433 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.736329 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-tmp-dir\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz" Apr 16 18:30:56.736433 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.736366 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert\") pod \"ingress-canary-pb8c8\" (UID: \"714c4beb-40ac-4478-80ff-d058fb5fd1a3\") " pod="openshift-ingress-canary/ingress-canary-pb8c8" Apr 16 18:30:56.837628 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.837599 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-config-volume\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz" Apr 16 18:30:56.837735 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.837648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-tmp-dir\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz" Apr 16 18:30:56.837805 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.837785 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert\") pod \"ingress-canary-pb8c8\" (UID: \"714c4beb-40ac-4478-80ff-d058fb5fd1a3\") " pod="openshift-ingress-canary/ingress-canary-pb8c8" Apr 16 18:30:56.837843 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.837820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz" Apr 16 18:30:56.837880 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.837846 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88fjx\" (UniqueName: \"kubernetes.io/projected/714c4beb-40ac-4478-80ff-d058fb5fd1a3-kube-api-access-88fjx\") pod \"ingress-canary-pb8c8\" (UID: \"714c4beb-40ac-4478-80ff-d058fb5fd1a3\") " pod="openshift-ingress-canary/ingress-canary-pb8c8" Apr 16 18:30:56.837928 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.837879 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zn4vb\" (UniqueName: \"kubernetes.io/projected/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-kube-api-access-zn4vb\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz" Apr 16 18:30:56.837928 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.837915 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-tmp-dir\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz" Apr 16 18:30:56.838019 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:56.837927 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:30:56.838019 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:56.837944 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:30:56.838019 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:56.837988 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert 
podName:714c4beb-40ac-4478-80ff-d058fb5fd1a3 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:57.337971804 +0000 UTC m=+34.125308047 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert") pod "ingress-canary-pb8c8" (UID: "714c4beb-40ac-4478-80ff-d058fb5fd1a3") : secret "canary-serving-cert" not found Apr 16 18:30:56.838019 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:56.838005 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls podName:ccb22e48-3cd7-442e-ac5a-7ec7666b48e9 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:57.337996383 +0000 UTC m=+34.125332617 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls") pod "dns-default-dlnfz" (UID: "ccb22e48-3cd7-442e-ac5a-7ec7666b48e9") : secret "dns-default-metrics-tls" not found Apr 16 18:30:56.838195 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.838149 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-config-volume\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz" Apr 16 18:30:56.852093 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.852074 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn4vb\" (UniqueName: \"kubernetes.io/projected/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-kube-api-access-zn4vb\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz" Apr 16 18:30:56.852930 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.852914 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-88fjx\" (UniqueName: \"kubernetes.io/projected/714c4beb-40ac-4478-80ff-d058fb5fd1a3-kube-api-access-88fjx\") pod \"ingress-canary-pb8c8\" (UID: \"714c4beb-40ac-4478-80ff-d058fb5fd1a3\") " pod="openshift-ingress-canary/ingress-canary-pb8c8" Apr 16 18:30:56.993825 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.993798 2578 generic.go:358] "Generic (PLEG): container finished" podID="3bf46358-96b5-41b1-9b21-a398d5f87d6e" containerID="634fb52b946ecc2c7c9f53432e642c4fc2f1db060719c65dfbd241a8f75e1549" exitCode=0 Apr 16 18:30:56.993971 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:56.993835 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nv72w" event={"ID":"3bf46358-96b5-41b1-9b21-a398d5f87d6e","Type":"ContainerDied","Data":"634fb52b946ecc2c7c9f53432e642c4fc2f1db060719c65dfbd241a8f75e1549"} Apr 16 18:30:57.341851 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:57.341783 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert\") pod \"ingress-canary-pb8c8\" (UID: \"714c4beb-40ac-4478-80ff-d058fb5fd1a3\") " pod="openshift-ingress-canary/ingress-canary-pb8c8" Apr 16 18:30:57.341851 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:57.341821 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz" Apr 16 18:30:57.342073 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:57.341915 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:30:57.342073 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:57.341927 2578 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:30:57.342073 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:57.341968 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls podName:ccb22e48-3cd7-442e-ac5a-7ec7666b48e9 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:58.341953249 +0000 UTC m=+35.129289502 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls") pod "dns-default-dlnfz" (UID: "ccb22e48-3cd7-442e-ac5a-7ec7666b48e9") : secret "dns-default-metrics-tls" not found Apr 16 18:30:57.342073 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:57.341982 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert podName:714c4beb-40ac-4478-80ff-d058fb5fd1a3 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:58.341976258 +0000 UTC m=+35.129312491 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert") pod "ingress-canary-pb8c8" (UID: "714c4beb-40ac-4478-80ff-d058fb5fd1a3") : secret "canary-serving-cert" not found Apr 16 18:30:57.442643 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:57.442611 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs\") pod \"network-metrics-daemon-f4smb\" (UID: \"b62474b5-9999-4dd6-83ae-96e3bc355df3\") " pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:57.442829 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:57.442768 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:57.442892 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:57.442848 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs podName:b62474b5-9999-4dd6-83ae-96e3bc355df3 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:29.442827984 +0000 UTC m=+66.230164218 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs") pod "network-metrics-daemon-f4smb" (UID: "b62474b5-9999-4dd6-83ae-96e3bc355df3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:57.543098 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:57.543068 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dj56w\" (UniqueName: \"kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w\") pod \"network-check-target-rp72n\" (UID: \"dcf43e1d-4165-4661-a113-011616920ebe\") " pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:57.543325 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:57.543207 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:57.543325 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:57.543224 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:57.543325 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:57.543235 2578 projected.go:194] Error preparing data for projected volume kube-api-access-dj56w for pod openshift-network-diagnostics/network-check-target-rp72n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:57.543325 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:57.543280 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w podName:dcf43e1d-4165-4661-a113-011616920ebe nodeName:}" failed. 
No retries permitted until 2026-04-16 18:31:29.543266831 +0000 UTC m=+66.330603068 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-dj56w" (UniqueName: "kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w") pod "network-check-target-rp72n" (UID: "dcf43e1d-4165-4661-a113-011616920ebe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:57.839462 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:57.839431 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:30:57.840094 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:57.839431 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:30:57.842607 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:57.842579 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:30:57.842803 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:57.842777 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:30:57.842903 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:57.842882 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:30:57.843047 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:57.842992 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kxnrw\"" Apr 16 18:30:57.843855 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:57.843842 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7lgq2\"" Apr 16 18:30:57.998415 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:57.998389 2578 generic.go:358] "Generic (PLEG): container finished" podID="3bf46358-96b5-41b1-9b21-a398d5f87d6e" containerID="d96672f2d09210017e7e53ca73edf90197e2ca41d447eea6c0a26fb0a9628f28" exitCode=0 Apr 16 18:30:57.998588 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:57.998426 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nv72w" event={"ID":"3bf46358-96b5-41b1-9b21-a398d5f87d6e","Type":"ContainerDied","Data":"d96672f2d09210017e7e53ca73edf90197e2ca41d447eea6c0a26fb0a9628f28"} Apr 16 18:30:58.347927 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:58.347711 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert\") pod \"ingress-canary-pb8c8\" (UID: \"714c4beb-40ac-4478-80ff-d058fb5fd1a3\") " pod="openshift-ingress-canary/ingress-canary-pb8c8" Apr 16 18:30:58.347927 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:58.347897 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz" Apr 16 18:30:58.348092 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:58.347850 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:30:58.348092 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:58.347969 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:30:58.348092 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:58.348011 2578 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert podName:714c4beb-40ac-4478-80ff-d058fb5fd1a3 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:00.347996049 +0000 UTC m=+37.135332282 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert") pod "ingress-canary-pb8c8" (UID: "714c4beb-40ac-4478-80ff-d058fb5fd1a3") : secret "canary-serving-cert" not found Apr 16 18:30:58.348092 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:30:58.348026 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls podName:ccb22e48-3cd7-442e-ac5a-7ec7666b48e9 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:00.348020341 +0000 UTC m=+37.135356574 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls") pod "dns-default-dlnfz" (UID: "ccb22e48-3cd7-442e-ac5a-7ec7666b48e9") : secret "dns-default-metrics-tls" not found Apr 16 18:30:59.004041 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:59.004013 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nv72w" event={"ID":"3bf46358-96b5-41b1-9b21-a398d5f87d6e","Type":"ContainerStarted","Data":"4c8b663485d453762f3660a888c105c1c898193e67742f8e1b8c14d884c09776"} Apr 16 18:30:59.031524 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:30:59.031464 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nv72w" podStartSLOduration=6.061418241 podStartE2EDuration="36.031448713s" podCreationTimestamp="2026-04-16 18:30:23 +0000 UTC" firstStartedPulling="2026-04-16 18:30:26.430433578 +0000 UTC m=+3.217769812" lastFinishedPulling="2026-04-16 18:30:56.400464034 +0000 UTC m=+33.187800284" 
observedRunningTime="2026-04-16 18:30:59.029633997 +0000 UTC m=+35.816970253" watchObservedRunningTime="2026-04-16 18:30:59.031448713 +0000 UTC m=+35.818784968" Apr 16 18:31:00.363305 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:00.363265 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert\") pod \"ingress-canary-pb8c8\" (UID: \"714c4beb-40ac-4478-80ff-d058fb5fd1a3\") " pod="openshift-ingress-canary/ingress-canary-pb8c8" Apr 16 18:31:00.363305 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:00.363309 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz" Apr 16 18:31:00.363712 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:00.363407 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:00.363712 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:00.363412 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:00.363712 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:00.363459 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls podName:ccb22e48-3cd7-442e-ac5a-7ec7666b48e9 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:04.363444919 +0000 UTC m=+41.150781152 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls") pod "dns-default-dlnfz" (UID: "ccb22e48-3cd7-442e-ac5a-7ec7666b48e9") : secret "dns-default-metrics-tls" not found Apr 16 18:31:00.363712 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:00.363472 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert podName:714c4beb-40ac-4478-80ff-d058fb5fd1a3 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:04.363467202 +0000 UTC m=+41.150803435 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert") pod "ingress-canary-pb8c8" (UID: "714c4beb-40ac-4478-80ff-d058fb5fd1a3") : secret "canary-serving-cert" not found Apr 16 18:31:04.393631 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:04.393591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert\") pod \"ingress-canary-pb8c8\" (UID: \"714c4beb-40ac-4478-80ff-d058fb5fd1a3\") " pod="openshift-ingress-canary/ingress-canary-pb8c8" Apr 16 18:31:04.394056 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:04.393639 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz" Apr 16 18:31:04.394056 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:04.393736 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:04.394056 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:04.393752 2578 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:04.394056 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:04.393798 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert podName:714c4beb-40ac-4478-80ff-d058fb5fd1a3 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:12.393782967 +0000 UTC m=+49.181119201 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert") pod "ingress-canary-pb8c8" (UID: "714c4beb-40ac-4478-80ff-d058fb5fd1a3") : secret "canary-serving-cert" not found Apr 16 18:31:04.394056 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:04.393813 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls podName:ccb22e48-3cd7-442e-ac5a-7ec7666b48e9 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:12.393806726 +0000 UTC m=+49.181142960 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls") pod "dns-default-dlnfz" (UID: "ccb22e48-3cd7-442e-ac5a-7ec7666b48e9") : secret "dns-default-metrics-tls" not found Apr 16 18:31:12.446402 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:12.446351 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert\") pod \"ingress-canary-pb8c8\" (UID: \"714c4beb-40ac-4478-80ff-d058fb5fd1a3\") " pod="openshift-ingress-canary/ingress-canary-pb8c8" Apr 16 18:31:12.446402 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:12.446406 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz" Apr 16 18:31:12.446896 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:12.446497 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:12.446896 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:12.446501 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:12.446896 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:12.446547 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls podName:ccb22e48-3cd7-442e-ac5a-7ec7666b48e9 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:28.446533629 +0000 UTC m=+65.233869863 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls") pod "dns-default-dlnfz" (UID: "ccb22e48-3cd7-442e-ac5a-7ec7666b48e9") : secret "dns-default-metrics-tls" not found Apr 16 18:31:12.446896 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:12.446562 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert podName:714c4beb-40ac-4478-80ff-d058fb5fd1a3 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:28.44655554 +0000 UTC m=+65.233891773 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert") pod "ingress-canary-pb8c8" (UID: "714c4beb-40ac-4478-80ff-d058fb5fd1a3") : secret "canary-serving-cert" not found Apr 16 18:31:20.990418 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:20.990385 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fdzzg" Apr 16 18:31:28.455517 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:28.455476 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert\") pod \"ingress-canary-pb8c8\" (UID: \"714c4beb-40ac-4478-80ff-d058fb5fd1a3\") " pod="openshift-ingress-canary/ingress-canary-pb8c8" Apr 16 18:31:28.455961 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:28.455524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz" Apr 16 18:31:28.455961 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:28.455628 2578 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:28.455961 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:28.455672 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:28.455961 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:28.455700 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert podName:714c4beb-40ac-4478-80ff-d058fb5fd1a3 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:00.455684515 +0000 UTC m=+97.243020748 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert") pod "ingress-canary-pb8c8" (UID: "714c4beb-40ac-4478-80ff-d058fb5fd1a3") : secret "canary-serving-cert" not found Apr 16 18:31:28.455961 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:28.455727 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls podName:ccb22e48-3cd7-442e-ac5a-7ec7666b48e9 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:00.455712824 +0000 UTC m=+97.243049062 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls") pod "dns-default-dlnfz" (UID: "ccb22e48-3cd7-442e-ac5a-7ec7666b48e9") : secret "dns-default-metrics-tls" not found Apr 16 18:31:29.463170 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:29.463130 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs\") pod \"network-metrics-daemon-f4smb\" (UID: \"b62474b5-9999-4dd6-83ae-96e3bc355df3\") " pod="openshift-multus/network-metrics-daemon-f4smb" Apr 16 18:31:29.465904 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:29.465887 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:31:29.474068 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:29.474051 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:31:29.474113 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:31:29.474105 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs podName:b62474b5-9999-4dd6-83ae-96e3bc355df3 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:33.474091414 +0000 UTC m=+130.261427648 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs") pod "network-metrics-daemon-f4smb" (UID: "b62474b5-9999-4dd6-83ae-96e3bc355df3") : secret "metrics-daemon-secret" not found Apr 16 18:31:29.563573 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:29.563534 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dj56w\" (UniqueName: \"kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w\") pod \"network-check-target-rp72n\" (UID: \"dcf43e1d-4165-4661-a113-011616920ebe\") " pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:31:29.566161 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:29.566144 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:31:29.576694 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:29.576668 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:31:29.588100 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:29.588069 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj56w\" (UniqueName: \"kubernetes.io/projected/dcf43e1d-4165-4661-a113-011616920ebe-kube-api-access-dj56w\") pod \"network-check-target-rp72n\" (UID: \"dcf43e1d-4165-4661-a113-011616920ebe\") " pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:31:29.655715 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:29.655689 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7lgq2\"" Apr 16 18:31:29.664100 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:29.664085 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:31:29.844357 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:29.844321 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rp72n"] Apr 16 18:31:29.848253 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:31:29.848230 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcf43e1d_4165_4661_a113_011616920ebe.slice/crio-2fb52538503f5b9c344740fa3db3f518a8e44f1d1198c5091b488d03aff3a58d WatchSource:0}: Error finding container 2fb52538503f5b9c344740fa3db3f518a8e44f1d1198c5091b488d03aff3a58d: Status 404 returned error can't find the container with id 2fb52538503f5b9c344740fa3db3f518a8e44f1d1198c5091b488d03aff3a58d Apr 16 18:31:30.063364 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:30.063287 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rp72n" event={"ID":"dcf43e1d-4165-4661-a113-011616920ebe","Type":"ContainerStarted","Data":"2fb52538503f5b9c344740fa3db3f518a8e44f1d1198c5091b488d03aff3a58d"} Apr 16 18:31:33.070701 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:33.070667 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rp72n" event={"ID":"dcf43e1d-4165-4661-a113-011616920ebe","Type":"ContainerStarted","Data":"2f4a4e8c9bfd107735d72f277b39616cd89e6013bd863445420aff4b3f39399e"} Apr 16 18:31:33.071119 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:33.070896 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rp72n" Apr 16 18:31:33.085323 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:31:33.085283 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rp72n" 
podStartSLOduration=67.390718776 podStartE2EDuration="1m10.085271986s" podCreationTimestamp="2026-04-16 18:30:23 +0000 UTC" firstStartedPulling="2026-04-16 18:31:29.850363482 +0000 UTC m=+66.637699717" lastFinishedPulling="2026-04-16 18:31:32.544916694 +0000 UTC m=+69.332252927" observedRunningTime="2026-04-16 18:31:33.084903247 +0000 UTC m=+69.872239502" watchObservedRunningTime="2026-04-16 18:31:33.085271986 +0000 UTC m=+69.872608244"
Apr 16 18:32:00.459569 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:00.459534 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz"
Apr 16 18:32:00.460037 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:00.459599 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert\") pod \"ingress-canary-pb8c8\" (UID: \"714c4beb-40ac-4478-80ff-d058fb5fd1a3\") " pod="openshift-ingress-canary/ingress-canary-pb8c8"
Apr 16 18:32:00.460037 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:00.459684 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:32:00.460037 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:00.459700 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:32:00.460037 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:00.459747 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert podName:714c4beb-40ac-4478-80ff-d058fb5fd1a3 nodeName:}" failed. No retries permitted until 2026-04-16 18:33:04.459732417 +0000 UTC m=+161.247068655 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert") pod "ingress-canary-pb8c8" (UID: "714c4beb-40ac-4478-80ff-d058fb5fd1a3") : secret "canary-serving-cert" not found
Apr 16 18:32:00.460037 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:00.459778 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls podName:ccb22e48-3cd7-442e-ac5a-7ec7666b48e9 nodeName:}" failed. No retries permitted until 2026-04-16 18:33:04.45975889 +0000 UTC m=+161.247095142 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls") pod "dns-default-dlnfz" (UID: "ccb22e48-3cd7-442e-ac5a-7ec7666b48e9") : secret "dns-default-metrics-tls" not found
Apr 16 18:32:04.076729 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:04.076695 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rp72n"
Apr 16 18:32:31.594790 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.594753 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gdpt2"]
Apr 16 18:32:31.596561 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.596547 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gdpt2"
Apr 16 18:32:31.599461 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.599439 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:32:31.599570 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.599442 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 16 18:32:31.601004 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.600977 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-4dxjh\""
Apr 16 18:32:31.608987 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.608968 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gdpt2"]
Apr 16 18:32:31.666719 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.666687 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwm65\" (UniqueName: \"kubernetes.io/projected/305a7e75-9ebd-4072-9b0a-9eff1f2ca870-kube-api-access-bwm65\") pod \"volume-data-source-validator-7d955d5dd4-gdpt2\" (UID: \"305a7e75-9ebd-4072-9b0a-9eff1f2ca870\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gdpt2"
Apr 16 18:32:31.699161 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.699133 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4"]
Apr 16 18:32:31.701005 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.700991 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4"
Apr 16 18:32:31.703467 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.703448 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-sbljq\""
Apr 16 18:32:31.703645 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.703624 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 16 18:32:31.703854 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.703840 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 18:32:31.704913 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.704899 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 18:32:31.705384 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.705367 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 16 18:32:31.715782 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.715761 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4"]
Apr 16 18:32:31.767088 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.767058 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwm65\" (UniqueName: \"kubernetes.io/projected/305a7e75-9ebd-4072-9b0a-9eff1f2ca870-kube-api-access-bwm65\") pod \"volume-data-source-validator-7d955d5dd4-gdpt2\" (UID: \"305a7e75-9ebd-4072-9b0a-9eff1f2ca870\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gdpt2"
Apr 16 18:32:31.767245 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.767111 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-dfzr4\" (UID: \"bdf8d744-507e-4943-8866-e60d7c582151\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4"
Apr 16 18:32:31.767292 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.767257 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/bdf8d744-507e-4943-8866-e60d7c582151-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-dfzr4\" (UID: \"bdf8d744-507e-4943-8866-e60d7c582151\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4"
Apr 16 18:32:31.767292 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.767283 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzvc9\" (UniqueName: \"kubernetes.io/projected/bdf8d744-507e-4943-8866-e60d7c582151-kube-api-access-gzvc9\") pod \"cluster-monitoring-operator-6667474d89-dfzr4\" (UID: \"bdf8d744-507e-4943-8866-e60d7c582151\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4"
Apr 16 18:32:31.774785 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.774769 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwm65\" (UniqueName: \"kubernetes.io/projected/305a7e75-9ebd-4072-9b0a-9eff1f2ca870-kube-api-access-bwm65\") pod \"volume-data-source-validator-7d955d5dd4-gdpt2\" (UID: \"305a7e75-9ebd-4072-9b0a-9eff1f2ca870\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gdpt2"
Apr 16 18:32:31.805110 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.805089 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5b8555f68f-6s9wq"]
Apr 16 18:32:31.806868 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.806856 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.809932 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.809914 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 18:32:31.810054 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.809917 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-r7rs7\""
Apr 16 18:32:31.810242 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.810229 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 18:32:31.810506 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.810490 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 18:32:31.815696 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.815681 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 18:32:31.824464 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.824438 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5b8555f68f-6s9wq"]
Apr 16 18:32:31.868461 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.868391 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/bdf8d744-507e-4943-8866-e60d7c582151-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-dfzr4\" (UID: \"bdf8d744-507e-4943-8866-e60d7c582151\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4"
Apr 16 18:32:31.868461 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.868418 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzvc9\" (UniqueName: \"kubernetes.io/projected/bdf8d744-507e-4943-8866-e60d7c582151-kube-api-access-gzvc9\") pod \"cluster-monitoring-operator-6667474d89-dfzr4\" (UID: \"bdf8d744-507e-4943-8866-e60d7c582151\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4"
Apr 16 18:32:31.868461 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.868442 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-certificates\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.868461 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.868457 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8vdv\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-kube-api-access-k8vdv\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.868687 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.868480 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-trusted-ca\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.868687 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.868544 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.868687 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.868569 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-bound-sa-token\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.868687 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.868639 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-dfzr4\" (UID: \"bdf8d744-507e-4943-8866-e60d7c582151\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4"
Apr 16 18:32:31.868687 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.868678 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-installation-pull-secrets\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.868848 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.868734 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-image-registry-private-configuration\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.868848 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.868765 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-ca-trust-extracted\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.868848 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:31.868769 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:31.868933 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:31.868860 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls podName:bdf8d744-507e-4943-8866-e60d7c582151 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:32.368845914 +0000 UTC m=+129.156182161 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-dfzr4" (UID: "bdf8d744-507e-4943-8866-e60d7c582151") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:31.869255 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.869238 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/bdf8d744-507e-4943-8866-e60d7c582151-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-dfzr4\" (UID: \"bdf8d744-507e-4943-8866-e60d7c582151\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4"
Apr 16 18:32:31.878437 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.878411 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzvc9\" (UniqueName: \"kubernetes.io/projected/bdf8d744-507e-4943-8866-e60d7c582151-kube-api-access-gzvc9\") pod \"cluster-monitoring-operator-6667474d89-dfzr4\" (UID: \"bdf8d744-507e-4943-8866-e60d7c582151\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4"
Apr 16 18:32:31.907256 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.907230 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gdpt2"
Apr 16 18:32:31.969372 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.969339 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.969522 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.969388 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-bound-sa-token\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.969522 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:31.969458 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:32:31.969522 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:31.969472 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b8555f68f-6s9wq: secret "image-registry-tls" not found
Apr 16 18:32:31.969662 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:31.969560 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls podName:eb710a5a-c4ac-49d9-90fc-dd7e54250a60 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:32.469540901 +0000 UTC m=+129.256877149 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls") pod "image-registry-5b8555f68f-6s9wq" (UID: "eb710a5a-c4ac-49d9-90fc-dd7e54250a60") : secret "image-registry-tls" not found
Apr 16 18:32:31.969662 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.969457 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-installation-pull-secrets\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.969769 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.969661 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-image-registry-private-configuration\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.969769 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.969702 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-ca-trust-extracted\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.969769 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.969750 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-certificates\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.969910 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.969842 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8vdv\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-kube-api-access-k8vdv\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.969910 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.969899 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-trusted-ca\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.970635 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.970609 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-ca-trust-extracted\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.970859 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.970841 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-certificates\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.971015 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.970995 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-trusted-ca\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.971907 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.971884 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-installation-pull-secrets\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.972002 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.971947 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-image-registry-private-configuration\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.981263 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.980988 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-bound-sa-token\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:31.981369 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:31.981304 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8vdv\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-kube-api-access-k8vdv\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:32.022332 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:32.022296 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gdpt2"]
Apr 16 18:32:32.025527 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:32:32.025477 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod305a7e75_9ebd_4072_9b0a_9eff1f2ca870.slice/crio-d437a811f77abe66d2ec703f747ff8e42bcbdbd479d7a2d133963d355b8d3e13 WatchSource:0}: Error finding container d437a811f77abe66d2ec703f747ff8e42bcbdbd479d7a2d133963d355b8d3e13: Status 404 returned error can't find the container with id d437a811f77abe66d2ec703f747ff8e42bcbdbd479d7a2d133963d355b8d3e13
Apr 16 18:32:32.180361 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:32.180286 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gdpt2" event={"ID":"305a7e75-9ebd-4072-9b0a-9eff1f2ca870","Type":"ContainerStarted","Data":"d437a811f77abe66d2ec703f747ff8e42bcbdbd479d7a2d133963d355b8d3e13"}
Apr 16 18:32:32.372760 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:32.372710 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-dfzr4\" (UID: \"bdf8d744-507e-4943-8866-e60d7c582151\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4"
Apr 16 18:32:32.372915 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:32.372861 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:32.372970 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:32.372928 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls podName:bdf8d744-507e-4943-8866-e60d7c582151 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:33.372912013 +0000 UTC m=+130.160248247 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-dfzr4" (UID: "bdf8d744-507e-4943-8866-e60d7c582151") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:32.474137 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:32.474106 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:32.474313 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:32.474284 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:32:32.474313 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:32.474302 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b8555f68f-6s9wq: secret "image-registry-tls" not found
Apr 16 18:32:32.474431 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:32.474363 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls podName:eb710a5a-c4ac-49d9-90fc-dd7e54250a60 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:33.474345551 +0000 UTC m=+130.261681784 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls") pod "image-registry-5b8555f68f-6s9wq" (UID: "eb710a5a-c4ac-49d9-90fc-dd7e54250a60") : secret "image-registry-tls" not found
Apr 16 18:32:33.183148 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.183111 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gdpt2" event={"ID":"305a7e75-9ebd-4072-9b0a-9eff1f2ca870","Type":"ContainerStarted","Data":"499017031ed01937ed2bbcee9582bab69ca452ff9fa585509369ae0c94ea2e90"}
Apr 16 18:32:33.199325 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.199263 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gdpt2" podStartSLOduration=1.136845108 podStartE2EDuration="2.199247013s" podCreationTimestamp="2026-04-16 18:32:31 +0000 UTC" firstStartedPulling="2026-04-16 18:32:32.027120804 +0000 UTC m=+128.814457038" lastFinishedPulling="2026-04-16 18:32:33.089522706 +0000 UTC m=+129.876858943" observedRunningTime="2026-04-16 18:32:33.198300967 +0000 UTC m=+129.985637235" watchObservedRunningTime="2026-04-16 18:32:33.199247013 +0000 UTC m=+129.986583271"
Apr 16 18:32:33.382841 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.382738 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-dfzr4\" (UID: \"bdf8d744-507e-4943-8866-e60d7c582151\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4"
Apr 16 18:32:33.383028 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:33.382857 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:33.383028 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:33.382913 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls podName:bdf8d744-507e-4943-8866-e60d7c582151 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:35.382899111 +0000 UTC m=+132.170235344 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-dfzr4" (UID: "bdf8d744-507e-4943-8866-e60d7c582151") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:33.484089 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.484057 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:32:33.484195 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.484095 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs\") pod \"network-metrics-daemon-f4smb\" (UID: \"b62474b5-9999-4dd6-83ae-96e3bc355df3\") " pod="openshift-multus/network-metrics-daemon-f4smb"
Apr 16 18:32:33.484231 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:33.484205 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:32:33.484267 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:33.484229 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:32:33.484267 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:33.484250 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs podName:b62474b5-9999-4dd6-83ae-96e3bc355df3 nodeName:}" failed. No retries permitted until 2026-04-16 18:34:35.484236911 +0000 UTC m=+252.271573144 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs") pod "network-metrics-daemon-f4smb" (UID: "b62474b5-9999-4dd6-83ae-96e3bc355df3") : secret "metrics-daemon-secret" not found
Apr 16 18:32:33.484267 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:33.484250 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b8555f68f-6s9wq: secret "image-registry-tls" not found
Apr 16 18:32:33.484388 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:33.484315 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls podName:eb710a5a-c4ac-49d9-90fc-dd7e54250a60 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:35.484300699 +0000 UTC m=+132.271636936 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls") pod "image-registry-5b8555f68f-6s9wq" (UID: "eb710a5a-c4ac-49d9-90fc-dd7e54250a60") : secret "image-registry-tls" not found
Apr 16 18:32:33.654972 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.654898 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7"]
Apr 16 18:32:33.656810 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.656796 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7"
Apr 16 18:32:33.659352 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.659326 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 18:32:33.659484 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.659354 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 18:32:33.659484 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.659416 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 18:32:33.659484 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.659423 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-ft8kk\""
Apr 16 18:32:33.659484 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.659438 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:32:33.667661 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.667638 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7"]
Apr 16 18:32:33.786470 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.786445 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d78602f6-4841-4d81-8a43-e0c53bc9137b-serving-cert\") pod \"service-ca-operator-69965bb79d-jlpz7\" (UID: \"d78602f6-4841-4d81-8a43-e0c53bc9137b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7"
Apr 16 18:32:33.786639 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.786522 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctpb6\" (UniqueName: \"kubernetes.io/projected/d78602f6-4841-4d81-8a43-e0c53bc9137b-kube-api-access-ctpb6\") pod \"service-ca-operator-69965bb79d-jlpz7\" (UID: \"d78602f6-4841-4d81-8a43-e0c53bc9137b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7"
Apr 16 18:32:33.786639 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.786552 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d78602f6-4841-4d81-8a43-e0c53bc9137b-config\") pod \"service-ca-operator-69965bb79d-jlpz7\" (UID: \"d78602f6-4841-4d81-8a43-e0c53bc9137b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7"
Apr 16 18:32:33.887847 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.887810 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctpb6\" (UniqueName: \"kubernetes.io/projected/d78602f6-4841-4d81-8a43-e0c53bc9137b-kube-api-access-ctpb6\") pod \"service-ca-operator-69965bb79d-jlpz7\" (UID: \"d78602f6-4841-4d81-8a43-e0c53bc9137b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7"
Apr 16 18:32:33.887847 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.887851 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d78602f6-4841-4d81-8a43-e0c53bc9137b-config\") pod \"service-ca-operator-69965bb79d-jlpz7\" (UID: \"d78602f6-4841-4d81-8a43-e0c53bc9137b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7"
Apr 16 18:32:33.888039 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.887946 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d78602f6-4841-4d81-8a43-e0c53bc9137b-serving-cert\") pod
\"service-ca-operator-69965bb79d-jlpz7\" (UID: \"d78602f6-4841-4d81-8a43-e0c53bc9137b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7" Apr 16 18:32:33.888328 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.888306 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d78602f6-4841-4d81-8a43-e0c53bc9137b-config\") pod \"service-ca-operator-69965bb79d-jlpz7\" (UID: \"d78602f6-4841-4d81-8a43-e0c53bc9137b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7" Apr 16 18:32:33.890145 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.890116 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d78602f6-4841-4d81-8a43-e0c53bc9137b-serving-cert\") pod \"service-ca-operator-69965bb79d-jlpz7\" (UID: \"d78602f6-4841-4d81-8a43-e0c53bc9137b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7" Apr 16 18:32:33.896043 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.896015 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctpb6\" (UniqueName: \"kubernetes.io/projected/d78602f6-4841-4d81-8a43-e0c53bc9137b-kube-api-access-ctpb6\") pod \"service-ca-operator-69965bb79d-jlpz7\" (UID: \"d78602f6-4841-4d81-8a43-e0c53bc9137b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7" Apr 16 18:32:33.965198 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:33.965096 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7" Apr 16 18:32:34.076193 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:34.076154 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7"] Apr 16 18:32:34.079722 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:32:34.079700 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd78602f6_4841_4d81_8a43_e0c53bc9137b.slice/crio-a9d9ed4a7e6eb351bcc63c3b8031a1f445d53c9145f5690704e4ffc3ecfd6a3c WatchSource:0}: Error finding container a9d9ed4a7e6eb351bcc63c3b8031a1f445d53c9145f5690704e4ffc3ecfd6a3c: Status 404 returned error can't find the container with id a9d9ed4a7e6eb351bcc63c3b8031a1f445d53c9145f5690704e4ffc3ecfd6a3c Apr 16 18:32:34.186365 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:34.186331 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7" event={"ID":"d78602f6-4841-4d81-8a43-e0c53bc9137b","Type":"ContainerStarted","Data":"a9d9ed4a7e6eb351bcc63c3b8031a1f445d53c9145f5690704e4ffc3ecfd6a3c"} Apr 16 18:32:35.401071 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:35.401034 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-dfzr4\" (UID: \"bdf8d744-507e-4943-8866-e60d7c582151\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4" Apr 16 18:32:35.401531 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:35.401151 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:32:35.401531 ip-10-0-139-33 kubenswrapper[2578]: E0416 
18:32:35.401233 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls podName:bdf8d744-507e-4943-8866-e60d7c582151 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:39.401217676 +0000 UTC m=+136.188553909 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-dfzr4" (UID: "bdf8d744-507e-4943-8866-e60d7c582151") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:32:35.501853 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:35.501800 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq" Apr 16 18:32:35.502046 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:35.501969 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:32:35.502046 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:35.501993 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b8555f68f-6s9wq: secret "image-registry-tls" not found Apr 16 18:32:35.502152 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:35.502057 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls podName:eb710a5a-c4ac-49d9-90fc-dd7e54250a60 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:39.502036102 +0000 UTC m=+136.289372350 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls") pod "image-registry-5b8555f68f-6s9wq" (UID: "eb710a5a-c4ac-49d9-90fc-dd7e54250a60") : secret "image-registry-tls" not found Apr 16 18:32:36.177718 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:36.177683 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-7hlch"] Apr 16 18:32:36.179659 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:36.179645 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7hlch" Apr 16 18:32:36.182225 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:36.182202 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 18:32:36.182364 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:36.182348 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-zqjhl\"" Apr 16 18:32:36.182409 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:36.182382 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 18:32:36.189861 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:36.189842 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-7hlch"] Apr 16 18:32:36.191416 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:36.191397 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7" event={"ID":"d78602f6-4841-4d81-8a43-e0c53bc9137b","Type":"ContainerStarted","Data":"28064003cf49ee88a2bd1a920fd288081d1ab0565f0fb8273a77851d6ffcc6c8"} 
Apr 16 18:32:36.222009 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:36.221965 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7" podStartSLOduration=1.717204493 podStartE2EDuration="3.22195323s" podCreationTimestamp="2026-04-16 18:32:33 +0000 UTC" firstStartedPulling="2026-04-16 18:32:34.081557243 +0000 UTC m=+130.868893476" lastFinishedPulling="2026-04-16 18:32:35.586305979 +0000 UTC m=+132.373642213" observedRunningTime="2026-04-16 18:32:36.221232512 +0000 UTC m=+133.008568770" watchObservedRunningTime="2026-04-16 18:32:36.22195323 +0000 UTC m=+133.009289486" Apr 16 18:32:36.308039 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:36.308007 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnrb9\" (UniqueName: \"kubernetes.io/projected/a8092b73-120e-4d31-8d9c-2567ffdcad38-kube-api-access-qnrb9\") pod \"migrator-64d4d94569-7hlch\" (UID: \"a8092b73-120e-4d31-8d9c-2567ffdcad38\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7hlch" Apr 16 18:32:36.409447 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:36.409390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnrb9\" (UniqueName: \"kubernetes.io/projected/a8092b73-120e-4d31-8d9c-2567ffdcad38-kube-api-access-qnrb9\") pod \"migrator-64d4d94569-7hlch\" (UID: \"a8092b73-120e-4d31-8d9c-2567ffdcad38\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7hlch" Apr 16 18:32:36.417990 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:36.417962 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnrb9\" (UniqueName: \"kubernetes.io/projected/a8092b73-120e-4d31-8d9c-2567ffdcad38-kube-api-access-qnrb9\") pod \"migrator-64d4d94569-7hlch\" (UID: \"a8092b73-120e-4d31-8d9c-2567ffdcad38\") " 
pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7hlch" Apr 16 18:32:36.488035 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:36.488006 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7hlch" Apr 16 18:32:36.619027 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:36.618999 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-7hlch"] Apr 16 18:32:36.621519 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:32:36.621484 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8092b73_120e_4d31_8d9c_2567ffdcad38.slice/crio-893417c4d0ad9ae32841f0bfa990c40ff81899b43cd5999129e7fe076792ba0c WatchSource:0}: Error finding container 893417c4d0ad9ae32841f0bfa990c40ff81899b43cd5999129e7fe076792ba0c: Status 404 returned error can't find the container with id 893417c4d0ad9ae32841f0bfa990c40ff81899b43cd5999129e7fe076792ba0c Apr 16 18:32:37.194985 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:37.194942 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7hlch" event={"ID":"a8092b73-120e-4d31-8d9c-2567ffdcad38","Type":"ContainerStarted","Data":"893417c4d0ad9ae32841f0bfa990c40ff81899b43cd5999129e7fe076792ba0c"} Apr 16 18:32:38.199536 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:38.199493 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7hlch" event={"ID":"a8092b73-120e-4d31-8d9c-2567ffdcad38","Type":"ContainerStarted","Data":"643958b5f32fd1be6e87cf9d7e1673fbc1879bcc6d33ed533b8d76542a33c241"} Apr 16 18:32:38.199536 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:38.199538 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7hlch" 
event={"ID":"a8092b73-120e-4d31-8d9c-2567ffdcad38","Type":"ContainerStarted","Data":"64812fc9fa20b7c63a92b6b6a834fe2373268b95ff3c7e3955db9445ac8ac196"} Apr 16 18:32:38.218226 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:38.218158 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-7hlch" podStartSLOduration=1.292612083 podStartE2EDuration="2.218141102s" podCreationTimestamp="2026-04-16 18:32:36 +0000 UTC" firstStartedPulling="2026-04-16 18:32:36.623369262 +0000 UTC m=+133.410705495" lastFinishedPulling="2026-04-16 18:32:37.548898264 +0000 UTC m=+134.336234514" observedRunningTime="2026-04-16 18:32:38.215845588 +0000 UTC m=+135.003181844" watchObservedRunningTime="2026-04-16 18:32:38.218141102 +0000 UTC m=+135.005477359" Apr 16 18:32:39.023538 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.023503 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-vqb95"] Apr 16 18:32:39.025367 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.025353 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-vqb95" Apr 16 18:32:39.028038 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.028013 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-7x995\"" Apr 16 18:32:39.028158 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.028050 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 18:32:39.029240 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.029222 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 18:32:39.029339 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.029242 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 18:32:39.029339 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.029224 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 18:32:39.033129 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.033111 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-vqb95"] Apr 16 18:32:39.128276 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.128239 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv7sz\" (UniqueName: \"kubernetes.io/projected/c991dcea-e9e7-4521-9e85-77689aaf560a-kube-api-access-rv7sz\") pod \"service-ca-bfc587fb7-vqb95\" (UID: \"c991dcea-e9e7-4521-9e85-77689aaf560a\") " pod="openshift-service-ca/service-ca-bfc587fb7-vqb95" Apr 16 18:32:39.128442 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.128290 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/c991dcea-e9e7-4521-9e85-77689aaf560a-signing-key\") pod \"service-ca-bfc587fb7-vqb95\" (UID: \"c991dcea-e9e7-4521-9e85-77689aaf560a\") " pod="openshift-service-ca/service-ca-bfc587fb7-vqb95" Apr 16 18:32:39.128442 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.128380 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c991dcea-e9e7-4521-9e85-77689aaf560a-signing-cabundle\") pod \"service-ca-bfc587fb7-vqb95\" (UID: \"c991dcea-e9e7-4521-9e85-77689aaf560a\") " pod="openshift-service-ca/service-ca-bfc587fb7-vqb95" Apr 16 18:32:39.228813 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.228772 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c991dcea-e9e7-4521-9e85-77689aaf560a-signing-cabundle\") pod \"service-ca-bfc587fb7-vqb95\" (UID: \"c991dcea-e9e7-4521-9e85-77689aaf560a\") " pod="openshift-service-ca/service-ca-bfc587fb7-vqb95" Apr 16 18:32:39.229170 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.228838 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rv7sz\" (UniqueName: \"kubernetes.io/projected/c991dcea-e9e7-4521-9e85-77689aaf560a-kube-api-access-rv7sz\") pod \"service-ca-bfc587fb7-vqb95\" (UID: \"c991dcea-e9e7-4521-9e85-77689aaf560a\") " pod="openshift-service-ca/service-ca-bfc587fb7-vqb95" Apr 16 18:32:39.229170 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.228867 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c991dcea-e9e7-4521-9e85-77689aaf560a-signing-key\") pod \"service-ca-bfc587fb7-vqb95\" (UID: \"c991dcea-e9e7-4521-9e85-77689aaf560a\") " pod="openshift-service-ca/service-ca-bfc587fb7-vqb95" Apr 16 18:32:39.229398 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.229378 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c991dcea-e9e7-4521-9e85-77689aaf560a-signing-cabundle\") pod \"service-ca-bfc587fb7-vqb95\" (UID: \"c991dcea-e9e7-4521-9e85-77689aaf560a\") " pod="openshift-service-ca/service-ca-bfc587fb7-vqb95" Apr 16 18:32:39.231138 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.231119 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c991dcea-e9e7-4521-9e85-77689aaf560a-signing-key\") pod \"service-ca-bfc587fb7-vqb95\" (UID: \"c991dcea-e9e7-4521-9e85-77689aaf560a\") " pod="openshift-service-ca/service-ca-bfc587fb7-vqb95" Apr 16 18:32:39.237416 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.237392 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv7sz\" (UniqueName: \"kubernetes.io/projected/c991dcea-e9e7-4521-9e85-77689aaf560a-kube-api-access-rv7sz\") pod \"service-ca-bfc587fb7-vqb95\" (UID: \"c991dcea-e9e7-4521-9e85-77689aaf560a\") " pod="openshift-service-ca/service-ca-bfc587fb7-vqb95" Apr 16 18:32:39.238054 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.238035 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hkqvd_0003db0c-dda0-4476-bd64-528082f53f33/dns-node-resolver/0.log" Apr 16 18:32:39.334854 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.334770 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-vqb95" Apr 16 18:32:39.430653 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.430622 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-dfzr4\" (UID: \"bdf8d744-507e-4943-8866-e60d7c582151\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4" Apr 16 18:32:39.430789 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:39.430770 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:32:39.430841 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:39.430832 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls podName:bdf8d744-507e-4943-8866-e60d7c582151 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:47.430817504 +0000 UTC m=+144.218153738 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-dfzr4" (UID: "bdf8d744-507e-4943-8866-e60d7c582151") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:32:39.443045 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.442963 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-vqb95"] Apr 16 18:32:39.445370 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:32:39.445345 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc991dcea_e9e7_4521_9e85_77689aaf560a.slice/crio-b64ecacb51864a4a7a7b53d1c7823eed7338b1fdd2b0b6651e36b9526cb12283 WatchSource:0}: Error finding container b64ecacb51864a4a7a7b53d1c7823eed7338b1fdd2b0b6651e36b9526cb12283: Status 404 returned error can't find the container with id b64ecacb51864a4a7a7b53d1c7823eed7338b1fdd2b0b6651e36b9526cb12283 Apr 16 18:32:39.531589 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.531559 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq" Apr 16 18:32:39.531716 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:39.531700 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:32:39.531755 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:39.531718 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b8555f68f-6s9wq: secret "image-registry-tls" not found Apr 16 18:32:39.531794 
ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:39.531771 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls podName:eb710a5a-c4ac-49d9-90fc-dd7e54250a60 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:47.531753725 +0000 UTC m=+144.319089958 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls") pod "image-registry-5b8555f68f-6s9wq" (UID: "eb710a5a-c4ac-49d9-90fc-dd7e54250a60") : secret "image-registry-tls" not found Apr 16 18:32:39.839635 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:39.839609 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fj55k_1168c01f-c07f-44f9-b56f-cc88b2028e0b/node-ca/0.log" Apr 16 18:32:40.205655 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:40.205565 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-vqb95" event={"ID":"c991dcea-e9e7-4521-9e85-77689aaf560a","Type":"ContainerStarted","Data":"248e0986b7d626a0e49c0cdee97a92f68706b2a4386474c5b9480418efbda89f"} Apr 16 18:32:40.205655 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:40.205605 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-vqb95" event={"ID":"c991dcea-e9e7-4521-9e85-77689aaf560a","Type":"ContainerStarted","Data":"b64ecacb51864a4a7a7b53d1c7823eed7338b1fdd2b0b6651e36b9526cb12283"} Apr 16 18:32:41.266354 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:41.266261 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-7hlch_a8092b73-120e-4d31-8d9c-2567ffdcad38/migrator/0.log" Apr 16 18:32:41.439053 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:41.439028 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-7hlch_a8092b73-120e-4d31-8d9c-2567ffdcad38/graceful-termination/0.log" Apr 16 18:32:47.495824 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:47.495790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-dfzr4\" (UID: \"bdf8d744-507e-4943-8866-e60d7c582151\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4" Apr 16 18:32:47.496317 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:47.495959 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:32:47.496317 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:47.496045 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls podName:bdf8d744-507e-4943-8866-e60d7c582151 nodeName:}" failed. No retries permitted until 2026-04-16 18:33:03.496024153 +0000 UTC m=+160.283360401 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-dfzr4" (UID: "bdf8d744-507e-4943-8866-e60d7c582151") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:32:47.597278 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:47.597246 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq" Apr 16 18:32:47.599607 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:47.599581 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls\") pod \"image-registry-5b8555f68f-6s9wq\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") " pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq" Apr 16 18:32:47.715499 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:47.715463 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq" Apr 16 18:32:47.831125 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:47.831064 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-vqb95" podStartSLOduration=8.831047042 podStartE2EDuration="8.831047042s" podCreationTimestamp="2026-04-16 18:32:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:32:40.228665216 +0000 UTC m=+137.016001470" watchObservedRunningTime="2026-04-16 18:32:47.831047042 +0000 UTC m=+144.618383312" Apr 16 18:32:47.831354 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:47.831337 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5b8555f68f-6s9wq"] Apr 16 18:32:47.835700 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:32:47.835677 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb710a5a_c4ac_49d9_90fc_dd7e54250a60.slice/crio-943eda79b5bc76651e0247d8f85c4841ff0653ec7851c7fdf9b7498c42a925cf WatchSource:0}: Error finding container 943eda79b5bc76651e0247d8f85c4841ff0653ec7851c7fdf9b7498c42a925cf: Status 404 returned error can't find the container with id 943eda79b5bc76651e0247d8f85c4841ff0653ec7851c7fdf9b7498c42a925cf Apr 16 18:32:48.230858 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:48.230825 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq" event={"ID":"eb710a5a-c4ac-49d9-90fc-dd7e54250a60","Type":"ContainerStarted","Data":"1733ce3fcdbeae218f3bb12f5f942702a21358f1c32dbc8a888dd01867da27e6"} Apr 16 18:32:48.230858 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:48.230863 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq" 
event={"ID":"eb710a5a-c4ac-49d9-90fc-dd7e54250a60","Type":"ContainerStarted","Data":"943eda79b5bc76651e0247d8f85c4841ff0653ec7851c7fdf9b7498c42a925cf"} Apr 16 18:32:48.231060 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:48.230992 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq" Apr 16 18:32:48.256501 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:48.256451 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq" podStartSLOduration=17.256435894 podStartE2EDuration="17.256435894s" podCreationTimestamp="2026-04-16 18:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:32:48.256077469 +0000 UTC m=+145.043413722" watchObservedRunningTime="2026-04-16 18:32:48.256435894 +0000 UTC m=+145.043772151" Apr 16 18:32:59.337758 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.337728 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-99n6h"] Apr 16 18:32:59.343913 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.343895 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-99n6h"
Apr 16 18:32:59.349852 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.349743 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 18:32:59.349980 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.349959 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 18:32:59.350567 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.350548 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 18:32:59.350659 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.350578 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-gqtpm\""
Apr 16 18:32:59.350659 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.350552 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 18:32:59.358243 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.358223 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5b8555f68f-6s9wq"]
Apr 16 18:32:59.361168 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.361149 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-99n6h"]
Apr 16 18:32:59.486479 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.486446 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9eddac0a-a3b8-4340-8157-5cbbd08512d7-crio-socket\") pod \"insights-runtime-extractor-99n6h\" (UID: \"9eddac0a-a3b8-4340-8157-5cbbd08512d7\") " pod="openshift-insights/insights-runtime-extractor-99n6h"
Apr 16 18:32:59.486661 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.486487 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxpdx\" (UniqueName: \"kubernetes.io/projected/9eddac0a-a3b8-4340-8157-5cbbd08512d7-kube-api-access-sxpdx\") pod \"insights-runtime-extractor-99n6h\" (UID: \"9eddac0a-a3b8-4340-8157-5cbbd08512d7\") " pod="openshift-insights/insights-runtime-extractor-99n6h"
Apr 16 18:32:59.486661 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.486568 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9eddac0a-a3b8-4340-8157-5cbbd08512d7-data-volume\") pod \"insights-runtime-extractor-99n6h\" (UID: \"9eddac0a-a3b8-4340-8157-5cbbd08512d7\") " pod="openshift-insights/insights-runtime-extractor-99n6h"
Apr 16 18:32:59.486661 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.486605 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9eddac0a-a3b8-4340-8157-5cbbd08512d7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-99n6h\" (UID: \"9eddac0a-a3b8-4340-8157-5cbbd08512d7\") " pod="openshift-insights/insights-runtime-extractor-99n6h"
Apr 16 18:32:59.486661 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.486642 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9eddac0a-a3b8-4340-8157-5cbbd08512d7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-99n6h\" (UID: \"9eddac0a-a3b8-4340-8157-5cbbd08512d7\") " pod="openshift-insights/insights-runtime-extractor-99n6h"
Apr 16 18:32:59.587383 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.587351 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\"
(UniqueName: \"kubernetes.io/empty-dir/9eddac0a-a3b8-4340-8157-5cbbd08512d7-data-volume\") pod \"insights-runtime-extractor-99n6h\" (UID: \"9eddac0a-a3b8-4340-8157-5cbbd08512d7\") " pod="openshift-insights/insights-runtime-extractor-99n6h"
Apr 16 18:32:59.587568 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.587390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9eddac0a-a3b8-4340-8157-5cbbd08512d7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-99n6h\" (UID: \"9eddac0a-a3b8-4340-8157-5cbbd08512d7\") " pod="openshift-insights/insights-runtime-extractor-99n6h"
Apr 16 18:32:59.587568 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.587429 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9eddac0a-a3b8-4340-8157-5cbbd08512d7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-99n6h\" (UID: \"9eddac0a-a3b8-4340-8157-5cbbd08512d7\") " pod="openshift-insights/insights-runtime-extractor-99n6h"
Apr 16 18:32:59.587568 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.587503 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9eddac0a-a3b8-4340-8157-5cbbd08512d7-crio-socket\") pod \"insights-runtime-extractor-99n6h\" (UID: \"9eddac0a-a3b8-4340-8157-5cbbd08512d7\") " pod="openshift-insights/insights-runtime-extractor-99n6h"
Apr 16 18:32:59.587568 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.587534 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxpdx\" (UniqueName: \"kubernetes.io/projected/9eddac0a-a3b8-4340-8157-5cbbd08512d7-kube-api-access-sxpdx\") pod \"insights-runtime-extractor-99n6h\" (UID: \"9eddac0a-a3b8-4340-8157-5cbbd08512d7\") " pod="openshift-insights/insights-runtime-extractor-99n6h"
Apr 16 18:32:59.587748 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.587608 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9eddac0a-a3b8-4340-8157-5cbbd08512d7-crio-socket\") pod \"insights-runtime-extractor-99n6h\" (UID: \"9eddac0a-a3b8-4340-8157-5cbbd08512d7\") " pod="openshift-insights/insights-runtime-extractor-99n6h"
Apr 16 18:32:59.587748 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.587652 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9eddac0a-a3b8-4340-8157-5cbbd08512d7-data-volume\") pod \"insights-runtime-extractor-99n6h\" (UID: \"9eddac0a-a3b8-4340-8157-5cbbd08512d7\") " pod="openshift-insights/insights-runtime-extractor-99n6h"
Apr 16 18:32:59.587995 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.587951 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9eddac0a-a3b8-4340-8157-5cbbd08512d7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-99n6h\" (UID: \"9eddac0a-a3b8-4340-8157-5cbbd08512d7\") " pod="openshift-insights/insights-runtime-extractor-99n6h"
Apr 16 18:32:59.589662 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.589638 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9eddac0a-a3b8-4340-8157-5cbbd08512d7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-99n6h\" (UID: \"9eddac0a-a3b8-4340-8157-5cbbd08512d7\") " pod="openshift-insights/insights-runtime-extractor-99n6h"
Apr 16 18:32:59.600034 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.600013 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxpdx\" (UniqueName: \"kubernetes.io/projected/9eddac0a-a3b8-4340-8157-5cbbd08512d7-kube-api-access-sxpdx\") pod \"insights-runtime-extractor-99n6h\" (UID: \"9eddac0a-a3b8-4340-8157-5cbbd08512d7\") " pod="openshift-insights/insights-runtime-extractor-99n6h"
Apr 16 18:32:59.653125 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.653102 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-99n6h"
Apr 16 18:32:59.689245 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:59.689214 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-dlnfz" podUID="ccb22e48-3cd7-442e-ac5a-7ec7666b48e9"
Apr 16 18:32:59.694192 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:32:59.694156 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-pb8c8" podUID="714c4beb-40ac-4478-80ff-d058fb5fd1a3"
Apr 16 18:32:59.767238 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:32:59.767211 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-99n6h"]
Apr 16 18:32:59.771018 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:32:59.770991 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9eddac0a_a3b8_4340_8157_5cbbd08512d7.slice/crio-dbfad0c0bc87d8362e1846313d7c00671f98771819538758fe1deae1c4f37b41 WatchSource:0}: Error finding container dbfad0c0bc87d8362e1846313d7c00671f98771819538758fe1deae1c4f37b41: Status 404 returned error can't find the container with id dbfad0c0bc87d8362e1846313d7c00671f98771819538758fe1deae1c4f37b41
Apr 16 18:33:00.261140 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:00.261111 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pb8c8"
Apr 16 18:33:00.261140 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:00.261125 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-99n6h" event={"ID":"9eddac0a-a3b8-4340-8157-5cbbd08512d7","Type":"ContainerStarted","Data":"67c6c98612ec5a068f423d127eaad508c91d1ddd9dae3c783d703d1843140529"}
Apr 16 18:33:00.261341 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:00.261162 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-99n6h" event={"ID":"9eddac0a-a3b8-4340-8157-5cbbd08512d7","Type":"ContainerStarted","Data":"dbfad0c0bc87d8362e1846313d7c00671f98771819538758fe1deae1c4f37b41"}
Apr 16 18:33:00.261341 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:00.261315 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dlnfz"
Apr 16 18:33:00.848672 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:33:00.848582 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-f4smb" podUID="b62474b5-9999-4dd6-83ae-96e3bc355df3"
Apr 16 18:33:01.265531 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:01.265500 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-99n6h" event={"ID":"9eddac0a-a3b8-4340-8157-5cbbd08512d7","Type":"ContainerStarted","Data":"c60eb31d9d28ad5301ac26627bc864c12b02642e0474147b448892a609dde113"}
Apr 16 18:33:02.273042 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:02.272998 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-99n6h" event={"ID":"9eddac0a-a3b8-4340-8157-5cbbd08512d7","Type":"ContainerStarted","Data":"86301dc49aa1733874a45ef8f15fb38aa6be0139ad25c0c74c2d632445b8f2ff"}
Apr 16 18:33:02.292545 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:02.292506 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-99n6h" podStartSLOduration=1.4709096050000001 podStartE2EDuration="3.292494218s" podCreationTimestamp="2026-04-16 18:32:59 +0000 UTC" firstStartedPulling="2026-04-16 18:32:59.827986582 +0000 UTC m=+156.615322815" lastFinishedPulling="2026-04-16 18:33:01.64957119 +0000 UTC m=+158.436907428" observedRunningTime="2026-04-16 18:33:02.291393643 +0000 UTC m=+159.078729906" watchObservedRunningTime="2026-04-16 18:33:02.292494218 +0000 UTC m=+159.079830464"
Apr 16 18:33:03.516720 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:03.516684 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-dfzr4\" (UID: \"bdf8d744-507e-4943-8866-e60d7c582151\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4"
Apr 16 18:33:03.519066 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:03.519043 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bdf8d744-507e-4943-8866-e60d7c582151-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-dfzr4\" (UID: \"bdf8d744-507e-4943-8866-e60d7c582151\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4"
Apr 16 18:33:03.810145 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:03.810069 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4"
Apr 16 18:33:03.921474 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:03.921442 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4"]
Apr 16 18:33:03.924880 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:33:03.924854 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdf8d744_507e_4943_8866_e60d7c582151.slice/crio-f05b8180699a831980e0a40a2519e3b267d5701923a2a2b5a104a9538bed048a WatchSource:0}: Error finding container f05b8180699a831980e0a40a2519e3b267d5701923a2a2b5a104a9538bed048a: Status 404 returned error can't find the container with id f05b8180699a831980e0a40a2519e3b267d5701923a2a2b5a104a9538bed048a
Apr 16 18:33:04.279825 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:04.279792 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4" event={"ID":"bdf8d744-507e-4943-8866-e60d7c582151","Type":"ContainerStarted","Data":"f05b8180699a831980e0a40a2519e3b267d5701923a2a2b5a104a9538bed048a"}
Apr 16 18:33:04.524473 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:04.524438 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert\") pod \"ingress-canary-pb8c8\" (UID: \"714c4beb-40ac-4478-80ff-d058fb5fd1a3\") " pod="openshift-ingress-canary/ingress-canary-pb8c8"
Apr 16 18:33:04.524924 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:04.524485 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz"
Apr 16 18:33:04.527214 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:04.527170 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccb22e48-3cd7-442e-ac5a-7ec7666b48e9-metrics-tls\") pod \"dns-default-dlnfz\" (UID: \"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9\") " pod="openshift-dns/dns-default-dlnfz"
Apr 16 18:33:04.527324 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:04.527249 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/714c4beb-40ac-4478-80ff-d058fb5fd1a3-cert\") pod \"ingress-canary-pb8c8\" (UID: \"714c4beb-40ac-4478-80ff-d058fb5fd1a3\") " pod="openshift-ingress-canary/ingress-canary-pb8c8"
Apr 16 18:33:04.765239 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:04.765204 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tdfb9\""
Apr 16 18:33:04.765239 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:04.765204 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6c796\""
Apr 16 18:33:04.772336 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:04.772303 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dlnfz"
Apr 16 18:33:04.772485 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:04.772383 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pb8c8"
Apr 16 18:33:04.919704 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:04.919675 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dlnfz"]
Apr 16 18:33:04.923784 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:33:04.923726 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccb22e48_3cd7_442e_ac5a_7ec7666b48e9.slice/crio-032d811ad6db8b3acb4d1192d1e5dc4c2e242bfa9d436125291e1233956bef9c WatchSource:0}: Error finding container 032d811ad6db8b3acb4d1192d1e5dc4c2e242bfa9d436125291e1233956bef9c: Status 404 returned error can't find the container with id 032d811ad6db8b3acb4d1192d1e5dc4c2e242bfa9d436125291e1233956bef9c
Apr 16 18:33:04.938385 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:04.938362 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pb8c8"]
Apr 16 18:33:04.941590 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:33:04.941564 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod714c4beb_40ac_4478_80ff_d058fb5fd1a3.slice/crio-fcea1ed5dfb19e56fc05b4b3213ed1fce78a1f6b66fd6cadbcef5581f8fd5953 WatchSource:0}: Error finding container fcea1ed5dfb19e56fc05b4b3213ed1fce78a1f6b66fd6cadbcef5581f8fd5953: Status 404 returned error can't find the container with id fcea1ed5dfb19e56fc05b4b3213ed1fce78a1f6b66fd6cadbcef5581f8fd5953
Apr 16 18:33:05.283584 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:05.283547 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pb8c8" event={"ID":"714c4beb-40ac-4478-80ff-d058fb5fd1a3","Type":"ContainerStarted","Data":"fcea1ed5dfb19e56fc05b4b3213ed1fce78a1f6b66fd6cadbcef5581f8fd5953"}
Apr 16 18:33:05.284664 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:05.284637 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dlnfz" event={"ID":"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9","Type":"ContainerStarted","Data":"032d811ad6db8b3acb4d1192d1e5dc4c2e242bfa9d436125291e1233956bef9c"}
Apr 16 18:33:05.976905 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:05.976828 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-w6jqv"]
Apr 16 18:33:05.980133 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:05.980106 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-w6jqv"
Apr 16 18:33:05.983102 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:05.983071 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-w59hk\""
Apr 16 18:33:05.985374 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:05.985191 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 18:33:05.988281 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:05.988237 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-w6jqv"]
Apr 16 18:33:06.135168 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:06.135115 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c4040798-ae88-4f3b-abb7-2af899225127-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-w6jqv\" (UID: \"c4040798-ae88-4f3b-abb7-2af899225127\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-w6jqv"
Apr 16 18:33:06.236092 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:06.236000 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c4040798-ae88-4f3b-abb7-2af899225127-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-w6jqv\" (UID: \"c4040798-ae88-4f3b-abb7-2af899225127\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-w6jqv"
Apr 16 18:33:06.236295 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:33:06.236171 2578 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 16 18:33:06.236295 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:33:06.236278 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4040798-ae88-4f3b-abb7-2af899225127-tls-certificates podName:c4040798-ae88-4f3b-abb7-2af899225127 nodeName:}" failed. No retries permitted until 2026-04-16 18:33:06.736255263 +0000 UTC m=+163.523591498 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/c4040798-ae88-4f3b-abb7-2af899225127-tls-certificates") pod "prometheus-operator-admission-webhook-9cb97cd87-w6jqv" (UID: "c4040798-ae88-4f3b-abb7-2af899225127") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 16 18:33:06.288711 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:06.288669 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4" event={"ID":"bdf8d744-507e-4943-8866-e60d7c582151","Type":"ContainerStarted","Data":"705c328f838f671cdc933bb08946205dd2d0968420e303292cfd16ecd9d14a79"}
Apr 16 18:33:06.306304 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:06.306253 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-dfzr4" podStartSLOduration=33.764735392 podStartE2EDuration="35.306234766s" podCreationTimestamp="2026-04-16 18:32:31 +0000 UTC"
firstStartedPulling="2026-04-16 18:33:03.926740461 +0000 UTC m=+160.714076708" lastFinishedPulling="2026-04-16 18:33:05.468239844 +0000 UTC m=+162.255576082" observedRunningTime="2026-04-16 18:33:06.305305268 +0000 UTC m=+163.092641525" watchObservedRunningTime="2026-04-16 18:33:06.306234766 +0000 UTC m=+163.093571023"
Apr 16 18:33:06.740706 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:06.740659 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c4040798-ae88-4f3b-abb7-2af899225127-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-w6jqv\" (UID: \"c4040798-ae88-4f3b-abb7-2af899225127\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-w6jqv"
Apr 16 18:33:06.743494 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:06.743449 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c4040798-ae88-4f3b-abb7-2af899225127-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-w6jqv\" (UID: \"c4040798-ae88-4f3b-abb7-2af899225127\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-w6jqv"
Apr 16 18:33:06.897144 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:06.897097 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-w6jqv"
Apr 16 18:33:07.155213 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:07.154277 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-w6jqv"]
Apr 16 18:33:07.160095 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:33:07.160060 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4040798_ae88_4f3b_abb7_2af899225127.slice/crio-ac149c97e001e5e7e14d3357e0d1b155ffa2a37953730d02f5e08d45731ddd4e WatchSource:0}: Error finding container ac149c97e001e5e7e14d3357e0d1b155ffa2a37953730d02f5e08d45731ddd4e: Status 404 returned error can't find the container with id ac149c97e001e5e7e14d3357e0d1b155ffa2a37953730d02f5e08d45731ddd4e
Apr 16 18:33:07.295436 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:07.295394 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pb8c8" event={"ID":"714c4beb-40ac-4478-80ff-d058fb5fd1a3","Type":"ContainerStarted","Data":"8140f6f943678cdcc646b11838f48301a25e2795948e183a10f6fc0dd47a5bfc"}
Apr 16 18:33:07.296642 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:07.296619 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-w6jqv" event={"ID":"c4040798-ae88-4f3b-abb7-2af899225127","Type":"ContainerStarted","Data":"ac149c97e001e5e7e14d3357e0d1b155ffa2a37953730d02f5e08d45731ddd4e"}
Apr 16 18:33:07.298481 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:07.298442 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dlnfz" event={"ID":"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9","Type":"ContainerStarted","Data":"05afc125e91a7ecb8df36d2fb74b8793ba3a13a39c3444395f1fa811f581793e"}
Apr 16 18:33:07.298481 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:07.298470 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dlnfz" event={"ID":"ccb22e48-3cd7-442e-ac5a-7ec7666b48e9","Type":"ContainerStarted","Data":"68dd96a045fef8a9c56b3cae963d5ffaec4a425ace2d6c75c9d006b0adc38adc"}
Apr 16 18:33:07.298653 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:07.298575 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-dlnfz"
Apr 16 18:33:07.313001 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:07.312951 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pb8c8" podStartSLOduration=129.229938264 podStartE2EDuration="2m11.312933402s" podCreationTimestamp="2026-04-16 18:30:56 +0000 UTC" firstStartedPulling="2026-04-16 18:33:04.94376282 +0000 UTC m=+161.731099053" lastFinishedPulling="2026-04-16 18:33:07.026757958 +0000 UTC m=+163.814094191" observedRunningTime="2026-04-16 18:33:07.312423104 +0000 UTC m=+164.099759359" watchObservedRunningTime="2026-04-16 18:33:07.312933402 +0000 UTC m=+164.100269655"
Apr 16 18:33:07.333359 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:07.333317 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dlnfz" podStartSLOduration=129.240441448 podStartE2EDuration="2m11.333303354s" podCreationTimestamp="2026-04-16 18:30:56 +0000 UTC" firstStartedPulling="2026-04-16 18:33:04.926396394 +0000 UTC m=+161.713732628" lastFinishedPulling="2026-04-16 18:33:07.019258286 +0000 UTC m=+163.806594534" observedRunningTime="2026-04-16 18:33:07.332695084 +0000 UTC m=+164.120031339" watchObservedRunningTime="2026-04-16 18:33:07.333303354 +0000 UTC m=+164.120639640"
Apr 16 18:33:08.302401 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:08.302362 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-w6jqv"
event={"ID":"c4040798-ae88-4f3b-abb7-2af899225127","Type":"ContainerStarted","Data":"5c3e616ab5ed7b07f3543f070bfa81c5a73c1e397e77e51cc7f3376ff2780724"}
Apr 16 18:33:08.317839 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:08.317787 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-w6jqv" podStartSLOduration=2.3873155710000002 podStartE2EDuration="3.31777016s" podCreationTimestamp="2026-04-16 18:33:05 +0000 UTC" firstStartedPulling="2026-04-16 18:33:07.162625474 +0000 UTC m=+163.949961714" lastFinishedPulling="2026-04-16 18:33:08.093080066 +0000 UTC m=+164.880416303" observedRunningTime="2026-04-16 18:33:08.316523376 +0000 UTC m=+165.103859631" watchObservedRunningTime="2026-04-16 18:33:08.31777016 +0000 UTC m=+165.105106417"
Apr 16 18:33:09.305171 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:09.305143 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-w6jqv"
Apr 16 18:33:09.309863 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:09.309838 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-w6jqv"
Apr 16 18:33:09.363397 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:09.363363 2578 patch_prober.go:28] interesting pod/image-registry-5b8555f68f-6s9wq container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 18:33:09.363552 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:09.363412 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq" podUID="eb710a5a-c4ac-49d9-90fc-dd7e54250a60" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:33:10.029890 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.029860 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-7sgp9"]
Apr 16 18:33:10.032743 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.032727 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9"
Apr 16 18:33:10.035507 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.035477 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 18:33:10.036884 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.036868 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 18:33:10.036998 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.036867 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 18:33:10.036998 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.036955 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-m6jpv\""
Apr 16 18:33:10.043437 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.043416 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-7sgp9"]
Apr 16 18:33:10.070047 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.070013 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1e239f4-fc22-47cc-bda3-343f85242b88-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-7sgp9\" (UID: \"c1e239f4-fc22-47cc-bda3-343f85242b88\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9"
Apr 16 18:33:10.070244 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.070056 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c1e239f4-fc22-47cc-bda3-343f85242b88-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-7sgp9\" (UID: \"c1e239f4-fc22-47cc-bda3-343f85242b88\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9"
Apr 16 18:33:10.070244 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.070079 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1e239f4-fc22-47cc-bda3-343f85242b88-metrics-client-ca\") pod \"prometheus-operator-78f957474d-7sgp9\" (UID: \"c1e239f4-fc22-47cc-bda3-343f85242b88\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9"
Apr 16 18:33:10.070244 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.070100 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kg74\" (UniqueName: \"kubernetes.io/projected/c1e239f4-fc22-47cc-bda3-343f85242b88-kube-api-access-5kg74\") pod \"prometheus-operator-78f957474d-7sgp9\" (UID: \"c1e239f4-fc22-47cc-bda3-343f85242b88\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9"
Apr 16 18:33:10.170432 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.170398 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c1e239f4-fc22-47cc-bda3-343f85242b88-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-7sgp9\" (UID: \"c1e239f4-fc22-47cc-bda3-343f85242b88\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9"
Apr 16 18:33:10.170432 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.170437 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1e239f4-fc22-47cc-bda3-343f85242b88-metrics-client-ca\") pod \"prometheus-operator-78f957474d-7sgp9\" (UID: \"c1e239f4-fc22-47cc-bda3-343f85242b88\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9"
Apr 16 18:33:10.170638 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.170472 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kg74\" (UniqueName: \"kubernetes.io/projected/c1e239f4-fc22-47cc-bda3-343f85242b88-kube-api-access-5kg74\") pod \"prometheus-operator-78f957474d-7sgp9\" (UID: \"c1e239f4-fc22-47cc-bda3-343f85242b88\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9"
Apr 16 18:33:10.170638 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:33:10.170553 2578 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 16 18:33:10.170638 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.170613 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1e239f4-fc22-47cc-bda3-343f85242b88-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-7sgp9\" (UID: \"c1e239f4-fc22-47cc-bda3-343f85242b88\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9"
Apr 16 18:33:10.170638 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:33:10.170627 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1e239f4-fc22-47cc-bda3-343f85242b88-prometheus-operator-tls podName:c1e239f4-fc22-47cc-bda3-343f85242b88 nodeName:}" failed. No retries permitted until 2026-04-16 18:33:10.670608787 +0000 UTC m=+167.457945034 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/c1e239f4-fc22-47cc-bda3-343f85242b88-prometheus-operator-tls") pod "prometheus-operator-78f957474d-7sgp9" (UID: "c1e239f4-fc22-47cc-bda3-343f85242b88") : secret "prometheus-operator-tls" not found Apr 16 18:33:10.171226 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.171203 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1e239f4-fc22-47cc-bda3-343f85242b88-metrics-client-ca\") pod \"prometheus-operator-78f957474d-7sgp9\" (UID: \"c1e239f4-fc22-47cc-bda3-343f85242b88\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9" Apr 16 18:33:10.172952 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.172924 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1e239f4-fc22-47cc-bda3-343f85242b88-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-7sgp9\" (UID: \"c1e239f4-fc22-47cc-bda3-343f85242b88\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9" Apr 16 18:33:10.179047 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.179028 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kg74\" (UniqueName: \"kubernetes.io/projected/c1e239f4-fc22-47cc-bda3-343f85242b88-kube-api-access-5kg74\") pod \"prometheus-operator-78f957474d-7sgp9\" (UID: \"c1e239f4-fc22-47cc-bda3-343f85242b88\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9" Apr 16 18:33:10.674047 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.673993 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c1e239f4-fc22-47cc-bda3-343f85242b88-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-7sgp9\" (UID: 
\"c1e239f4-fc22-47cc-bda3-343f85242b88\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9" Apr 16 18:33:10.676449 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.676425 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c1e239f4-fc22-47cc-bda3-343f85242b88-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-7sgp9\" (UID: \"c1e239f4-fc22-47cc-bda3-343f85242b88\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9" Apr 16 18:33:10.942437 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:10.942335 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9" Apr 16 18:33:11.068855 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:11.068822 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-7sgp9"] Apr 16 18:33:11.071956 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:33:11.071927 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1e239f4_fc22_47cc_bda3_343f85242b88.slice/crio-e1cbc3494da0901145c8a07644e2cc39e120edfd0ec4af857b5583159a38b5a3 WatchSource:0}: Error finding container e1cbc3494da0901145c8a07644e2cc39e120edfd0ec4af857b5583159a38b5a3: Status 404 returned error can't find the container with id e1cbc3494da0901145c8a07644e2cc39e120edfd0ec4af857b5583159a38b5a3 Apr 16 18:33:11.311359 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:11.311307 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9" event={"ID":"c1e239f4-fc22-47cc-bda3-343f85242b88","Type":"ContainerStarted","Data":"e1cbc3494da0901145c8a07644e2cc39e120edfd0ec4af857b5583159a38b5a3"} Apr 16 18:33:12.316312 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:12.316268 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9" event={"ID":"c1e239f4-fc22-47cc-bda3-343f85242b88","Type":"ContainerStarted","Data":"245bd0932da1192f8c86dca233876c7c3d30c77b96378cbf86dba1a4152261a9"} Apr 16 18:33:12.316312 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:12.316315 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9" event={"ID":"c1e239f4-fc22-47cc-bda3-343f85242b88","Type":"ContainerStarted","Data":"9359b2f554787f33b07436a0513536531f16116722fd66b2c8f530ad843271c3"} Apr 16 18:33:12.332079 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:12.332030 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-7sgp9" podStartSLOduration=1.284119008 podStartE2EDuration="2.332015999s" podCreationTimestamp="2026-04-16 18:33:10 +0000 UTC" firstStartedPulling="2026-04-16 18:33:11.073745911 +0000 UTC m=+167.861082144" lastFinishedPulling="2026-04-16 18:33:12.121642884 +0000 UTC m=+168.908979135" observedRunningTime="2026-04-16 18:33:12.331770925 +0000 UTC m=+169.119107182" watchObservedRunningTime="2026-04-16 18:33:12.332015999 +0000 UTC m=+169.119352254" Apr 16 18:33:14.388312 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.388278 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-mcgzx"] Apr 16 18:33:14.392060 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.392042 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.396761 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.396308 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 18:33:14.396761 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.396334 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 18:33:14.396761 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.396563 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-hq4st\"" Apr 16 18:33:14.397079 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.397063 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:33:14.400531 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.399242 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-b6czb"] Apr 16 18:33:14.400531 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.400231 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/41a2f87e-530d-4207-8cc8-e7ee979357e0-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.400531 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.400280 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/41a2f87e-530d-4207-8cc8-e7ee979357e0-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.400531 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.400322 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41a2f87e-530d-4207-8cc8-e7ee979357e0-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.400531 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.400370 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41a2f87e-530d-4207-8cc8-e7ee979357e0-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.400531 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.400401 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdjm4\" (UniqueName: \"kubernetes.io/projected/41a2f87e-530d-4207-8cc8-e7ee979357e0-kube-api-access-mdjm4\") pod \"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.400531 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.400432 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/41a2f87e-530d-4207-8cc8-e7ee979357e0-volume-directive-shadow\") pod 
\"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.402534 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.402513 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.405658 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.405639 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:33:14.405759 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.405723 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-x6np6\"" Apr 16 18:33:14.405960 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.405942 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:33:14.406036 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.405950 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:33:14.412635 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.412595 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-mcgzx"] Apr 16 18:33:14.501099 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.501067 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/41a2f87e-530d-4207-8cc8-e7ee979357e0-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.501282 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.501113 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-wtmp\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.501282 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.501261 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-tls\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.501407 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.501319 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/41a2f87e-530d-4207-8cc8-e7ee979357e0-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.501407 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.501367 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/758ed149-3d81-4eec-bdac-e7eeca35aecc-metrics-client-ca\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.501407 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.501404 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/758ed149-3d81-4eec-bdac-e7eeca35aecc-root\") pod \"node-exporter-b6czb\" (UID: 
\"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.501559 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.501435 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/758ed149-3d81-4eec-bdac-e7eeca35aecc-sys\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.501559 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.501463 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/41a2f87e-530d-4207-8cc8-e7ee979357e0-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.501559 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.501488 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-textfile\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.501700 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:33:14.501569 2578 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 18:33:14.501700 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.501607 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-accelerators-collector-config\") pod \"node-exporter-b6czb\" (UID: 
\"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.501700 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:33:14.501634 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41a2f87e-530d-4207-8cc8-e7ee979357e0-kube-state-metrics-tls podName:41a2f87e-530d-4207-8cc8-e7ee979357e0 nodeName:}" failed. No retries permitted until 2026-04-16 18:33:15.001611781 +0000 UTC m=+171.788948018 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/41a2f87e-530d-4207-8cc8-e7ee979357e0-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-mcgzx" (UID: "41a2f87e-530d-4207-8cc8-e7ee979357e0") : secret "kube-state-metrics-tls" not found Apr 16 18:33:14.501842 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.501710 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41a2f87e-530d-4207-8cc8-e7ee979357e0-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.501842 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.501754 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnt56\" (UniqueName: \"kubernetes.io/projected/758ed149-3d81-4eec-bdac-e7eeca35aecc-kube-api-access-rnt56\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.501842 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.501813 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41a2f87e-530d-4207-8cc8-e7ee979357e0-kube-state-metrics-kube-rbac-proxy-config\") pod 
\"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.501986 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.501846 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdjm4\" (UniqueName: \"kubernetes.io/projected/41a2f87e-530d-4207-8cc8-e7ee979357e0-kube-api-access-mdjm4\") pod \"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.501986 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.501878 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.502270 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.502249 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/41a2f87e-530d-4207-8cc8-e7ee979357e0-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.502979 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.502957 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41a2f87e-530d-4207-8cc8-e7ee979357e0-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.503077 ip-10-0-139-33 kubenswrapper[2578]: I0416 
18:33:14.502957 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/41a2f87e-530d-4207-8cc8-e7ee979357e0-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.505033 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.505002 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41a2f87e-530d-4207-8cc8-e7ee979357e0-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.515725 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.515701 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdjm4\" (UniqueName: \"kubernetes.io/projected/41a2f87e-530d-4207-8cc8-e7ee979357e0-kube-api-access-mdjm4\") pod \"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" Apr 16 18:33:14.603367 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.602592 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.603367 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.602654 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-wtmp\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.603367 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.602691 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-tls\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.603367 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.602727 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/758ed149-3d81-4eec-bdac-e7eeca35aecc-metrics-client-ca\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.603367 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.602757 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/758ed149-3d81-4eec-bdac-e7eeca35aecc-root\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.603367 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.602782 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/758ed149-3d81-4eec-bdac-e7eeca35aecc-sys\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.603367 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.602815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-textfile\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.603367 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.602839 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-accelerators-collector-config\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.603367 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.602872 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnt56\" (UniqueName: \"kubernetes.io/projected/758ed149-3d81-4eec-bdac-e7eeca35aecc-kube-api-access-rnt56\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.603867 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.603569 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-wtmp\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb" Apr 16 18:33:14.603867 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:33:14.603671 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 18:33:14.603867 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:33:14.603737 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-tls podName:758ed149-3d81-4eec-bdac-e7eeca35aecc nodeName:}" failed. 
No retries permitted until 2026-04-16 18:33:15.103717522 +0000 UTC m=+171.891053759 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-tls") pod "node-exporter-b6czb" (UID: "758ed149-3d81-4eec-bdac-e7eeca35aecc") : secret "node-exporter-tls" not found
Apr 16 18:33:14.604924 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.604241 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/758ed149-3d81-4eec-bdac-e7eeca35aecc-sys\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb"
Apr 16 18:33:14.604924 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.604560 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/758ed149-3d81-4eec-bdac-e7eeca35aecc-metrics-client-ca\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb"
Apr 16 18:33:14.604924 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.604878 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-accelerators-collector-config\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb"
Apr 16 18:33:14.605206 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.605170 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/758ed149-3d81-4eec-bdac-e7eeca35aecc-root\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb"
Apr 16 18:33:14.605490 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.605474 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-textfile\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb"
Apr 16 18:33:14.607196 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.607125 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb"
Apr 16 18:33:14.619620 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.619593 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnt56\" (UniqueName: \"kubernetes.io/projected/758ed149-3d81-4eec-bdac-e7eeca35aecc-kube-api-access-rnt56\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb"
Apr 16 18:33:14.837474 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:14.837421 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb"
Apr 16 18:33:15.007459 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:15.007420 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/41a2f87e-530d-4207-8cc8-e7ee979357e0-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx"
Apr 16 18:33:15.009847 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:15.009824 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/41a2f87e-530d-4207-8cc8-e7ee979357e0-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-mcgzx\" (UID: \"41a2f87e-530d-4207-8cc8-e7ee979357e0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx"
Apr 16 18:33:15.108054 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:15.107979 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-tls\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb"
Apr 16 18:33:15.110112 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:15.110084 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/758ed149-3d81-4eec-bdac-e7eeca35aecc-node-exporter-tls\") pod \"node-exporter-b6czb\" (UID: \"758ed149-3d81-4eec-bdac-e7eeca35aecc\") " pod="openshift-monitoring/node-exporter-b6czb"
Apr 16 18:33:15.304006 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:15.303971 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx"
Apr 16 18:33:15.311975 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:15.311946 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-b6czb"
Apr 16 18:33:15.320492 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:33:15.320455 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod758ed149_3d81_4eec_bdac_e7eeca35aecc.slice/crio-c38d7449155119b9368fe6b57f9a8d8561b6e741179ea2294f8ffbeb5d23230f WatchSource:0}: Error finding container c38d7449155119b9368fe6b57f9a8d8561b6e741179ea2294f8ffbeb5d23230f: Status 404 returned error can't find the container with id c38d7449155119b9368fe6b57f9a8d8561b6e741179ea2294f8ffbeb5d23230f
Apr 16 18:33:15.327145 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:15.327114 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b6czb" event={"ID":"758ed149-3d81-4eec-bdac-e7eeca35aecc","Type":"ContainerStarted","Data":"c38d7449155119b9368fe6b57f9a8d8561b6e741179ea2294f8ffbeb5d23230f"}
Apr 16 18:33:15.443334 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:15.443294 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-mcgzx"]
Apr 16 18:33:15.445921 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:33:15.445899 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41a2f87e_530d_4207_8cc8_e7ee979357e0.slice/crio-98c38ceee0bdbf4633359b2da11e8ae950c97a8b5451f8c4029e64272d1f275f WatchSource:0}: Error finding container 98c38ceee0bdbf4633359b2da11e8ae950c97a8b5451f8c4029e64272d1f275f: Status 404 returned error can't find the container with id 98c38ceee0bdbf4633359b2da11e8ae950c97a8b5451f8c4029e64272d1f275f
Apr 16 18:33:16.331962 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:16.331927 2578 generic.go:358] "Generic (PLEG): container finished" podID="758ed149-3d81-4eec-bdac-e7eeca35aecc" containerID="2118d20bf2ae88d42037e292799b2062d6894d0dc51e0d543c77037fbae2caac" exitCode=0
Apr 16 18:33:16.332155 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:16.332006 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b6czb" event={"ID":"758ed149-3d81-4eec-bdac-e7eeca35aecc","Type":"ContainerDied","Data":"2118d20bf2ae88d42037e292799b2062d6894d0dc51e0d543c77037fbae2caac"}
Apr 16 18:33:16.333192 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:16.333150 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" event={"ID":"41a2f87e-530d-4207-8cc8-e7ee979357e0","Type":"ContainerStarted","Data":"98c38ceee0bdbf4633359b2da11e8ae950c97a8b5451f8c4029e64272d1f275f"}
Apr 16 18:33:17.305044 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:17.305015 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dlnfz"
Apr 16 18:33:17.340193 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:17.340087 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b6czb" event={"ID":"758ed149-3d81-4eec-bdac-e7eeca35aecc","Type":"ContainerStarted","Data":"85d1cc8128f9f01c976b610185c91e818532cad08aaf4d2c8db597dd704af92b"}
Apr 16 18:33:17.340193 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:17.340124 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b6czb" event={"ID":"758ed149-3d81-4eec-bdac-e7eeca35aecc","Type":"ContainerStarted","Data":"b99b2583252b23d7f359d8b2ef2b3e1d8079c3d9797f24a884a01b7f021d1199"}
Apr 16 18:33:17.342162 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:17.342125 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" event={"ID":"41a2f87e-530d-4207-8cc8-e7ee979357e0","Type":"ContainerStarted","Data":"74ff363a7c9376f23600edeb760e05173e6b9c9430de9f0d27bb5826f6b69101"}
Apr 16 18:33:17.342320 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:17.342171 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" event={"ID":"41a2f87e-530d-4207-8cc8-e7ee979357e0","Type":"ContainerStarted","Data":"e0c97793f1255f0dc3e4c0771f85c6737223377ddeefea194cc02355f7bf7a87"}
Apr 16 18:33:17.342320 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:17.342209 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" event={"ID":"41a2f87e-530d-4207-8cc8-e7ee979357e0","Type":"ContainerStarted","Data":"575b71b5e721c83c6f79ac5b7693c48a38ae7c936496f46c42958b9708fcf2b8"}
Apr 16 18:33:17.358783 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:17.358739 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-b6czb" podStartSLOduration=2.718817347 podStartE2EDuration="3.358724056s" podCreationTimestamp="2026-04-16 18:33:14 +0000 UTC" firstStartedPulling="2026-04-16 18:33:15.322766195 +0000 UTC m=+172.110102431" lastFinishedPulling="2026-04-16 18:33:15.962672892 +0000 UTC m=+172.750009140" observedRunningTime="2026-04-16 18:33:17.358232054 +0000 UTC m=+174.145568312" watchObservedRunningTime="2026-04-16 18:33:17.358724056 +0000 UTC m=+174.146060309"
Apr 16 18:33:17.377246 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:17.377202 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-mcgzx" podStartSLOduration=2.05593972 podStartE2EDuration="3.377171155s" podCreationTimestamp="2026-04-16 18:33:14 +0000 UTC" firstStartedPulling="2026-04-16 18:33:15.448229539 +0000 UTC m=+172.235565783" lastFinishedPulling="2026-04-16 18:33:16.769460974 +0000 UTC m=+173.556797218" observedRunningTime="2026-04-16 18:33:17.376375221 +0000 UTC m=+174.163711478" watchObservedRunningTime="2026-04-16 18:33:17.377171155 +0000 UTC m=+174.164507410"
Apr 16 18:33:18.679493 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.679456 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-67b789b557-rnptf"]
Apr 16 18:33:18.682822 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.682804 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.685536 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.685512 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 18:33:18.685639 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.685540 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-f86vg\""
Apr 16 18:33:18.685639 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.685548 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 18:33:18.686953 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.686936 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 18:33:18.687062 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.686996 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-7la8bp7fs2292\""
Apr 16 18:33:18.687062 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.687041 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 18:33:18.691040 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.690923 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-67b789b557-rnptf"]
Apr 16 18:33:18.737781 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.737753 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hxpk\" (UniqueName: \"kubernetes.io/projected/bd1da6bd-aefd-4703-afb9-ec982489f130-kube-api-access-5hxpk\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.737901 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.737851 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/bd1da6bd-aefd-4703-afb9-ec982489f130-secret-metrics-server-client-certs\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.737958 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.737899 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/bd1da6bd-aefd-4703-afb9-ec982489f130-metrics-server-audit-profiles\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.737958 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.737919 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/bd1da6bd-aefd-4703-afb9-ec982489f130-audit-log\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.737958 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.737937 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/bd1da6bd-aefd-4703-afb9-ec982489f130-secret-metrics-server-tls\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.738045 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.737959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1da6bd-aefd-4703-afb9-ec982489f130-client-ca-bundle\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.738045 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.737990 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd1da6bd-aefd-4703-afb9-ec982489f130-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.838372 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.838335 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/bd1da6bd-aefd-4703-afb9-ec982489f130-metrics-server-audit-profiles\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.838372 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.838374 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/bd1da6bd-aefd-4703-afb9-ec982489f130-audit-log\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.838590 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.838401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/bd1da6bd-aefd-4703-afb9-ec982489f130-secret-metrics-server-tls\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.838590 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.838431 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1da6bd-aefd-4703-afb9-ec982489f130-client-ca-bundle\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.838590 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.838529 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd1da6bd-aefd-4703-afb9-ec982489f130-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.838590 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.838574 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hxpk\" (UniqueName: \"kubernetes.io/projected/bd1da6bd-aefd-4703-afb9-ec982489f130-kube-api-access-5hxpk\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.838774 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.838615 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/bd1da6bd-aefd-4703-afb9-ec982489f130-secret-metrics-server-client-certs\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.838835 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.838794 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/bd1da6bd-aefd-4703-afb9-ec982489f130-audit-log\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.839263 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.839239 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd1da6bd-aefd-4703-afb9-ec982489f130-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.839404 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.839388 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/bd1da6bd-aefd-4703-afb9-ec982489f130-metrics-server-audit-profiles\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.840806 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.840781 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/bd1da6bd-aefd-4703-afb9-ec982489f130-secret-metrics-server-tls\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.840901 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.840881 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1da6bd-aefd-4703-afb9-ec982489f130-client-ca-bundle\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.840960 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.840945 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/bd1da6bd-aefd-4703-afb9-ec982489f130-secret-metrics-server-client-certs\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.847162 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.847143 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hxpk\" (UniqueName: \"kubernetes.io/projected/bd1da6bd-aefd-4703-afb9-ec982489f130-kube-api-access-5hxpk\") pod \"metrics-server-67b789b557-rnptf\" (UID: \"bd1da6bd-aefd-4703-afb9-ec982489f130\") " pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:18.993323 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:18.993298 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:19.115580 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:19.115547 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-67b789b557-rnptf"]
Apr 16 18:33:19.119144 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:33:19.119111 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd1da6bd_aefd_4703_afb9_ec982489f130.slice/crio-c35e4b4a2baaeeab20a180e47e04119bda1ea71d6b928761a335e4db5ab34207 WatchSource:0}: Error finding container c35e4b4a2baaeeab20a180e47e04119bda1ea71d6b928761a335e4db5ab34207: Status 404 returned error can't find the container with id c35e4b4a2baaeeab20a180e47e04119bda1ea71d6b928761a335e4db5ab34207
Apr 16 18:33:19.147596 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:19.147571 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-4zlht"]
Apr 16 18:33:19.151806 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:19.151786 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-4zlht"
Apr 16 18:33:19.154148 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:19.154119 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 18:33:19.154274 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:19.154254 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-kvcxt\""
Apr 16 18:33:19.158286 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:19.158249 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-4zlht"]
Apr 16 18:33:19.242171 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:19.242142 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/414da762-c744-460c-8983-4a538c9e63e7-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-4zlht\" (UID: \"414da762-c744-460c-8983-4a538c9e63e7\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-4zlht"
Apr 16 18:33:19.343332 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:19.343256 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/414da762-c744-460c-8983-4a538c9e63e7-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-4zlht\" (UID: \"414da762-c744-460c-8983-4a538c9e63e7\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-4zlht"
Apr 16 18:33:19.343456 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:33:19.343386 2578 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 16 18:33:19.343456 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:33:19.343452 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414da762-c744-460c-8983-4a538c9e63e7-monitoring-plugin-cert podName:414da762-c744-460c-8983-4a538c9e63e7 nodeName:}" failed. No retries permitted until 2026-04-16 18:33:19.843436493 +0000 UTC m=+176.630772731 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/414da762-c744-460c-8983-4a538c9e63e7-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-4zlht" (UID: "414da762-c744-460c-8983-4a538c9e63e7") : secret "monitoring-plugin-cert" not found
Apr 16 18:33:19.349339 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:19.349307 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-67b789b557-rnptf" event={"ID":"bd1da6bd-aefd-4703-afb9-ec982489f130","Type":"ContainerStarted","Data":"c35e4b4a2baaeeab20a180e47e04119bda1ea71d6b928761a335e4db5ab34207"}
Apr 16 18:33:19.364048 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:19.364028 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:33:19.847411 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:19.847374 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/414da762-c744-460c-8983-4a538c9e63e7-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-4zlht\" (UID: \"414da762-c744-460c-8983-4a538c9e63e7\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-4zlht"
Apr 16 18:33:19.847866 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:33:19.847510 2578 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 16 18:33:19.847866 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:33:19.847576 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/414da762-c744-460c-8983-4a538c9e63e7-monitoring-plugin-cert podName:414da762-c744-460c-8983-4a538c9e63e7 nodeName:}" failed. No retries permitted until 2026-04-16 18:33:20.847557583 +0000 UTC m=+177.634893832 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/414da762-c744-460c-8983-4a538c9e63e7-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-4zlht" (UID: "414da762-c744-460c-8983-4a538c9e63e7") : secret "monitoring-plugin-cert" not found
Apr 16 18:33:20.856709 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:20.856613 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/414da762-c744-460c-8983-4a538c9e63e7-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-4zlht\" (UID: \"414da762-c744-460c-8983-4a538c9e63e7\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-4zlht"
Apr 16 18:33:20.859393 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:20.859346 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/414da762-c744-460c-8983-4a538c9e63e7-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-4zlht\" (UID: \"414da762-c744-460c-8983-4a538c9e63e7\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-4zlht"
Apr 16 18:33:20.962208 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:20.962165 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-4zlht"
Apr 16 18:33:21.076801 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:21.076774 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-4zlht"]
Apr 16 18:33:21.080671 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:33:21.080635 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod414da762_c744_460c_8983_4a538c9e63e7.slice/crio-d348aaf123967207ad1cb7d6739e05c765c2817a4f567c45e75ebe896a86ef60 WatchSource:0}: Error finding container d348aaf123967207ad1cb7d6739e05c765c2817a4f567c45e75ebe896a86ef60: Status 404 returned error can't find the container with id d348aaf123967207ad1cb7d6739e05c765c2817a4f567c45e75ebe896a86ef60
Apr 16 18:33:21.357356 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:21.357323 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-67b789b557-rnptf" event={"ID":"bd1da6bd-aefd-4703-afb9-ec982489f130","Type":"ContainerStarted","Data":"31709d821086f42636a4067c4450ab72b3528492717db9ba3f643ab28854f198"}
Apr 16 18:33:21.358331 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:21.358309 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-4zlht" event={"ID":"414da762-c744-460c-8983-4a538c9e63e7","Type":"ContainerStarted","Data":"d348aaf123967207ad1cb7d6739e05c765c2817a4f567c45e75ebe896a86ef60"}
Apr 16 18:33:21.373678 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:21.373638 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-67b789b557-rnptf" podStartSLOduration=2.000959208 podStartE2EDuration="3.373626832s" podCreationTimestamp="2026-04-16 18:33:18 +0000 UTC" firstStartedPulling="2026-04-16 18:33:19.121643936 +0000 UTC m=+175.908980185" lastFinishedPulling="2026-04-16 18:33:20.494311573 +0000 UTC m=+177.281647809" observedRunningTime="2026-04-16 18:33:21.373229229 +0000 UTC m=+178.160565483" watchObservedRunningTime="2026-04-16 18:33:21.373626832 +0000 UTC m=+178.160963087"
Apr 16 18:33:23.365028 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:23.364998 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-4zlht" event={"ID":"414da762-c744-460c-8983-4a538c9e63e7","Type":"ContainerStarted","Data":"1242ee42d92838ab2721db455db11f0a7b9bef839ef532beba4af1a87e067115"}
Apr 16 18:33:23.365399 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:23.365161 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-4zlht"
Apr 16 18:33:23.369728 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:23.369708 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-4zlht"
Apr 16 18:33:23.381022 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:23.380981 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-4zlht" podStartSLOduration=3.055047954 podStartE2EDuration="4.380971976s" podCreationTimestamp="2026-04-16 18:33:19 +0000 UTC" firstStartedPulling="2026-04-16 18:33:21.082638673 +0000 UTC m=+177.869974911" lastFinishedPulling="2026-04-16 18:33:22.408562685 +0000 UTC m=+179.195898933" observedRunningTime="2026-04-16 18:33:23.37959577 +0000 UTC m=+180.166932025" watchObservedRunningTime="2026-04-16 18:33:23.380971976 +0000 UTC m=+180.168308230"
Apr 16 18:33:24.376716 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.376665 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq" podUID="eb710a5a-c4ac-49d9-90fc-dd7e54250a60" containerName="registry" containerID="cri-o://1733ce3fcdbeae218f3bb12f5f942702a21358f1c32dbc8a888dd01867da27e6" gracePeriod=30
Apr 16 18:33:24.605712 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.605690 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq"
Apr 16 18:33:24.686864 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.686797 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-installation-pull-secrets\") pod \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") "
Apr 16 18:33:24.686864 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.686833 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-bound-sa-token\") pod \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") "
Apr 16 18:33:24.686864 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.686862 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-image-registry-private-configuration\") pod \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") "
Apr 16 18:33:24.687875 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.687392 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8vdv\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-kube-api-access-k8vdv\") pod \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") "
Apr 16 18:33:24.687875 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.687448 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-trusted-ca\") pod \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") "
Apr 16 18:33:24.687875 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.687496 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-certificates\") pod \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") "
Apr 16 18:33:24.687875 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.687531 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-ca-trust-extracted\") pod \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") "
Apr 16 18:33:24.687875 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.687565 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls\") pod \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\" (UID: \"eb710a5a-c4ac-49d9-90fc-dd7e54250a60\") "
Apr 16 18:33:24.688759 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.688677 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "eb710a5a-c4ac-49d9-90fc-dd7e54250a60" (UID: "eb710a5a-c4ac-49d9-90fc-dd7e54250a60"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:33:24.689683 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.688932 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "eb710a5a-c4ac-49d9-90fc-dd7e54250a60" (UID: "eb710a5a-c4ac-49d9-90fc-dd7e54250a60"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:33:24.689683 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.689555 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-trusted-ca\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 18:33:24.689683 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.689585 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-certificates\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 18:33:24.693926 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.693886 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "eb710a5a-c4ac-49d9-90fc-dd7e54250a60" (UID: "eb710a5a-c4ac-49d9-90fc-dd7e54250a60"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:33:24.693926 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.693886 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-kube-api-access-k8vdv" (OuterVolumeSpecName: "kube-api-access-k8vdv") pod "eb710a5a-c4ac-49d9-90fc-dd7e54250a60" (UID: "eb710a5a-c4ac-49d9-90fc-dd7e54250a60"). InnerVolumeSpecName "kube-api-access-k8vdv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:33:24.694297 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.694003 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "eb710a5a-c4ac-49d9-90fc-dd7e54250a60" (UID: "eb710a5a-c4ac-49d9-90fc-dd7e54250a60"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:33:24.694500 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.694137 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "eb710a5a-c4ac-49d9-90fc-dd7e54250a60" (UID: "eb710a5a-c4ac-49d9-90fc-dd7e54250a60"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:33:24.694536 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.694514 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "eb710a5a-c4ac-49d9-90fc-dd7e54250a60" (UID: "eb710a5a-c4ac-49d9-90fc-dd7e54250a60"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:33:24.698598 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.698574 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "eb710a5a-c4ac-49d9-90fc-dd7e54250a60" (UID: "eb710a5a-c4ac-49d9-90fc-dd7e54250a60"). InnerVolumeSpecName "ca-trust-extracted".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:33:24.790843 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.790818 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-installation-pull-secrets\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:33:24.790843 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.790840 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-bound-sa-token\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:33:24.790962 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.790851 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-image-registry-private-configuration\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:33:24.790962 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.790860 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k8vdv\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-kube-api-access-k8vdv\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:33:24.790962 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.790869 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-ca-trust-extracted\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:33:24.790962 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:24.790878 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb710a5a-c4ac-49d9-90fc-dd7e54250a60-registry-tls\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 
16 18:33:25.373215 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:25.372014 2578 generic.go:358] "Generic (PLEG): container finished" podID="eb710a5a-c4ac-49d9-90fc-dd7e54250a60" containerID="1733ce3fcdbeae218f3bb12f5f942702a21358f1c32dbc8a888dd01867da27e6" exitCode=0 Apr 16 18:33:25.373215 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:25.372693 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq" Apr 16 18:33:25.373477 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:25.373256 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq" event={"ID":"eb710a5a-c4ac-49d9-90fc-dd7e54250a60","Type":"ContainerDied","Data":"1733ce3fcdbeae218f3bb12f5f942702a21358f1c32dbc8a888dd01867da27e6"} Apr 16 18:33:25.373658 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:25.373623 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b8555f68f-6s9wq" event={"ID":"eb710a5a-c4ac-49d9-90fc-dd7e54250a60","Type":"ContainerDied","Data":"943eda79b5bc76651e0247d8f85c4841ff0653ec7851c7fdf9b7498c42a925cf"} Apr 16 18:33:25.373784 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:25.373657 2578 scope.go:117] "RemoveContainer" containerID="1733ce3fcdbeae218f3bb12f5f942702a21358f1c32dbc8a888dd01867da27e6" Apr 16 18:33:25.383332 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:25.383287 2578 scope.go:117] "RemoveContainer" containerID="1733ce3fcdbeae218f3bb12f5f942702a21358f1c32dbc8a888dd01867da27e6" Apr 16 18:33:25.383569 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:33:25.383552 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1733ce3fcdbeae218f3bb12f5f942702a21358f1c32dbc8a888dd01867da27e6\": container with ID starting with 1733ce3fcdbeae218f3bb12f5f942702a21358f1c32dbc8a888dd01867da27e6 not found: ID does not exist" 
containerID="1733ce3fcdbeae218f3bb12f5f942702a21358f1c32dbc8a888dd01867da27e6" Apr 16 18:33:25.383607 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:25.383574 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1733ce3fcdbeae218f3bb12f5f942702a21358f1c32dbc8a888dd01867da27e6"} err="failed to get container status \"1733ce3fcdbeae218f3bb12f5f942702a21358f1c32dbc8a888dd01867da27e6\": rpc error: code = NotFound desc = could not find container \"1733ce3fcdbeae218f3bb12f5f942702a21358f1c32dbc8a888dd01867da27e6\": container with ID starting with 1733ce3fcdbeae218f3bb12f5f942702a21358f1c32dbc8a888dd01867da27e6 not found: ID does not exist" Apr 16 18:33:25.400416 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:25.400384 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5b8555f68f-6s9wq"] Apr 16 18:33:25.403636 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:25.403613 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5b8555f68f-6s9wq"] Apr 16 18:33:25.840972 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:25.840935 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb710a5a-c4ac-49d9-90fc-dd7e54250a60" path="/var/lib/kubelet/pods/eb710a5a-c4ac-49d9-90fc-dd7e54250a60/volumes" Apr 16 18:33:27.491657 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.491623 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76cd4fdfb8-5md7w"] Apr 16 18:33:27.492008 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.491875 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb710a5a-c4ac-49d9-90fc-dd7e54250a60" containerName="registry" Apr 16 18:33:27.492008 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.491885 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb710a5a-c4ac-49d9-90fc-dd7e54250a60" containerName="registry" Apr 16 18:33:27.492008 
ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.491956 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb710a5a-c4ac-49d9-90fc-dd7e54250a60" containerName="registry" Apr 16 18:33:27.496832 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.496811 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.499370 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.499343 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 18:33:27.499370 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.499344 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 18:33:27.499561 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.499342 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 18:33:27.500600 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.500581 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 18:33:27.500700 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.500584 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 18:33:27.500700 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.500617 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 18:33:27.500700 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.500651 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-kdmh8\"" Apr 16 18:33:27.500816 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.500662 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 18:33:27.503632 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.503612 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76cd4fdfb8-5md7w"] Apr 16 18:33:27.612939 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.612916 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-console-config\") pod \"console-76cd4fdfb8-5md7w\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.613051 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.612944 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7crkg\" (UniqueName: \"kubernetes.io/projected/e0f68098-b496-4807-aab1-fd88d4926ab0-kube-api-access-7crkg\") pod \"console-76cd4fdfb8-5md7w\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.613051 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.612963 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f68098-b496-4807-aab1-fd88d4926ab0-console-serving-cert\") pod \"console-76cd4fdfb8-5md7w\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.613051 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.612982 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-oauth-serving-cert\") pod \"console-76cd4fdfb8-5md7w\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.613173 
ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.613051 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-service-ca\") pod \"console-76cd4fdfb8-5md7w\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.613173 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.613077 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0f68098-b496-4807-aab1-fd88d4926ab0-console-oauth-config\") pod \"console-76cd4fdfb8-5md7w\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.713788 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.713756 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0f68098-b496-4807-aab1-fd88d4926ab0-console-oauth-config\") pod \"console-76cd4fdfb8-5md7w\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.713918 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.713822 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-console-config\") pod \"console-76cd4fdfb8-5md7w\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.713918 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.713841 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7crkg\" (UniqueName: \"kubernetes.io/projected/e0f68098-b496-4807-aab1-fd88d4926ab0-kube-api-access-7crkg\") pod \"console-76cd4fdfb8-5md7w\" (UID: 
\"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.713918 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.713859 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f68098-b496-4807-aab1-fd88d4926ab0-console-serving-cert\") pod \"console-76cd4fdfb8-5md7w\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.713918 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.713880 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-oauth-serving-cert\") pod \"console-76cd4fdfb8-5md7w\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.713918 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.713915 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-service-ca\") pod \"console-76cd4fdfb8-5md7w\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.714582 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.714558 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-service-ca\") pod \"console-76cd4fdfb8-5md7w\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.714702 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.714662 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-oauth-serving-cert\") pod 
\"console-76cd4fdfb8-5md7w\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.714760 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.714718 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-console-config\") pod \"console-76cd4fdfb8-5md7w\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.716227 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.716210 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f68098-b496-4807-aab1-fd88d4926ab0-console-serving-cert\") pod \"console-76cd4fdfb8-5md7w\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.716289 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.716237 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0f68098-b496-4807-aab1-fd88d4926ab0-console-oauth-config\") pod \"console-76cd4fdfb8-5md7w\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.721999 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.721979 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7crkg\" (UniqueName: \"kubernetes.io/projected/e0f68098-b496-4807-aab1-fd88d4926ab0-kube-api-access-7crkg\") pod \"console-76cd4fdfb8-5md7w\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") " pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.805962 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.805886 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76cd4fdfb8-5md7w" Apr 16 18:33:27.935402 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:27.935368 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76cd4fdfb8-5md7w"] Apr 16 18:33:27.938363 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:33:27.938333 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0f68098_b496_4807_aab1_fd88d4926ab0.slice/crio-07d975a79c20a6928bc4f5304edebfbfbb18d9459f099d9f62c3f03a9dfbed79 WatchSource:0}: Error finding container 07d975a79c20a6928bc4f5304edebfbfbb18d9459f099d9f62c3f03a9dfbed79: Status 404 returned error can't find the container with id 07d975a79c20a6928bc4f5304edebfbfbb18d9459f099d9f62c3f03a9dfbed79 Apr 16 18:33:28.383424 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:28.383388 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76cd4fdfb8-5md7w" event={"ID":"e0f68098-b496-4807-aab1-fd88d4926ab0","Type":"ContainerStarted","Data":"07d975a79c20a6928bc4f5304edebfbfbb18d9459f099d9f62c3f03a9dfbed79"} Apr 16 18:33:31.392775 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:31.392735 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76cd4fdfb8-5md7w" event={"ID":"e0f68098-b496-4807-aab1-fd88d4926ab0","Type":"ContainerStarted","Data":"db85a74edde621e50feb19869ac6fd5f3c1f557bd74a7b61c8037bd4a61a8556"} Apr 16 18:33:31.409945 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:31.409903 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76cd4fdfb8-5md7w" podStartSLOduration=1.75131883 podStartE2EDuration="4.409888192s" podCreationTimestamp="2026-04-16 18:33:27 +0000 UTC" firstStartedPulling="2026-04-16 18:33:27.940140286 +0000 UTC m=+184.727476520" lastFinishedPulling="2026-04-16 18:33:30.598709649 +0000 UTC m=+187.386045882" 
observedRunningTime="2026-04-16 18:33:31.409475467 +0000 UTC m=+188.196811722" watchObservedRunningTime="2026-04-16 18:33:31.409888192 +0000 UTC m=+188.197224446" Apr 16 18:33:36.694795 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.694765 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6778fb5c74-8f269"] Apr 16 18:33:36.698738 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.698720 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:33:36.707603 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.707584 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 18:33:36.709169 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.709150 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6778fb5c74-8f269"] Apr 16 18:33:36.793041 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.793013 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-trusted-ca-bundle\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:33:36.793041 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.793044 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbtt2\" (UniqueName: \"kubernetes.io/projected/e9188ba8-56f9-4437-a079-fa3404334317-kube-api-access-sbtt2\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:33:36.793233 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.793122 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e9188ba8-56f9-4437-a079-fa3404334317-console-serving-cert\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:33:36.793233 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.793149 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-service-ca\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:33:36.793302 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.793245 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-oauth-serving-cert\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:33:36.793302 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.793286 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-console-config\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:33:36.793361 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.793308 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e9188ba8-56f9-4437-a079-fa3404334317-console-oauth-config\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:33:36.893839 
ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.893808 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-console-config\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:33:36.893839 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.893842 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e9188ba8-56f9-4437-a079-fa3404334317-console-oauth-config\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:33:36.894025 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.893875 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-trusted-ca-bundle\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:33:36.894025 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.894012 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbtt2\" (UniqueName: \"kubernetes.io/projected/e9188ba8-56f9-4437-a079-fa3404334317-kube-api-access-sbtt2\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:33:36.894127 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.894111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e9188ba8-56f9-4437-a079-fa3404334317-console-serving-cert\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") 
" pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:33:36.894198 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.894150 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-service-ca\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:33:36.894253 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.894199 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-oauth-serving-cert\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:33:36.895199 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.895154 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-console-config\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:33:36.895320 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.895303 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-oauth-serving-cert\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:33:36.895320 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.895307 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-service-ca\") pod \"console-6778fb5c74-8f269\" (UID: 
\"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269"
Apr 16 18:33:36.896116 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.896094 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-trusted-ca-bundle\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269"
Apr 16 18:33:36.896800 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.896783 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e9188ba8-56f9-4437-a079-fa3404334317-console-oauth-config\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269"
Apr 16 18:33:36.896993 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.896976 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e9188ba8-56f9-4437-a079-fa3404334317-console-serving-cert\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269"
Apr 16 18:33:36.902016 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:36.901993 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbtt2\" (UniqueName: \"kubernetes.io/projected/e9188ba8-56f9-4437-a079-fa3404334317-kube-api-access-sbtt2\") pod \"console-6778fb5c74-8f269\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " pod="openshift-console/console-6778fb5c74-8f269"
Apr 16 18:33:37.007231 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:37.007195 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6778fb5c74-8f269"
Apr 16 18:33:37.122500 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:37.122471 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6778fb5c74-8f269"]
Apr 16 18:33:37.125570 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:33:37.125537 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9188ba8_56f9_4437_a079_fa3404334317.slice/crio-62f9504f7c9ab079b4ffcb6fb78d1cf055bcaf246b233a250d47761450b14720 WatchSource:0}: Error finding container 62f9504f7c9ab079b4ffcb6fb78d1cf055bcaf246b233a250d47761450b14720: Status 404 returned error can't find the container with id 62f9504f7c9ab079b4ffcb6fb78d1cf055bcaf246b233a250d47761450b14720
Apr 16 18:33:37.411387 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:37.411298 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6778fb5c74-8f269" event={"ID":"e9188ba8-56f9-4437-a079-fa3404334317","Type":"ContainerStarted","Data":"761b48b0620bc7001b2b4f6fc90f63db1a955948e16bc16723fcab186da61155"}
Apr 16 18:33:37.411387 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:37.411337 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6778fb5c74-8f269" event={"ID":"e9188ba8-56f9-4437-a079-fa3404334317","Type":"ContainerStarted","Data":"62f9504f7c9ab079b4ffcb6fb78d1cf055bcaf246b233a250d47761450b14720"}
Apr 16 18:33:37.428882 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:37.428838 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6778fb5c74-8f269" podStartSLOduration=1.428823922 podStartE2EDuration="1.428823922s" podCreationTimestamp="2026-04-16 18:33:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:33:37.427747075 +0000 UTC m=+194.215083332" watchObservedRunningTime="2026-04-16 18:33:37.428823922 +0000 UTC m=+194.216160177"
Apr 16 18:33:37.806810 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:37.806782 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76cd4fdfb8-5md7w"
Apr 16 18:33:37.807140 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:37.806818 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76cd4fdfb8-5md7w"
Apr 16 18:33:37.811352 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:37.811331 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76cd4fdfb8-5md7w"
Apr 16 18:33:38.417540 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:38.417499 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76cd4fdfb8-5md7w"
Apr 16 18:33:38.993990 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:38.993962 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:38.994335 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:38.994000 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:41.423203 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:41.423110 2578 generic.go:358] "Generic (PLEG): container finished" podID="d78602f6-4841-4d81-8a43-e0c53bc9137b" containerID="28064003cf49ee88a2bd1a920fd288081d1ab0565f0fb8273a77851d6ffcc6c8" exitCode=0
Apr 16 18:33:41.423203 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:41.423146 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7" event={"ID":"d78602f6-4841-4d81-8a43-e0c53bc9137b","Type":"ContainerDied","Data":"28064003cf49ee88a2bd1a920fd288081d1ab0565f0fb8273a77851d6ffcc6c8"}
Apr 16 18:33:41.423577 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:41.423413 2578 scope.go:117] "RemoveContainer" containerID="28064003cf49ee88a2bd1a920fd288081d1ab0565f0fb8273a77851d6ffcc6c8"
Apr 16 18:33:42.427117 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:42.427084 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-jlpz7" event={"ID":"d78602f6-4841-4d81-8a43-e0c53bc9137b","Type":"ContainerStarted","Data":"d33ffabcbd3cdb03ffa23a4803dc325132d2377ea53b6cb666c2cec416157906"}
Apr 16 18:33:47.007820 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:47.007783 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6778fb5c74-8f269"
Apr 16 18:33:47.008320 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:47.007866 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6778fb5c74-8f269"
Apr 16 18:33:47.012815 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:47.012793 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6778fb5c74-8f269"
Apr 16 18:33:47.443971 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:47.443884 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6778fb5c74-8f269"
Apr 16 18:33:47.488743 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:47.488711 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76cd4fdfb8-5md7w"]
Apr 16 18:33:58.998999 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:58.998969 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:33:59.002777 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:33:59.002756 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-67b789b557-rnptf"
Apr 16 18:34:08.997370 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:08.997339 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-dfzr4_bdf8d744-507e-4943-8866-e60d7c582151/cluster-monitoring-operator/0.log"
Apr 16 18:34:09.195231 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:09.195194 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-mcgzx_41a2f87e-530d-4207-8cc8-e7ee979357e0/kube-state-metrics/0.log"
Apr 16 18:34:09.394532 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:09.394452 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-mcgzx_41a2f87e-530d-4207-8cc8-e7ee979357e0/kube-rbac-proxy-main/0.log"
Apr 16 18:34:09.594665 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:09.594632 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-mcgzx_41a2f87e-530d-4207-8cc8-e7ee979357e0/kube-rbac-proxy-self/0.log"
Apr 16 18:34:09.794247 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:09.794216 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-67b789b557-rnptf_bd1da6bd-aefd-4703-afb9-ec982489f130/metrics-server/0.log"
Apr 16 18:34:09.994334 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:09.994305 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-4zlht_414da762-c744-460c-8983-4a538c9e63e7/monitoring-plugin/0.log"
Apr 16 18:34:10.793525 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:10.793482 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b6czb_758ed149-3d81-4eec-bdac-e7eeca35aecc/init-textfile/0.log"
Apr 16 18:34:10.994853 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:10.994822 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b6czb_758ed149-3d81-4eec-bdac-e7eeca35aecc/node-exporter/0.log"
Apr 16 18:34:11.194834 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:11.194756 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b6czb_758ed149-3d81-4eec-bdac-e7eeca35aecc/kube-rbac-proxy/0.log"
Apr 16 18:34:12.507531 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.507497 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76cd4fdfb8-5md7w" podUID="e0f68098-b496-4807-aab1-fd88d4926ab0" containerName="console" containerID="cri-o://db85a74edde621e50feb19869ac6fd5f3c1f557bd74a7b61c8037bd4a61a8556" gracePeriod=15
Apr 16 18:34:12.738862 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.738844 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76cd4fdfb8-5md7w_e0f68098-b496-4807-aab1-fd88d4926ab0/console/0.log"
Apr 16 18:34:12.738975 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.738904 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76cd4fdfb8-5md7w"
Apr 16 18:34:12.795665 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.795561 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f68098-b496-4807-aab1-fd88d4926ab0-console-serving-cert\") pod \"e0f68098-b496-4807-aab1-fd88d4926ab0\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") "
Apr 16 18:34:12.795825 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.795684 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-console-config\") pod \"e0f68098-b496-4807-aab1-fd88d4926ab0\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") "
Apr 16 18:34:12.795897 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.795809 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7crkg\" (UniqueName: \"kubernetes.io/projected/e0f68098-b496-4807-aab1-fd88d4926ab0-kube-api-access-7crkg\") pod \"e0f68098-b496-4807-aab1-fd88d4926ab0\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") "
Apr 16 18:34:12.795949 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.795894 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-service-ca\") pod \"e0f68098-b496-4807-aab1-fd88d4926ab0\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") "
Apr 16 18:34:12.795996 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.795952 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0f68098-b496-4807-aab1-fd88d4926ab0-console-oauth-config\") pod \"e0f68098-b496-4807-aab1-fd88d4926ab0\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") "
Apr 16 18:34:12.795996 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.795988 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-oauth-serving-cert\") pod \"e0f68098-b496-4807-aab1-fd88d4926ab0\" (UID: \"e0f68098-b496-4807-aab1-fd88d4926ab0\") "
Apr 16 18:34:12.796231 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.796202 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-console-config" (OuterVolumeSpecName: "console-config") pod "e0f68098-b496-4807-aab1-fd88d4926ab0" (UID: "e0f68098-b496-4807-aab1-fd88d4926ab0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:34:12.796390 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.796323 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-service-ca" (OuterVolumeSpecName: "service-ca") pod "e0f68098-b496-4807-aab1-fd88d4926ab0" (UID: "e0f68098-b496-4807-aab1-fd88d4926ab0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:34:12.796538 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.796518 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e0f68098-b496-4807-aab1-fd88d4926ab0" (UID: "e0f68098-b496-4807-aab1-fd88d4926ab0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:34:12.798045 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.798020 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0f68098-b496-4807-aab1-fd88d4926ab0-kube-api-access-7crkg" (OuterVolumeSpecName: "kube-api-access-7crkg") pod "e0f68098-b496-4807-aab1-fd88d4926ab0" (UID: "e0f68098-b496-4807-aab1-fd88d4926ab0"). InnerVolumeSpecName "kube-api-access-7crkg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:34:12.798275 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.798256 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f68098-b496-4807-aab1-fd88d4926ab0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e0f68098-b496-4807-aab1-fd88d4926ab0" (UID: "e0f68098-b496-4807-aab1-fd88d4926ab0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:34:12.798329 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.798279 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f68098-b496-4807-aab1-fd88d4926ab0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e0f68098-b496-4807-aab1-fd88d4926ab0" (UID: "e0f68098-b496-4807-aab1-fd88d4926ab0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:34:12.897472 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.897430 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f68098-b496-4807-aab1-fd88d4926ab0-console-serving-cert\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 18:34:12.897472 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.897458 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-console-config\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 18:34:12.897472 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.897476 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7crkg\" (UniqueName: \"kubernetes.io/projected/e0f68098-b496-4807-aab1-fd88d4926ab0-kube-api-access-7crkg\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 18:34:12.897683 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.897486 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-service-ca\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 18:34:12.897683 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.897495 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0f68098-b496-4807-aab1-fd88d4926ab0-console-oauth-config\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 18:34:12.897683 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:12.897504 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0f68098-b496-4807-aab1-fd88d4926ab0-oauth-serving-cert\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 18:34:13.511336 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:13.511308 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76cd4fdfb8-5md7w_e0f68098-b496-4807-aab1-fd88d4926ab0/console/0.log"
Apr 16 18:34:13.511757 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:13.511347 2578 generic.go:358] "Generic (PLEG): container finished" podID="e0f68098-b496-4807-aab1-fd88d4926ab0" containerID="db85a74edde621e50feb19869ac6fd5f3c1f557bd74a7b61c8037bd4a61a8556" exitCode=2
Apr 16 18:34:13.511757 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:13.511407 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76cd4fdfb8-5md7w"
Apr 16 18:34:13.511757 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:13.511409 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76cd4fdfb8-5md7w" event={"ID":"e0f68098-b496-4807-aab1-fd88d4926ab0","Type":"ContainerDied","Data":"db85a74edde621e50feb19869ac6fd5f3c1f557bd74a7b61c8037bd4a61a8556"}
Apr 16 18:34:13.511757 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:13.511502 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76cd4fdfb8-5md7w" event={"ID":"e0f68098-b496-4807-aab1-fd88d4926ab0","Type":"ContainerDied","Data":"07d975a79c20a6928bc4f5304edebfbfbb18d9459f099d9f62c3f03a9dfbed79"}
Apr 16 18:34:13.511757 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:13.511517 2578 scope.go:117] "RemoveContainer" containerID="db85a74edde621e50feb19869ac6fd5f3c1f557bd74a7b61c8037bd4a61a8556"
Apr 16 18:34:13.519727 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:13.519713 2578 scope.go:117] "RemoveContainer" containerID="db85a74edde621e50feb19869ac6fd5f3c1f557bd74a7b61c8037bd4a61a8556"
Apr 16 18:34:13.519990 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:34:13.519970 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db85a74edde621e50feb19869ac6fd5f3c1f557bd74a7b61c8037bd4a61a8556\": container with ID starting with db85a74edde621e50feb19869ac6fd5f3c1f557bd74a7b61c8037bd4a61a8556 not found: ID does not exist" containerID="db85a74edde621e50feb19869ac6fd5f3c1f557bd74a7b61c8037bd4a61a8556"
Apr 16 18:34:13.520028 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:13.519999 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db85a74edde621e50feb19869ac6fd5f3c1f557bd74a7b61c8037bd4a61a8556"} err="failed to get container status \"db85a74edde621e50feb19869ac6fd5f3c1f557bd74a7b61c8037bd4a61a8556\": rpc error: code = NotFound desc = could not find container \"db85a74edde621e50feb19869ac6fd5f3c1f557bd74a7b61c8037bd4a61a8556\": container with ID starting with db85a74edde621e50feb19869ac6fd5f3c1f557bd74a7b61c8037bd4a61a8556 not found: ID does not exist"
Apr 16 18:34:13.532435 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:13.532409 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76cd4fdfb8-5md7w"]
Apr 16 18:34:13.536013 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:13.535992 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76cd4fdfb8-5md7w"]
Apr 16 18:34:13.841050 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:13.840970 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0f68098-b496-4807-aab1-fd88d4926ab0" path="/var/lib/kubelet/pods/e0f68098-b496-4807-aab1-fd88d4926ab0/volumes"
Apr 16 18:34:13.996496 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:13.996467 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-7sgp9_c1e239f4-fc22-47cc-bda3-343f85242b88/prometheus-operator/0.log"
Apr 16 18:34:14.194627 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:14.194555 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-7sgp9_c1e239f4-fc22-47cc-bda3-343f85242b88/kube-rbac-proxy/0.log"
Apr 16 18:34:14.394820 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:14.394787 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-w6jqv_c4040798-ae88-4f3b-abb7-2af899225127/prometheus-operator-admission-webhook/0.log"
Apr 16 18:34:16.394972 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:16.394944 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6778fb5c74-8f269_e9188ba8-56f9-4437-a079-fa3404334317/console/0.log"
Apr 16 18:34:17.394602 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:17.394568 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-pb8c8_714c4beb-40ac-4478-80ff-d058fb5fd1a3/serve-healthcheck-canary/0.log"
Apr 16 18:34:35.583257 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:35.583220 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs\") pod \"network-metrics-daemon-f4smb\" (UID: \"b62474b5-9999-4dd6-83ae-96e3bc355df3\") " pod="openshift-multus/network-metrics-daemon-f4smb"
Apr 16 18:34:35.585454 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:35.585436 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b62474b5-9999-4dd6-83ae-96e3bc355df3-metrics-certs\") pod \"network-metrics-daemon-f4smb\" (UID: \"b62474b5-9999-4dd6-83ae-96e3bc355df3\") " pod="openshift-multus/network-metrics-daemon-f4smb"
Apr 16 18:34:35.841200 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:35.841117 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kxnrw\""
Apr 16 18:34:35.849285 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:35.849267 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f4smb"
Apr 16 18:34:35.965831 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:35.965799 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f4smb"]
Apr 16 18:34:35.968565 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:34:35.968525 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb62474b5_9999_4dd6_83ae_96e3bc355df3.slice/crio-4bbce2d5acc93214078f41deaf68dd86287165c975c403e9da326e3cddb79799 WatchSource:0}: Error finding container 4bbce2d5acc93214078f41deaf68dd86287165c975c403e9da326e3cddb79799: Status 404 returned error can't find the container with id 4bbce2d5acc93214078f41deaf68dd86287165c975c403e9da326e3cddb79799
Apr 16 18:34:36.580486 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:36.580452 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f4smb" event={"ID":"b62474b5-9999-4dd6-83ae-96e3bc355df3","Type":"ContainerStarted","Data":"4bbce2d5acc93214078f41deaf68dd86287165c975c403e9da326e3cddb79799"}
Apr 16 18:34:37.584845 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:37.584813 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f4smb" event={"ID":"b62474b5-9999-4dd6-83ae-96e3bc355df3","Type":"ContainerStarted","Data":"28b9ccabe9cd7d7e45507a6f3c2d6fd28a361a402817caeb385be53581c7c0b8"}
Apr 16 18:34:37.585310 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:37.584850 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f4smb" event={"ID":"b62474b5-9999-4dd6-83ae-96e3bc355df3","Type":"ContainerStarted","Data":"e7b5d371c22b0bdff9884abffc3714d3ca2d6e88912a3e20100dae44c488a051"}
Apr 16 18:34:37.603395 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:37.603347 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-f4smb" podStartSLOduration=253.719884285 podStartE2EDuration="4m14.603334079s" podCreationTimestamp="2026-04-16 18:30:23 +0000 UTC" firstStartedPulling="2026-04-16 18:34:35.970305851 +0000 UTC m=+252.757642088" lastFinishedPulling="2026-04-16 18:34:36.853755645 +0000 UTC m=+253.641091882" observedRunningTime="2026-04-16 18:34:37.602243532 +0000 UTC m=+254.389579788" watchObservedRunningTime="2026-04-16 18:34:37.603334079 +0000 UTC m=+254.390670333"
Apr 16 18:34:38.645390 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.645354 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6d557589b4-w5v7x"]
Apr 16 18:34:38.645817 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.645661 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0f68098-b496-4807-aab1-fd88d4926ab0" containerName="console"
Apr 16 18:34:38.645817 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.645676 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f68098-b496-4807-aab1-fd88d4926ab0" containerName="console"
Apr 16 18:34:38.645817 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.645751 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0f68098-b496-4807-aab1-fd88d4926ab0" containerName="console"
Apr 16 18:34:38.648919 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.648883 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.651768 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.651743 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-vltmj\""
Apr 16 18:34:38.651768 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.651755 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 18:34:38.652007 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.651761 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 18:34:38.652007 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.651840 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 18:34:38.652007 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.651976 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 18:34:38.652211 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.652116 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 18:34:38.657244 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.657226 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 18:34:38.659218 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.659171 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6d557589b4-w5v7x"]
Apr 16 18:34:38.806510 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.806474 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6472b575-f118-402f-b723-1a8ec3fdae51-serving-certs-ca-bundle\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.806510 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.806514 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6472b575-f118-402f-b723-1a8ec3fdae51-secret-telemeter-client\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.806736 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.806535 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6472b575-f118-402f-b723-1a8ec3fdae51-metrics-client-ca\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.806736 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.806612 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbqwg\" (UniqueName: \"kubernetes.io/projected/6472b575-f118-402f-b723-1a8ec3fdae51-kube-api-access-wbqwg\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.806736 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.806640 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6472b575-f118-402f-b723-1a8ec3fdae51-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.806736 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.806659 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6472b575-f118-402f-b723-1a8ec3fdae51-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.806736 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.806684 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6472b575-f118-402f-b723-1a8ec3fdae51-federate-client-tls\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.806928 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.806782 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6472b575-f118-402f-b723-1a8ec3fdae51-telemeter-client-tls\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.908558 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.907895 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbqwg\" (UniqueName: \"kubernetes.io/projected/6472b575-f118-402f-b723-1a8ec3fdae51-kube-api-access-wbqwg\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.908558 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.907947 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6472b575-f118-402f-b723-1a8ec3fdae51-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.908558 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.907977 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6472b575-f118-402f-b723-1a8ec3fdae51-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.908558 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.908006 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6472b575-f118-402f-b723-1a8ec3fdae51-federate-client-tls\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.908558 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.908049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6472b575-f118-402f-b723-1a8ec3fdae51-telemeter-client-tls\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.908558 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.908085 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6472b575-f118-402f-b723-1a8ec3fdae51-serving-certs-ca-bundle\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.908558 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.908115 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6472b575-f118-402f-b723-1a8ec3fdae51-secret-telemeter-client\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.908558 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.908137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6472b575-f118-402f-b723-1a8ec3fdae51-metrics-client-ca\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.909168 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.908971 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6472b575-f118-402f-b723-1a8ec3fdae51-metrics-client-ca\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.909301 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.909165 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6472b575-f118-402f-b723-1a8ec3fdae51-serving-certs-ca-bundle\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.909857 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.909805 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6472b575-f118-402f-b723-1a8ec3fdae51-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.911466 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.911369 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6472b575-f118-402f-b723-1a8ec3fdae51-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.911896 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.911879 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6472b575-f118-402f-b723-1a8ec3fdae51-secret-telemeter-client\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.911961 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.911948 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6472b575-f118-402f-b723-1a8ec3fdae51-telemeter-client-tls\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x"
Apr 16 18:34:38.912017 ip-10-0-139-33 kubenswrapper[2578]:
I0416 18:34:38.911964 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6472b575-f118-402f-b723-1a8ec3fdae51-federate-client-tls\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x" Apr 16 18:34:38.922495 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.922471 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbqwg\" (UniqueName: \"kubernetes.io/projected/6472b575-f118-402f-b723-1a8ec3fdae51-kube-api-access-wbqwg\") pod \"telemeter-client-6d557589b4-w5v7x\" (UID: \"6472b575-f118-402f-b723-1a8ec3fdae51\") " pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x" Apr 16 18:34:38.959732 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:38.959704 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x" Apr 16 18:34:39.103382 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:39.103331 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6d557589b4-w5v7x"] Apr 16 18:34:39.106380 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:34:39.106350 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6472b575_f118_402f_b723_1a8ec3fdae51.slice/crio-6eed18411cba932fbee7264b4658aa27d36f54a82f3cdbfdcb0033ca9186f2db WatchSource:0}: Error finding container 6eed18411cba932fbee7264b4658aa27d36f54a82f3cdbfdcb0033ca9186f2db: Status 404 returned error can't find the container with id 6eed18411cba932fbee7264b4658aa27d36f54a82f3cdbfdcb0033ca9186f2db Apr 16 18:34:39.592195 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:39.592145 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x" 
event={"ID":"6472b575-f118-402f-b723-1a8ec3fdae51","Type":"ContainerStarted","Data":"6eed18411cba932fbee7264b4658aa27d36f54a82f3cdbfdcb0033ca9186f2db"} Apr 16 18:34:41.600117 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:41.600077 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x" event={"ID":"6472b575-f118-402f-b723-1a8ec3fdae51","Type":"ContainerStarted","Data":"26029f84d7ab9bc80333cfcaf5b164c23f16659e8c5f2456f895e2f7f8ce7e57"} Apr 16 18:34:42.604396 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:42.604356 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x" event={"ID":"6472b575-f118-402f-b723-1a8ec3fdae51","Type":"ContainerStarted","Data":"38a50aa8162b564772d0cc3dc9457d4f9eaa92baff091e9dcdaf6c4e5810a015"} Apr 16 18:34:42.604396 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:42.604398 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x" event={"ID":"6472b575-f118-402f-b723-1a8ec3fdae51","Type":"ContainerStarted","Data":"ffd3872c77da22d9c1ae6cf649e2c826fc045b973292725cd4396859d4cd1ffe"} Apr 16 18:34:42.626621 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:42.626566 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6d557589b4-w5v7x" podStartSLOduration=2.065875653 podStartE2EDuration="4.626552334s" podCreationTimestamp="2026-04-16 18:34:38 +0000 UTC" firstStartedPulling="2026-04-16 18:34:39.109898306 +0000 UTC m=+255.897234541" lastFinishedPulling="2026-04-16 18:34:41.670574984 +0000 UTC m=+258.457911222" observedRunningTime="2026-04-16 18:34:42.625143429 +0000 UTC m=+259.412479684" watchObservedRunningTime="2026-04-16 18:34:42.626552334 +0000 UTC m=+259.413888591" Apr 16 18:34:43.204430 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.204397 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-579f8479d6-t9b6r"] Apr 16 18:34:43.207771 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.207753 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.219570 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.219543 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-579f8479d6-t9b6r"] Apr 16 18:34:43.342747 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.342719 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-config\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.342875 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.342751 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t22c\" (UniqueName: \"kubernetes.io/projected/c5d7c692-c58a-4c01-8726-225a5e30f2d4-kube-api-access-5t22c\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.342875 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.342778 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-oauth-config\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.342875 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.342859 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-service-ca\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.342980 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.342896 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-trusted-ca-bundle\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.342980 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.342917 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-serving-cert\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.342980 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.342941 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-oauth-serving-cert\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.443570 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.443538 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-trusted-ca-bundle\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.443570 ip-10-0-139-33 kubenswrapper[2578]: I0416 
18:34:43.443576 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-serving-cert\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.443769 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.443608 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-oauth-serving-cert\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.443769 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.443647 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-config\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.443769 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.443687 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5t22c\" (UniqueName: \"kubernetes.io/projected/c5d7c692-c58a-4c01-8726-225a5e30f2d4-kube-api-access-5t22c\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.443769 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.443730 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-oauth-config\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " 
pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.443947 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.443802 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-service-ca\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.444505 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.444477 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-oauth-serving-cert\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.444633 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.444577 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-service-ca\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.444633 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.444597 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-trusted-ca-bundle\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.444633 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.444617 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-config\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") 
" pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.446045 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.446024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-oauth-config\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.446160 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.446142 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-serving-cert\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.451811 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.451790 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t22c\" (UniqueName: \"kubernetes.io/projected/c5d7c692-c58a-4c01-8726-225a5e30f2d4-kube-api-access-5t22c\") pod \"console-579f8479d6-t9b6r\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.516290 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.516259 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:43.630437 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:43.630407 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-579f8479d6-t9b6r"] Apr 16 18:34:44.611438 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:44.611399 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579f8479d6-t9b6r" event={"ID":"c5d7c692-c58a-4c01-8726-225a5e30f2d4","Type":"ContainerStarted","Data":"c397e3093ca544026d9ae8f4aac75e1d2c0284495041525d18f1d1e63fa27ecd"} Apr 16 18:34:44.611593 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:44.611443 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579f8479d6-t9b6r" event={"ID":"c5d7c692-c58a-4c01-8726-225a5e30f2d4","Type":"ContainerStarted","Data":"0a7f4586b0895b9912b41e308ac8c3c8e69f22c0888662f802fe80525742862c"} Apr 16 18:34:44.628443 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:44.628404 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-579f8479d6-t9b6r" podStartSLOduration=1.628389886 podStartE2EDuration="1.628389886s" podCreationTimestamp="2026-04-16 18:34:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:34:44.628019883 +0000 UTC m=+261.415356129" watchObservedRunningTime="2026-04-16 18:34:44.628389886 +0000 UTC m=+261.415726140" Apr 16 18:34:53.517113 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:53.517065 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:53.517113 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:53.517116 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:53.521752 ip-10-0-139-33 kubenswrapper[2578]: 
I0416 18:34:53.521726 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:53.641589 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:53.641560 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:34:53.687440 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:34:53.687396 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6778fb5c74-8f269"] Apr 16 18:35:18.707461 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:18.707405 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6778fb5c74-8f269" podUID="e9188ba8-56f9-4437-a079-fa3404334317" containerName="console" containerID="cri-o://761b48b0620bc7001b2b4f6fc90f63db1a955948e16bc16723fcab186da61155" gracePeriod=15 Apr 16 18:35:18.931281 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:18.931260 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6778fb5c74-8f269_e9188ba8-56f9-4437-a079-fa3404334317/console/0.log" Apr 16 18:35:18.931404 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:18.931332 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:35:19.015278 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.015249 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-oauth-serving-cert\") pod \"e9188ba8-56f9-4437-a079-fa3404334317\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " Apr 16 18:35:19.015460 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.015285 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e9188ba8-56f9-4437-a079-fa3404334317-console-oauth-config\") pod \"e9188ba8-56f9-4437-a079-fa3404334317\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " Apr 16 18:35:19.015460 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.015344 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-trusted-ca-bundle\") pod \"e9188ba8-56f9-4437-a079-fa3404334317\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " Apr 16 18:35:19.015460 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.015360 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-service-ca\") pod \"e9188ba8-56f9-4437-a079-fa3404334317\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " Apr 16 18:35:19.015460 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.015380 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbtt2\" (UniqueName: \"kubernetes.io/projected/e9188ba8-56f9-4437-a079-fa3404334317-kube-api-access-sbtt2\") pod \"e9188ba8-56f9-4437-a079-fa3404334317\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " Apr 16 18:35:19.015460 
ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.015408 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-console-config\") pod \"e9188ba8-56f9-4437-a079-fa3404334317\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " Apr 16 18:35:19.015460 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.015447 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e9188ba8-56f9-4437-a079-fa3404334317-console-serving-cert\") pod \"e9188ba8-56f9-4437-a079-fa3404334317\" (UID: \"e9188ba8-56f9-4437-a079-fa3404334317\") " Apr 16 18:35:19.015836 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.015804 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e9188ba8-56f9-4437-a079-fa3404334317" (UID: "e9188ba8-56f9-4437-a079-fa3404334317"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:35:19.015945 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.015820 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-service-ca" (OuterVolumeSpecName: "service-ca") pod "e9188ba8-56f9-4437-a079-fa3404334317" (UID: "e9188ba8-56f9-4437-a079-fa3404334317"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:35:19.015945 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.015846 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e9188ba8-56f9-4437-a079-fa3404334317" (UID: "e9188ba8-56f9-4437-a079-fa3404334317"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:35:19.015945 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.015920 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-console-config" (OuterVolumeSpecName: "console-config") pod "e9188ba8-56f9-4437-a079-fa3404334317" (UID: "e9188ba8-56f9-4437-a079-fa3404334317"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:35:19.017594 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.017565 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9188ba8-56f9-4437-a079-fa3404334317-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e9188ba8-56f9-4437-a079-fa3404334317" (UID: "e9188ba8-56f9-4437-a079-fa3404334317"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:35:19.017594 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.017574 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9188ba8-56f9-4437-a079-fa3404334317-kube-api-access-sbtt2" (OuterVolumeSpecName: "kube-api-access-sbtt2") pod "e9188ba8-56f9-4437-a079-fa3404334317" (UID: "e9188ba8-56f9-4437-a079-fa3404334317"). InnerVolumeSpecName "kube-api-access-sbtt2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:35:19.017722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.017593 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9188ba8-56f9-4437-a079-fa3404334317-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e9188ba8-56f9-4437-a079-fa3404334317" (UID: "e9188ba8-56f9-4437-a079-fa3404334317"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:35:19.116844 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.116811 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-oauth-serving-cert\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:35:19.116844 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.116838 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e9188ba8-56f9-4437-a079-fa3404334317-console-oauth-config\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:35:19.116844 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.116848 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-trusted-ca-bundle\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:35:19.117061 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.116862 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-service-ca\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:35:19.117061 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.116871 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sbtt2\" (UniqueName: 
\"kubernetes.io/projected/e9188ba8-56f9-4437-a079-fa3404334317-kube-api-access-sbtt2\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:35:19.117061 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.116881 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e9188ba8-56f9-4437-a079-fa3404334317-console-config\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:35:19.117061 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.116889 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e9188ba8-56f9-4437-a079-fa3404334317-console-serving-cert\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:35:19.707245 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.707218 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6778fb5c74-8f269_e9188ba8-56f9-4437-a079-fa3404334317/console/0.log" Apr 16 18:35:19.707428 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.707255 2578 generic.go:358] "Generic (PLEG): container finished" podID="e9188ba8-56f9-4437-a079-fa3404334317" containerID="761b48b0620bc7001b2b4f6fc90f63db1a955948e16bc16723fcab186da61155" exitCode=2 Apr 16 18:35:19.707428 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.707286 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6778fb5c74-8f269" event={"ID":"e9188ba8-56f9-4437-a079-fa3404334317","Type":"ContainerDied","Data":"761b48b0620bc7001b2b4f6fc90f63db1a955948e16bc16723fcab186da61155"} Apr 16 18:35:19.707428 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.707329 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6778fb5c74-8f269" event={"ID":"e9188ba8-56f9-4437-a079-fa3404334317","Type":"ContainerDied","Data":"62f9504f7c9ab079b4ffcb6fb78d1cf055bcaf246b233a250d47761450b14720"} Apr 16 18:35:19.707428 ip-10-0-139-33 
kubenswrapper[2578]: I0416 18:35:19.707333 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6778fb5c74-8f269" Apr 16 18:35:19.707428 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.707344 2578 scope.go:117] "RemoveContainer" containerID="761b48b0620bc7001b2b4f6fc90f63db1a955948e16bc16723fcab186da61155" Apr 16 18:35:19.716067 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.716052 2578 scope.go:117] "RemoveContainer" containerID="761b48b0620bc7001b2b4f6fc90f63db1a955948e16bc16723fcab186da61155" Apr 16 18:35:19.716347 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:35:19.716330 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"761b48b0620bc7001b2b4f6fc90f63db1a955948e16bc16723fcab186da61155\": container with ID starting with 761b48b0620bc7001b2b4f6fc90f63db1a955948e16bc16723fcab186da61155 not found: ID does not exist" containerID="761b48b0620bc7001b2b4f6fc90f63db1a955948e16bc16723fcab186da61155" Apr 16 18:35:19.716400 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.716356 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"761b48b0620bc7001b2b4f6fc90f63db1a955948e16bc16723fcab186da61155"} err="failed to get container status \"761b48b0620bc7001b2b4f6fc90f63db1a955948e16bc16723fcab186da61155\": rpc error: code = NotFound desc = could not find container \"761b48b0620bc7001b2b4f6fc90f63db1a955948e16bc16723fcab186da61155\": container with ID starting with 761b48b0620bc7001b2b4f6fc90f63db1a955948e16bc16723fcab186da61155 not found: ID does not exist" Apr 16 18:35:19.727955 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.727931 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6778fb5c74-8f269"] Apr 16 18:35:19.732119 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.732100 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-6778fb5c74-8f269"] Apr 16 18:35:19.840838 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:19.840805 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9188ba8-56f9-4437-a079-fa3404334317" path="/var/lib/kubelet/pods/e9188ba8-56f9-4437-a079-fa3404334317/volumes" Apr 16 18:35:23.708427 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:23.708390 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 18:35:23.708906 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:23.708610 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 18:35:23.715315 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:35:23.715298 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:36:18.189022 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.188929 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b49dcbf9f-npgkc"] Apr 16 18:36:18.189535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.189267 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9188ba8-56f9-4437-a079-fa3404334317" containerName="console" Apr 16 18:36:18.189535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.189280 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9188ba8-56f9-4437-a079-fa3404334317" containerName="console" Apr 16 18:36:18.189535 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.189365 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9188ba8-56f9-4437-a079-fa3404334317" containerName="console" Apr 16 18:36:18.192269 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.192251 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.203080 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.203057 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b49dcbf9f-npgkc"] Apr 16 18:36:18.376667 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.376629 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-oauth-serving-cert\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.376856 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.376685 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-oauth-config\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.376856 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.376717 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-service-ca\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.376856 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.376743 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-config\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.376856 
ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.376763 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-serving-cert\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.376856 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.376779 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-trusted-ca-bundle\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.376856 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.376798 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg4nk\" (UniqueName: \"kubernetes.io/projected/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-kube-api-access-dg4nk\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.477591 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.477554 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-oauth-serving-cert\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.477774 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.477608 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-oauth-config\") pod 
\"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.477774 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.477643 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-service-ca\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.477774 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.477670 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-config\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.477774 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.477696 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-serving-cert\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.477774 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.477721 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-trusted-ca-bundle\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.478042 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.477861 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg4nk\" (UniqueName: 
\"kubernetes.io/projected/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-kube-api-access-dg4nk\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.478378 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.478354 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-oauth-serving-cert\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.478474 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.478418 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-service-ca\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.478829 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.478806 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-config\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.478873 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.478817 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-trusted-ca-bundle\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.480087 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.480066 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-oauth-config\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.480277 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.480260 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-serving-cert\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.487728 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.487696 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg4nk\" (UniqueName: \"kubernetes.io/projected/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-kube-api-access-dg4nk\") pod \"console-7b49dcbf9f-npgkc\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.501571 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.501550 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:18.622262 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.622236 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b49dcbf9f-npgkc"] Apr 16 18:36:18.624801 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:36:18.624777 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b0ae9c_d0d2_465b_8b06_9b5710035c19.slice/crio-b4a2b09a71b38f62cf3f602bdef46d06c16f38d84616207f47e0e50d7a675cb0 WatchSource:0}: Error finding container b4a2b09a71b38f62cf3f602bdef46d06c16f38d84616207f47e0e50d7a675cb0: Status 404 returned error can't find the container with id b4a2b09a71b38f62cf3f602bdef46d06c16f38d84616207f47e0e50d7a675cb0 Apr 16 18:36:18.626628 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.626612 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:36:18.874656 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.874560 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b49dcbf9f-npgkc" event={"ID":"a1b0ae9c-d0d2-465b-8b06-9b5710035c19","Type":"ContainerStarted","Data":"26e3d10b9b4e2e14d5150bdf5c4f1e395d985dd24eac51481af2250897ac19e2"} Apr 16 18:36:18.874656 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.874595 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b49dcbf9f-npgkc" event={"ID":"a1b0ae9c-d0d2-465b-8b06-9b5710035c19","Type":"ContainerStarted","Data":"b4a2b09a71b38f62cf3f602bdef46d06c16f38d84616207f47e0e50d7a675cb0"} Apr 16 18:36:18.892487 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:18.892439 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b49dcbf9f-npgkc" podStartSLOduration=0.892423901 podStartE2EDuration="892.423901ms" podCreationTimestamp="2026-04-16 18:36:18 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:36:18.890926722 +0000 UTC m=+355.678262991" watchObservedRunningTime="2026-04-16 18:36:18.892423901 +0000 UTC m=+355.679760157" Apr 16 18:36:28.502034 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:28.501991 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:28.502034 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:28.502034 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:28.506982 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:28.506952 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:28.910973 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:28.910888 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:36:28.957014 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:28.956978 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-579f8479d6-t9b6r"] Apr 16 18:36:35.417557 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:35.417519 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-dgp56"] Apr 16 18:36:35.422037 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:35.422017 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dgp56" Apr 16 18:36:35.424657 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:35.424634 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:36:35.429101 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:35.428992 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dgp56"] Apr 16 18:36:35.498334 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:35.498292 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/38d2b4bc-50cc-4d31-b125-9af46f137f46-dbus\") pod \"global-pull-secret-syncer-dgp56\" (UID: \"38d2b4bc-50cc-4d31-b125-9af46f137f46\") " pod="kube-system/global-pull-secret-syncer-dgp56" Apr 16 18:36:35.498334 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:35.498339 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38d2b4bc-50cc-4d31-b125-9af46f137f46-original-pull-secret\") pod \"global-pull-secret-syncer-dgp56\" (UID: \"38d2b4bc-50cc-4d31-b125-9af46f137f46\") " pod="kube-system/global-pull-secret-syncer-dgp56" Apr 16 18:36:35.498561 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:35.498382 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/38d2b4bc-50cc-4d31-b125-9af46f137f46-kubelet-config\") pod \"global-pull-secret-syncer-dgp56\" (UID: \"38d2b4bc-50cc-4d31-b125-9af46f137f46\") " pod="kube-system/global-pull-secret-syncer-dgp56" Apr 16 18:36:35.598915 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:35.598884 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/38d2b4bc-50cc-4d31-b125-9af46f137f46-dbus\") pod \"global-pull-secret-syncer-dgp56\" (UID: \"38d2b4bc-50cc-4d31-b125-9af46f137f46\") " pod="kube-system/global-pull-secret-syncer-dgp56" Apr 16 18:36:35.599038 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:35.598924 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38d2b4bc-50cc-4d31-b125-9af46f137f46-original-pull-secret\") pod \"global-pull-secret-syncer-dgp56\" (UID: \"38d2b4bc-50cc-4d31-b125-9af46f137f46\") " pod="kube-system/global-pull-secret-syncer-dgp56" Apr 16 18:36:35.599038 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:35.598966 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/38d2b4bc-50cc-4d31-b125-9af46f137f46-kubelet-config\") pod \"global-pull-secret-syncer-dgp56\" (UID: \"38d2b4bc-50cc-4d31-b125-9af46f137f46\") " pod="kube-system/global-pull-secret-syncer-dgp56" Apr 16 18:36:35.599115 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:35.599050 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/38d2b4bc-50cc-4d31-b125-9af46f137f46-kubelet-config\") pod \"global-pull-secret-syncer-dgp56\" (UID: \"38d2b4bc-50cc-4d31-b125-9af46f137f46\") " pod="kube-system/global-pull-secret-syncer-dgp56" Apr 16 18:36:35.599115 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:35.599073 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/38d2b4bc-50cc-4d31-b125-9af46f137f46-dbus\") pod \"global-pull-secret-syncer-dgp56\" (UID: \"38d2b4bc-50cc-4d31-b125-9af46f137f46\") " pod="kube-system/global-pull-secret-syncer-dgp56" Apr 16 18:36:35.601291 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:35.601265 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38d2b4bc-50cc-4d31-b125-9af46f137f46-original-pull-secret\") pod \"global-pull-secret-syncer-dgp56\" (UID: \"38d2b4bc-50cc-4d31-b125-9af46f137f46\") " pod="kube-system/global-pull-secret-syncer-dgp56" Apr 16 18:36:35.731967 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:35.731930 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dgp56" Apr 16 18:36:35.854004 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:35.853971 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dgp56"] Apr 16 18:36:35.857424 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:36:35.857396 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38d2b4bc_50cc_4d31_b125_9af46f137f46.slice/crio-8a65b1cd1611dd7599a75375c5fbb233e49f08b08655d133b53c2bfd0eb0c435 WatchSource:0}: Error finding container 8a65b1cd1611dd7599a75375c5fbb233e49f08b08655d133b53c2bfd0eb0c435: Status 404 returned error can't find the container with id 8a65b1cd1611dd7599a75375c5fbb233e49f08b08655d133b53c2bfd0eb0c435 Apr 16 18:36:35.928523 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:35.928486 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dgp56" event={"ID":"38d2b4bc-50cc-4d31-b125-9af46f137f46","Type":"ContainerStarted","Data":"8a65b1cd1611dd7599a75375c5fbb233e49f08b08655d133b53c2bfd0eb0c435"} Apr 16 18:36:39.944470 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:39.944375 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dgp56" event={"ID":"38d2b4bc-50cc-4d31-b125-9af46f137f46","Type":"ContainerStarted","Data":"e4d62fd9ecf79a81d96c153f985260b5d35f34f0216ff460c1381f1733715249"} Apr 16 18:36:39.959827 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:39.959771 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dgp56" podStartSLOduration=1.186612395 podStartE2EDuration="4.959755301s" podCreationTimestamp="2026-04-16 18:36:35 +0000 UTC" firstStartedPulling="2026-04-16 18:36:35.859343035 +0000 UTC m=+372.646679272" lastFinishedPulling="2026-04-16 18:36:39.632485942 +0000 UTC m=+376.419822178" observedRunningTime="2026-04-16 18:36:39.95829794 +0000 UTC m=+376.745634209" watchObservedRunningTime="2026-04-16 18:36:39.959755301 +0000 UTC m=+376.747091606" Apr 16 18:36:53.979459 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:53.979400 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-579f8479d6-t9b6r" podUID="c5d7c692-c58a-4c01-8726-225a5e30f2d4" containerName="console" containerID="cri-o://c397e3093ca544026d9ae8f4aac75e1d2c0284495041525d18f1d1e63fa27ecd" gracePeriod=15 Apr 16 18:36:54.215194 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.215160 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-579f8479d6-t9b6r_c5d7c692-c58a-4c01-8726-225a5e30f2d4/console/0.log" Apr 16 18:36:54.215318 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.215235 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:36:54.225163 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.225130 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-serving-cert\") pod \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " Apr 16 18:36:54.225163 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.225187 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-oauth-serving-cert\") pod \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " Apr 16 18:36:54.225374 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.225209 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-service-ca\") pod \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " Apr 16 18:36:54.225374 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.225243 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t22c\" (UniqueName: \"kubernetes.io/projected/c5d7c692-c58a-4c01-8726-225a5e30f2d4-kube-api-access-5t22c\") pod \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " Apr 16 18:36:54.225374 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.225312 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-trusted-ca-bundle\") pod \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " Apr 16 18:36:54.225374 
ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.225350 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-config\") pod \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " Apr 16 18:36:54.225782 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.225749 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-service-ca" (OuterVolumeSpecName: "service-ca") pod "c5d7c692-c58a-4c01-8726-225a5e30f2d4" (UID: "c5d7c692-c58a-4c01-8726-225a5e30f2d4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:36:54.225782 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.225767 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c5d7c692-c58a-4c01-8726-225a5e30f2d4" (UID: "c5d7c692-c58a-4c01-8726-225a5e30f2d4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:36:54.225955 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.225791 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c5d7c692-c58a-4c01-8726-225a5e30f2d4" (UID: "c5d7c692-c58a-4c01-8726-225a5e30f2d4"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:36:54.225955 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.225884 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-config" (OuterVolumeSpecName: "console-config") pod "c5d7c692-c58a-4c01-8726-225a5e30f2d4" (UID: "c5d7c692-c58a-4c01-8726-225a5e30f2d4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:36:54.227418 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.227396 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5d7c692-c58a-4c01-8726-225a5e30f2d4-kube-api-access-5t22c" (OuterVolumeSpecName: "kube-api-access-5t22c") pod "c5d7c692-c58a-4c01-8726-225a5e30f2d4" (UID: "c5d7c692-c58a-4c01-8726-225a5e30f2d4"). InnerVolumeSpecName "kube-api-access-5t22c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:36:54.227418 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.227408 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c5d7c692-c58a-4c01-8726-225a5e30f2d4" (UID: "c5d7c692-c58a-4c01-8726-225a5e30f2d4"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:36:54.325888 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.325850 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-oauth-config\") pod \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\" (UID: \"c5d7c692-c58a-4c01-8726-225a5e30f2d4\") " Apr 16 18:36:54.326060 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.326014 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5t22c\" (UniqueName: \"kubernetes.io/projected/c5d7c692-c58a-4c01-8726-225a5e30f2d4-kube-api-access-5t22c\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:36:54.326060 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.326025 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-trusted-ca-bundle\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:36:54.326060 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.326035 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-config\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:36:54.326060 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.326043 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-serving-cert\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:36:54.326060 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.326053 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-oauth-serving-cert\") on node 
\"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:36:54.326238 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.326063 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5d7c692-c58a-4c01-8726-225a5e30f2d4-service-ca\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:36:54.327942 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.327917 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c5d7c692-c58a-4c01-8726-225a5e30f2d4" (UID: "c5d7c692-c58a-4c01-8726-225a5e30f2d4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:36:54.426412 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.426378 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5d7c692-c58a-4c01-8726-225a5e30f2d4-console-oauth-config\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:36:54.992870 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.992844 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-579f8479d6-t9b6r_c5d7c692-c58a-4c01-8726-225a5e30f2d4/console/0.log" Apr 16 18:36:54.993235 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.992883 2578 generic.go:358] "Generic (PLEG): container finished" podID="c5d7c692-c58a-4c01-8726-225a5e30f2d4" containerID="c397e3093ca544026d9ae8f4aac75e1d2c0284495041525d18f1d1e63fa27ecd" exitCode=2 Apr 16 18:36:54.993235 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.992913 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579f8479d6-t9b6r" event={"ID":"c5d7c692-c58a-4c01-8726-225a5e30f2d4","Type":"ContainerDied","Data":"c397e3093ca544026d9ae8f4aac75e1d2c0284495041525d18f1d1e63fa27ecd"} Apr 16 
18:36:54.993235 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.992958 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579f8479d6-t9b6r" event={"ID":"c5d7c692-c58a-4c01-8726-225a5e30f2d4","Type":"ContainerDied","Data":"0a7f4586b0895b9912b41e308ac8c3c8e69f22c0888662f802fe80525742862c"} Apr 16 18:36:54.993235 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.992960 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-579f8479d6-t9b6r" Apr 16 18:36:54.993235 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:54.992975 2578 scope.go:117] "RemoveContainer" containerID="c397e3093ca544026d9ae8f4aac75e1d2c0284495041525d18f1d1e63fa27ecd" Apr 16 18:36:55.000902 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:55.000880 2578 scope.go:117] "RemoveContainer" containerID="c397e3093ca544026d9ae8f4aac75e1d2c0284495041525d18f1d1e63fa27ecd" Apr 16 18:36:55.001168 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:36:55.001148 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c397e3093ca544026d9ae8f4aac75e1d2c0284495041525d18f1d1e63fa27ecd\": container with ID starting with c397e3093ca544026d9ae8f4aac75e1d2c0284495041525d18f1d1e63fa27ecd not found: ID does not exist" containerID="c397e3093ca544026d9ae8f4aac75e1d2c0284495041525d18f1d1e63fa27ecd" Apr 16 18:36:55.001251 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:55.001196 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c397e3093ca544026d9ae8f4aac75e1d2c0284495041525d18f1d1e63fa27ecd"} err="failed to get container status \"c397e3093ca544026d9ae8f4aac75e1d2c0284495041525d18f1d1e63fa27ecd\": rpc error: code = NotFound desc = could not find container \"c397e3093ca544026d9ae8f4aac75e1d2c0284495041525d18f1d1e63fa27ecd\": container with ID starting with c397e3093ca544026d9ae8f4aac75e1d2c0284495041525d18f1d1e63fa27ecd 
not found: ID does not exist"
Apr 16 18:36:55.029072 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:55.029046 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-579f8479d6-t9b6r"]
Apr 16 18:36:55.038480 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:55.038456 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-579f8479d6-t9b6r"]
Apr 16 18:36:55.841807 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:36:55.841774 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5d7c692-c58a-4c01-8726-225a5e30f2d4" path="/var/lib/kubelet/pods/c5d7c692-c58a-4c01-8726-225a5e30f2d4/volumes"
Apr 16 18:37:01.359970 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.359939 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f"]
Apr 16 18:37:01.360340 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.360228 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5d7c692-c58a-4c01-8726-225a5e30f2d4" containerName="console"
Apr 16 18:37:01.360340 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.360239 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d7c692-c58a-4c01-8726-225a5e30f2d4" containerName="console"
Apr 16 18:37:01.360340 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.360311 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5d7c692-c58a-4c01-8726-225a5e30f2d4" containerName="console"
Apr 16 18:37:01.364880 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.364862 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f"
Apr 16 18:37:01.367710 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.367691 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-zlnl5\""
Apr 16 18:37:01.367710 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.367698 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 18:37:01.367842 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.367690 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 18:37:01.370747 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.370728 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f"]
Apr 16 18:37:01.380107 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.380085 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/21722f32-f2c8-4f47-a05a-1c364f39a599-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f\" (UID: \"21722f32-f2c8-4f47-a05a-1c364f39a599\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f"
Apr 16 18:37:01.380257 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.380137 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/21722f32-f2c8-4f47-a05a-1c364f39a599-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f\" (UID: \"21722f32-f2c8-4f47-a05a-1c364f39a599\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f"
Apr 16 18:37:01.380257 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.380187 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrvrj\" (UniqueName: \"kubernetes.io/projected/21722f32-f2c8-4f47-a05a-1c364f39a599-kube-api-access-lrvrj\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f\" (UID: \"21722f32-f2c8-4f47-a05a-1c364f39a599\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f"
Apr 16 18:37:01.481302 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.481267 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrvrj\" (UniqueName: \"kubernetes.io/projected/21722f32-f2c8-4f47-a05a-1c364f39a599-kube-api-access-lrvrj\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f\" (UID: \"21722f32-f2c8-4f47-a05a-1c364f39a599\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f"
Apr 16 18:37:01.481302 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.481304 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/21722f32-f2c8-4f47-a05a-1c364f39a599-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f\" (UID: \"21722f32-f2c8-4f47-a05a-1c364f39a599\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f"
Apr 16 18:37:01.481498 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.481352 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/21722f32-f2c8-4f47-a05a-1c364f39a599-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f\" (UID: \"21722f32-f2c8-4f47-a05a-1c364f39a599\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f"
Apr 16 18:37:01.481693 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.481679 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/21722f32-f2c8-4f47-a05a-1c364f39a599-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f\" (UID: \"21722f32-f2c8-4f47-a05a-1c364f39a599\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f"
Apr 16 18:37:01.481728 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.481710 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/21722f32-f2c8-4f47-a05a-1c364f39a599-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f\" (UID: \"21722f32-f2c8-4f47-a05a-1c364f39a599\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f"
Apr 16 18:37:01.490255 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.490227 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrvrj\" (UniqueName: \"kubernetes.io/projected/21722f32-f2c8-4f47-a05a-1c364f39a599-kube-api-access-lrvrj\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f\" (UID: \"21722f32-f2c8-4f47-a05a-1c364f39a599\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f"
Apr 16 18:37:01.674446 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.674356 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f"
Apr 16 18:37:01.793231 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:01.793207 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f"]
Apr 16 18:37:01.795708 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:37:01.795681 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21722f32_f2c8_4f47_a05a_1c364f39a599.slice/crio-55dbe36f1e7e6fa328d8582b212f93cee4d6ddd2ebfb7cd602ab47ca36b4e3f4 WatchSource:0}: Error finding container 55dbe36f1e7e6fa328d8582b212f93cee4d6ddd2ebfb7cd602ab47ca36b4e3f4: Status 404 returned error can't find the container with id 55dbe36f1e7e6fa328d8582b212f93cee4d6ddd2ebfb7cd602ab47ca36b4e3f4
Apr 16 18:37:02.018776 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:02.018742 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f" event={"ID":"21722f32-f2c8-4f47-a05a-1c364f39a599","Type":"ContainerStarted","Data":"55dbe36f1e7e6fa328d8582b212f93cee4d6ddd2ebfb7cd602ab47ca36b4e3f4"}
Apr 16 18:37:07.035094 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:07.035058 2578 generic.go:358] "Generic (PLEG): container finished" podID="21722f32-f2c8-4f47-a05a-1c364f39a599" containerID="da95de7e5fe660bbf2feb2800b2711d24d3f3de4dcab1dfda016bbc692672282" exitCode=0
Apr 16 18:37:07.035532 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:07.035146 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f" event={"ID":"21722f32-f2c8-4f47-a05a-1c364f39a599","Type":"ContainerDied","Data":"da95de7e5fe660bbf2feb2800b2711d24d3f3de4dcab1dfda016bbc692672282"}
Apr 16 18:37:10.051250 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:10.051218 2578 generic.go:358] "Generic (PLEG): container finished" podID="21722f32-f2c8-4f47-a05a-1c364f39a599" containerID="fac56e077793d8aacbd722d2cf8dbeac49e0427e4cb4cb4c498d76830da5a723" exitCode=0
Apr 16 18:37:10.051644 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:10.051283 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f" event={"ID":"21722f32-f2c8-4f47-a05a-1c364f39a599","Type":"ContainerDied","Data":"fac56e077793d8aacbd722d2cf8dbeac49e0427e4cb4cb4c498d76830da5a723"}
Apr 16 18:37:16.071861 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:16.071824 2578 generic.go:358] "Generic (PLEG): container finished" podID="21722f32-f2c8-4f47-a05a-1c364f39a599" containerID="ca263db872a89a722c821592ccbd1270d8bd200d3a5b750ab255dd52e2787984" exitCode=0
Apr 16 18:37:16.072244 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:16.071907 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f" event={"ID":"21722f32-f2c8-4f47-a05a-1c364f39a599","Type":"ContainerDied","Data":"ca263db872a89a722c821592ccbd1270d8bd200d3a5b750ab255dd52e2787984"}
Apr 16 18:37:17.193696 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:17.193667 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f"
Apr 16 18:37:17.327274 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:17.327165 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrvrj\" (UniqueName: \"kubernetes.io/projected/21722f32-f2c8-4f47-a05a-1c364f39a599-kube-api-access-lrvrj\") pod \"21722f32-f2c8-4f47-a05a-1c364f39a599\" (UID: \"21722f32-f2c8-4f47-a05a-1c364f39a599\") "
Apr 16 18:37:17.327274 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:17.327244 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/21722f32-f2c8-4f47-a05a-1c364f39a599-util\") pod \"21722f32-f2c8-4f47-a05a-1c364f39a599\" (UID: \"21722f32-f2c8-4f47-a05a-1c364f39a599\") "
Apr 16 18:37:17.327500 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:17.327285 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/21722f32-f2c8-4f47-a05a-1c364f39a599-bundle\") pod \"21722f32-f2c8-4f47-a05a-1c364f39a599\" (UID: \"21722f32-f2c8-4f47-a05a-1c364f39a599\") "
Apr 16 18:37:17.327869 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:17.327839 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21722f32-f2c8-4f47-a05a-1c364f39a599-bundle" (OuterVolumeSpecName: "bundle") pod "21722f32-f2c8-4f47-a05a-1c364f39a599" (UID: "21722f32-f2c8-4f47-a05a-1c364f39a599"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:37:17.329371 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:17.329339 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21722f32-f2c8-4f47-a05a-1c364f39a599-kube-api-access-lrvrj" (OuterVolumeSpecName: "kube-api-access-lrvrj") pod "21722f32-f2c8-4f47-a05a-1c364f39a599" (UID: "21722f32-f2c8-4f47-a05a-1c364f39a599"). InnerVolumeSpecName "kube-api-access-lrvrj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:37:17.331186 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:17.331158 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21722f32-f2c8-4f47-a05a-1c364f39a599-util" (OuterVolumeSpecName: "util") pod "21722f32-f2c8-4f47-a05a-1c364f39a599" (UID: "21722f32-f2c8-4f47-a05a-1c364f39a599"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:37:17.427728 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:17.427687 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/21722f32-f2c8-4f47-a05a-1c364f39a599-bundle\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 18:37:17.427728 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:17.427726 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lrvrj\" (UniqueName: \"kubernetes.io/projected/21722f32-f2c8-4f47-a05a-1c364f39a599-kube-api-access-lrvrj\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 18:37:17.427728 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:17.427736 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/21722f32-f2c8-4f47-a05a-1c364f39a599-util\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 18:37:18.079589 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:18.079557 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f" event={"ID":"21722f32-f2c8-4f47-a05a-1c364f39a599","Type":"ContainerDied","Data":"55dbe36f1e7e6fa328d8582b212f93cee4d6ddd2ebfb7cd602ab47ca36b4e3f4"}
Apr 16 18:37:18.079589 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:18.079590 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55dbe36f1e7e6fa328d8582b212f93cee4d6ddd2ebfb7cd602ab47ca36b4e3f4"
Apr 16 18:37:18.079804 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:18.079588 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl7d5f"
Apr 16 18:37:22.953653 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:22.953618 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm"]
Apr 16 18:37:22.954137 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:22.954032 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21722f32-f2c8-4f47-a05a-1c364f39a599" containerName="extract"
Apr 16 18:37:22.954137 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:22.954050 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="21722f32-f2c8-4f47-a05a-1c364f39a599" containerName="extract"
Apr 16 18:37:22.954137 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:22.954070 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21722f32-f2c8-4f47-a05a-1c364f39a599" containerName="util"
Apr 16 18:37:22.954137 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:22.954079 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="21722f32-f2c8-4f47-a05a-1c364f39a599" containerName="util"
Apr 16 18:37:22.954137 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:22.954097 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21722f32-f2c8-4f47-a05a-1c364f39a599" containerName="pull"
Apr 16 18:37:22.954137 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:22.954107 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="21722f32-f2c8-4f47-a05a-1c364f39a599" containerName="pull"
Apr 16 18:37:22.954466 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:22.954199 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="21722f32-f2c8-4f47-a05a-1c364f39a599" containerName="extract"
Apr 16 18:37:22.958325 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:22.958305 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm"
Apr 16 18:37:22.961996 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:22.961969 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 16 18:37:22.962267 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:22.962247 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 16 18:37:22.962490 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:22.962473 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 16 18:37:22.962690 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:22.962673 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-p4wgn\""
Apr 16 18:37:22.969147 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:22.968840 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm"]
Apr 16 18:37:23.071782 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:23.071747 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s4td\" (UniqueName: \"kubernetes.io/projected/b00f00de-dbb4-4336-948e-9239b7b1a22b-kube-api-access-4s4td\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm\" (UID: \"b00f00de-dbb4-4336-948e-9239b7b1a22b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm"
Apr 16 18:37:23.071782 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:23.071786 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/b00f00de-dbb4-4336-948e-9239b7b1a22b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm\" (UID: \"b00f00de-dbb4-4336-948e-9239b7b1a22b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm"
Apr 16 18:37:23.172854 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:23.172815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4s4td\" (UniqueName: \"kubernetes.io/projected/b00f00de-dbb4-4336-948e-9239b7b1a22b-kube-api-access-4s4td\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm\" (UID: \"b00f00de-dbb4-4336-948e-9239b7b1a22b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm"
Apr 16 18:37:23.172854 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:23.172858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/b00f00de-dbb4-4336-948e-9239b7b1a22b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm\" (UID: \"b00f00de-dbb4-4336-948e-9239b7b1a22b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm"
Apr 16 18:37:23.175343 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:23.175322 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/b00f00de-dbb4-4336-948e-9239b7b1a22b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm\" (UID: \"b00f00de-dbb4-4336-948e-9239b7b1a22b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm"
Apr 16 18:37:23.180972 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:23.180945 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s4td\" (UniqueName: \"kubernetes.io/projected/b00f00de-dbb4-4336-948e-9239b7b1a22b-kube-api-access-4s4td\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm\" (UID: \"b00f00de-dbb4-4336-948e-9239b7b1a22b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm"
Apr 16 18:37:23.272880 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:23.272833 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm"
Apr 16 18:37:23.397737 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:23.397713 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm"]
Apr 16 18:37:23.400065 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:37:23.400041 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb00f00de_dbb4_4336_948e_9239b7b1a22b.slice/crio-8cb4ea24125d1c43c5aeabe76932e49b3206dc09c226572bdaa8faa2f76d8acf WatchSource:0}: Error finding container 8cb4ea24125d1c43c5aeabe76932e49b3206dc09c226572bdaa8faa2f76d8acf: Status 404 returned error can't find the container with id 8cb4ea24125d1c43c5aeabe76932e49b3206dc09c226572bdaa8faa2f76d8acf
Apr 16 18:37:24.098285 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:24.098206 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm" event={"ID":"b00f00de-dbb4-4336-948e-9239b7b1a22b","Type":"ContainerStarted","Data":"8cb4ea24125d1c43c5aeabe76932e49b3206dc09c226572bdaa8faa2f76d8acf"}
Apr 16 18:37:27.110855 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.110764 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm" event={"ID":"b00f00de-dbb4-4336-948e-9239b7b1a22b","Type":"ContainerStarted","Data":"ef49334c60d926de71922d22496c51639d210826c0534416b1b3ca4f17820a4c"}
Apr 16 18:37:27.111326 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.110913 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm"
Apr 16 18:37:27.131476 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.131432 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm" podStartSLOduration=1.713335866 podStartE2EDuration="5.131418157s" podCreationTimestamp="2026-04-16 18:37:22 +0000 UTC" firstStartedPulling="2026-04-16 18:37:23.402226398 +0000 UTC m=+420.189562642" lastFinishedPulling="2026-04-16 18:37:26.820308501 +0000 UTC m=+423.607644933" observedRunningTime="2026-04-16 18:37:27.130526097 +0000 UTC m=+423.917862354" watchObservedRunningTime="2026-04-16 18:37:27.131418157 +0000 UTC m=+423.918754412"
Apr 16 18:37:27.331997 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.331928 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-b286x"]
Apr 16 18:37:27.335329 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.335312 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-b286x"
Apr 16 18:37:27.338131 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.338111 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 16 18:37:27.338255 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.338113 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 16 18:37:27.338255 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.338202 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-jsjjw\""
Apr 16 18:37:27.344145 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.344125 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-b286x"]
Apr 16 18:37:27.410789 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.410689 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-certificates\") pod \"keda-operator-ffbb595cb-b286x\" (UID: \"2e7138bc-61a4-4680-a0b8-53a4984d5552\") " pod="openshift-keda/keda-operator-ffbb595cb-b286x"
Apr 16 18:37:27.410789 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.410728 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/2e7138bc-61a4-4680-a0b8-53a4984d5552-cabundle0\") pod \"keda-operator-ffbb595cb-b286x\" (UID: \"2e7138bc-61a4-4680-a0b8-53a4984d5552\") " pod="openshift-keda/keda-operator-ffbb595cb-b286x"
Apr 16 18:37:27.410980 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.410845 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4dmh\" (UniqueName: \"kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-kube-api-access-f4dmh\") pod \"keda-operator-ffbb595cb-b286x\" (UID: \"2e7138bc-61a4-4680-a0b8-53a4984d5552\") " pod="openshift-keda/keda-operator-ffbb595cb-b286x"
Apr 16 18:37:27.511485 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.511451 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-certificates\") pod \"keda-operator-ffbb595cb-b286x\" (UID: \"2e7138bc-61a4-4680-a0b8-53a4984d5552\") " pod="openshift-keda/keda-operator-ffbb595cb-b286x"
Apr 16 18:37:27.511485 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.511485 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/2e7138bc-61a4-4680-a0b8-53a4984d5552-cabundle0\") pod \"keda-operator-ffbb595cb-b286x\" (UID: \"2e7138bc-61a4-4680-a0b8-53a4984d5552\") " pod="openshift-keda/keda-operator-ffbb595cb-b286x"
Apr 16 18:37:27.511672 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.511561 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4dmh\" (UniqueName: \"kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-kube-api-access-f4dmh\") pod \"keda-operator-ffbb595cb-b286x\" (UID: \"2e7138bc-61a4-4680-a0b8-53a4984d5552\") " pod="openshift-keda/keda-operator-ffbb595cb-b286x"
Apr 16 18:37:27.511672 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:27.511600 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 16 18:37:27.511672 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:27.511616 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 18:37:27.511672 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:27.511627 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-b286x: references non-existent secret key: ca.crt
Apr 16 18:37:27.511839 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:27.511683 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-certificates podName:2e7138bc-61a4-4680-a0b8-53a4984d5552 nodeName:}" failed. No retries permitted until 2026-04-16 18:37:28.01166369 +0000 UTC m=+424.798999935 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-certificates") pod "keda-operator-ffbb595cb-b286x" (UID: "2e7138bc-61a4-4680-a0b8-53a4984d5552") : references non-existent secret key: ca.crt
Apr 16 18:37:27.512202 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.512160 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/2e7138bc-61a4-4680-a0b8-53a4984d5552-cabundle0\") pod \"keda-operator-ffbb595cb-b286x\" (UID: \"2e7138bc-61a4-4680-a0b8-53a4984d5552\") " pod="openshift-keda/keda-operator-ffbb595cb-b286x"
Apr 16 18:37:27.520320 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.520301 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4dmh\" (UniqueName: \"kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-kube-api-access-f4dmh\") pod \"keda-operator-ffbb595cb-b286x\" (UID: \"2e7138bc-61a4-4680-a0b8-53a4984d5552\") " pod="openshift-keda/keda-operator-ffbb595cb-b286x"
Apr 16 18:37:27.750154 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.750120 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n"]
Apr 16 18:37:27.753372 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.753336 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n"
Apr 16 18:37:27.755731 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.755710 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 16 18:37:27.763995 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.763962 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n"]
Apr 16 18:37:27.915328 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.915300 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-dgc5n\" (UID: \"0b33f3c2-01eb-4623-8dce-3ea660442ae7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n"
Apr 16 18:37:27.915472 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.915351 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfvgv\" (UniqueName: \"kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-kube-api-access-jfvgv\") pod \"keda-metrics-apiserver-7c9f485588-dgc5n\" (UID: \"0b33f3c2-01eb-4623-8dce-3ea660442ae7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n"
Apr 16 18:37:27.915472 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:27.915437 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0b33f3c2-01eb-4623-8dce-3ea660442ae7-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-dgc5n\" (UID: \"0b33f3c2-01eb-4623-8dce-3ea660442ae7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n"
Apr 16 18:37:28.016697 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.016611 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfvgv\" (UniqueName: \"kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-kube-api-access-jfvgv\") pod \"keda-metrics-apiserver-7c9f485588-dgc5n\" (UID: \"0b33f3c2-01eb-4623-8dce-3ea660442ae7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n"
Apr 16 18:37:28.016697 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.016668 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-certificates\") pod \"keda-operator-ffbb595cb-b286x\" (UID: \"2e7138bc-61a4-4680-a0b8-53a4984d5552\") " pod="openshift-keda/keda-operator-ffbb595cb-b286x"
Apr 16 18:37:28.016906 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.016713 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0b33f3c2-01eb-4623-8dce-3ea660442ae7-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-dgc5n\" (UID: \"0b33f3c2-01eb-4623-8dce-3ea660442ae7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n"
Apr 16 18:37:28.016906 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.016762 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-dgc5n\" (UID: \"0b33f3c2-01eb-4623-8dce-3ea660442ae7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n"
Apr 16 18:37:28.016906 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:28.016850 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 16 18:37:28.016906 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:28.016874 2578 secret.go:281] references non-existent secret key: tls.crt
Apr 16 18:37:28.016906 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:28.016895 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 18:37:28.017144 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:28.016916 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n: references non-existent secret key: tls.crt
Apr 16 18:37:28.017144 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:28.016873 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 18:37:28.017144 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:28.016980 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-b286x: references non-existent secret key: ca.crt
Apr 16 18:37:28.017144 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:28.016968 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-certificates podName:0b33f3c2-01eb-4623-8dce-3ea660442ae7 nodeName:}" failed. No retries permitted until 2026-04-16 18:37:28.516950709 +0000 UTC m=+425.304286953 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-certificates") pod "keda-metrics-apiserver-7c9f485588-dgc5n" (UID: "0b33f3c2-01eb-4623-8dce-3ea660442ae7") : references non-existent secret key: tls.crt
Apr 16 18:37:28.017144 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:28.017076 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-certificates podName:2e7138bc-61a4-4680-a0b8-53a4984d5552 nodeName:}" failed. No retries permitted until 2026-04-16 18:37:29.017060363 +0000 UTC m=+425.804396600 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-certificates") pod "keda-operator-ffbb595cb-b286x" (UID: "2e7138bc-61a4-4680-a0b8-53a4984d5552") : references non-existent secret key: ca.crt
Apr 16 18:37:28.017429 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.017158 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0b33f3c2-01eb-4623-8dce-3ea660442ae7-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-dgc5n\" (UID: \"0b33f3c2-01eb-4623-8dce-3ea660442ae7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n"
Apr 16 18:37:28.018845 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.018824 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-p6pbb"]
Apr 16 18:37:28.022102 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.022085 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-p6pbb"
Apr 16 18:37:28.024786 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.024768 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 16 18:37:28.030493 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.030473 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfvgv\" (UniqueName: \"kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-kube-api-access-jfvgv\") pod \"keda-metrics-apiserver-7c9f485588-dgc5n\" (UID: \"0b33f3c2-01eb-4623-8dce-3ea660442ae7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n"
Apr 16 18:37:28.030831 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.030812 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-p6pbb"]
Apr 16 18:37:28.117596 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.117558 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9611990e-3bc1-4d7e-9e2f-602bd8844ae9-certificates\") pod \"keda-admission-cf49989db-p6pbb\" (UID: \"9611990e-3bc1-4d7e-9e2f-602bd8844ae9\") " pod="openshift-keda/keda-admission-cf49989db-p6pbb"
Apr 16 18:37:28.118015 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.117630 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk6jk\" (UniqueName: \"kubernetes.io/projected/9611990e-3bc1-4d7e-9e2f-602bd8844ae9-kube-api-access-mk6jk\") pod \"keda-admission-cf49989db-p6pbb\" (UID: \"9611990e-3bc1-4d7e-9e2f-602bd8844ae9\") " pod="openshift-keda/keda-admission-cf49989db-p6pbb"
Apr 16 18:37:28.218384 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.218353 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9611990e-3bc1-4d7e-9e2f-602bd8844ae9-certificates\") pod \"keda-admission-cf49989db-p6pbb\" (UID: \"9611990e-3bc1-4d7e-9e2f-602bd8844ae9\") " pod="openshift-keda/keda-admission-cf49989db-p6pbb"
Apr 16 18:37:28.218574 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.218404 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mk6jk\" (UniqueName: \"kubernetes.io/projected/9611990e-3bc1-4d7e-9e2f-602bd8844ae9-kube-api-access-mk6jk\") pod \"keda-admission-cf49989db-p6pbb\" (UID: \"9611990e-3bc1-4d7e-9e2f-602bd8844ae9\") " pod="openshift-keda/keda-admission-cf49989db-p6pbb"
Apr 16 18:37:28.220855 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.220831 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9611990e-3bc1-4d7e-9e2f-602bd8844ae9-certificates\") pod \"keda-admission-cf49989db-p6pbb\" (UID: \"9611990e-3bc1-4d7e-9e2f-602bd8844ae9\") " pod="openshift-keda/keda-admission-cf49989db-p6pbb"
Apr 16 18:37:28.229900 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.229880 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk6jk\" (UniqueName: \"kubernetes.io/projected/9611990e-3bc1-4d7e-9e2f-602bd8844ae9-kube-api-access-mk6jk\") pod \"keda-admission-cf49989db-p6pbb\" (UID: \"9611990e-3bc1-4d7e-9e2f-602bd8844ae9\") " pod="openshift-keda/keda-admission-cf49989db-p6pbb"
Apr 16 18:37:28.339894 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.339807 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-p6pbb" Apr 16 18:37:28.474203 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.474150 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-p6pbb"] Apr 16 18:37:28.477264 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:37:28.477234 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9611990e_3bc1_4d7e_9e2f_602bd8844ae9.slice/crio-442bc13cb3df21318a7d0c8e43544b7c135ec2d85083c2027f6ec860fccd1173 WatchSource:0}: Error finding container 442bc13cb3df21318a7d0c8e43544b7c135ec2d85083c2027f6ec860fccd1173: Status 404 returned error can't find the container with id 442bc13cb3df21318a7d0c8e43544b7c135ec2d85083c2027f6ec860fccd1173 Apr 16 18:37:28.521220 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:28.521170 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-dgc5n\" (UID: \"0b33f3c2-01eb-4623-8dce-3ea660442ae7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n" Apr 16 18:37:28.521330 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:28.521264 2578 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:37:28.521330 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:28.521283 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:37:28.521330 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:28.521305 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n: references non-existent secret key: tls.crt Apr 16 18:37:28.521448 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:28.521379 2578 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-certificates podName:0b33f3c2-01eb-4623-8dce-3ea660442ae7 nodeName:}" failed. No retries permitted until 2026-04-16 18:37:29.521359724 +0000 UTC m=+426.308695971 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-certificates") pod "keda-metrics-apiserver-7c9f485588-dgc5n" (UID: "0b33f3c2-01eb-4623-8dce-3ea660442ae7") : references non-existent secret key: tls.crt Apr 16 18:37:29.025993 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:29.025962 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-certificates\") pod \"keda-operator-ffbb595cb-b286x\" (UID: \"2e7138bc-61a4-4680-a0b8-53a4984d5552\") " pod="openshift-keda/keda-operator-ffbb595cb-b286x" Apr 16 18:37:29.026150 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:29.026100 2578 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:37:29.026150 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:29.026116 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:37:29.026150 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:29.026124 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-b286x: references non-existent secret key: ca.crt Apr 16 18:37:29.026271 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:29.026194 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-certificates podName:2e7138bc-61a4-4680-a0b8-53a4984d5552 nodeName:}" failed. No retries permitted until 2026-04-16 18:37:31.026158841 +0000 UTC m=+427.813495075 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-certificates") pod "keda-operator-ffbb595cb-b286x" (UID: "2e7138bc-61a4-4680-a0b8-53a4984d5552") : references non-existent secret key: ca.crt Apr 16 18:37:29.120123 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:29.120084 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-p6pbb" event={"ID":"9611990e-3bc1-4d7e-9e2f-602bd8844ae9","Type":"ContainerStarted","Data":"442bc13cb3df21318a7d0c8e43544b7c135ec2d85083c2027f6ec860fccd1173"} Apr 16 18:37:29.530740 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:29.530707 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-dgc5n\" (UID: \"0b33f3c2-01eb-4623-8dce-3ea660442ae7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n" Apr 16 18:37:29.530886 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:29.530851 2578 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:37:29.530886 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:29.530868 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:37:29.530886 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:29.530886 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n: references non-existent secret key: tls.crt Apr 16 18:37:29.530983 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:29.530937 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-certificates podName:0b33f3c2-01eb-4623-8dce-3ea660442ae7 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:37:31.530923725 +0000 UTC m=+428.318259963 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-certificates") pod "keda-metrics-apiserver-7c9f485588-dgc5n" (UID: "0b33f3c2-01eb-4623-8dce-3ea660442ae7") : references non-existent secret key: tls.crt Apr 16 18:37:31.044806 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:31.044753 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-certificates\") pod \"keda-operator-ffbb595cb-b286x\" (UID: \"2e7138bc-61a4-4680-a0b8-53a4984d5552\") " pod="openshift-keda/keda-operator-ffbb595cb-b286x" Apr 16 18:37:31.045220 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:31.044911 2578 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:37:31.045220 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:31.044928 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:37:31.045220 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:31.044937 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-b286x: references non-existent secret key: ca.crt Apr 16 18:37:31.045220 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:31.044990 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-certificates podName:2e7138bc-61a4-4680-a0b8-53a4984d5552 nodeName:}" failed. No retries permitted until 2026-04-16 18:37:35.044974711 +0000 UTC m=+431.832310945 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-certificates") pod "keda-operator-ffbb595cb-b286x" (UID: "2e7138bc-61a4-4680-a0b8-53a4984d5552") : references non-existent secret key: ca.crt Apr 16 18:37:31.128314 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:31.128280 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-p6pbb" event={"ID":"9611990e-3bc1-4d7e-9e2f-602bd8844ae9","Type":"ContainerStarted","Data":"46e01fa7d2eddba886d4e06e264e6cfcf5e9a6ee3fa068c8d8b9489b55031179"} Apr 16 18:37:31.128471 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:31.128392 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-p6pbb" Apr 16 18:37:31.145281 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:31.145239 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-p6pbb" podStartSLOduration=1.5475596820000002 podStartE2EDuration="3.145227834s" podCreationTimestamp="2026-04-16 18:37:28 +0000 UTC" firstStartedPulling="2026-04-16 18:37:28.478551028 +0000 UTC m=+425.265887262" lastFinishedPulling="2026-04-16 18:37:30.076219171 +0000 UTC m=+426.863555414" observedRunningTime="2026-04-16 18:37:31.144120236 +0000 UTC m=+427.931456493" watchObservedRunningTime="2026-04-16 18:37:31.145227834 +0000 UTC m=+427.932564089" Apr 16 18:37:31.548926 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:31.548886 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-dgc5n\" (UID: \"0b33f3c2-01eb-4623-8dce-3ea660442ae7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n" Apr 16 18:37:31.549096 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:31.549040 2578 
secret.go:281] references non-existent secret key: tls.crt Apr 16 18:37:31.549096 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:31.549059 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:37:31.549096 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:31.549078 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n: references non-existent secret key: tls.crt Apr 16 18:37:31.549212 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:37:31.549131 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-certificates podName:0b33f3c2-01eb-4623-8dce-3ea660442ae7 nodeName:}" failed. No retries permitted until 2026-04-16 18:37:35.549116508 +0000 UTC m=+432.336452746 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-certificates") pod "keda-metrics-apiserver-7c9f485588-dgc5n" (UID: "0b33f3c2-01eb-4623-8dce-3ea660442ae7") : references non-existent secret key: tls.crt Apr 16 18:37:35.079484 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:35.079440 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-certificates\") pod \"keda-operator-ffbb595cb-b286x\" (UID: \"2e7138bc-61a4-4680-a0b8-53a4984d5552\") " pod="openshift-keda/keda-operator-ffbb595cb-b286x" Apr 16 18:37:35.081899 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:35.081869 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2e7138bc-61a4-4680-a0b8-53a4984d5552-certificates\") pod \"keda-operator-ffbb595cb-b286x\" (UID: \"2e7138bc-61a4-4680-a0b8-53a4984d5552\") " 
pod="openshift-keda/keda-operator-ffbb595cb-b286x" Apr 16 18:37:35.146660 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:35.146622 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-b286x" Apr 16 18:37:35.266265 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:35.266239 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-b286x"] Apr 16 18:37:35.268371 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:37:35.268346 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e7138bc_61a4_4680_a0b8_53a4984d5552.slice/crio-7f3de4a90b820cad8b114348c82485a6799dcf5b19b39065ea4adba6eee6837c WatchSource:0}: Error finding container 7f3de4a90b820cad8b114348c82485a6799dcf5b19b39065ea4adba6eee6837c: Status 404 returned error can't find the container with id 7f3de4a90b820cad8b114348c82485a6799dcf5b19b39065ea4adba6eee6837c Apr 16 18:37:35.583603 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:35.583565 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-dgc5n\" (UID: \"0b33f3c2-01eb-4623-8dce-3ea660442ae7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n" Apr 16 18:37:35.586260 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:35.586236 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0b33f3c2-01eb-4623-8dce-3ea660442ae7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-dgc5n\" (UID: \"0b33f3c2-01eb-4623-8dce-3ea660442ae7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n" Apr 16 18:37:35.870505 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:35.870414 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n" Apr 16 18:37:36.007498 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:36.007470 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n"] Apr 16 18:37:36.009447 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:37:36.009412 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b33f3c2_01eb_4623_8dce_3ea660442ae7.slice/crio-742f352a12953e2ce648b27ab231fb4903b514ecd522e9f74ae8e977e8a8cace WatchSource:0}: Error finding container 742f352a12953e2ce648b27ab231fb4903b514ecd522e9f74ae8e977e8a8cace: Status 404 returned error can't find the container with id 742f352a12953e2ce648b27ab231fb4903b514ecd522e9f74ae8e977e8a8cace Apr 16 18:37:36.144990 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:36.144899 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n" event={"ID":"0b33f3c2-01eb-4623-8dce-3ea660442ae7","Type":"ContainerStarted","Data":"742f352a12953e2ce648b27ab231fb4903b514ecd522e9f74ae8e977e8a8cace"} Apr 16 18:37:36.146358 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:36.146323 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-b286x" event={"ID":"2e7138bc-61a4-4680-a0b8-53a4984d5552","Type":"ContainerStarted","Data":"7f3de4a90b820cad8b114348c82485a6799dcf5b19b39065ea4adba6eee6837c"} Apr 16 18:37:40.159911 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:40.159873 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n" event={"ID":"0b33f3c2-01eb-4623-8dce-3ea660442ae7","Type":"ContainerStarted","Data":"d445de25ae17ba6a408f98d470eca452984d4146da48398e1447199073b3965c"} Apr 16 18:37:40.160429 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:40.159982 2578 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n" Apr 16 18:37:40.161352 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:40.161324 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-b286x" event={"ID":"2e7138bc-61a4-4680-a0b8-53a4984d5552","Type":"ContainerStarted","Data":"941f83adc56e04b30ef838cf234409b956bcb3f2a593480212a2059df9e690ed"} Apr 16 18:37:40.161488 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:40.161472 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-b286x" Apr 16 18:37:40.177888 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:40.177834 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n" podStartSLOduration=9.500963897 podStartE2EDuration="13.177819632s" podCreationTimestamp="2026-04-16 18:37:27 +0000 UTC" firstStartedPulling="2026-04-16 18:37:36.011057947 +0000 UTC m=+432.798394197" lastFinishedPulling="2026-04-16 18:37:39.687913699 +0000 UTC m=+436.475249932" observedRunningTime="2026-04-16 18:37:40.175422516 +0000 UTC m=+436.962758803" watchObservedRunningTime="2026-04-16 18:37:40.177819632 +0000 UTC m=+436.965155887" Apr 16 18:37:40.194459 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:40.194399 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-b286x" podStartSLOduration=8.782256342 podStartE2EDuration="13.194381976s" podCreationTimestamp="2026-04-16 18:37:27 +0000 UTC" firstStartedPulling="2026-04-16 18:37:35.269736646 +0000 UTC m=+432.057072888" lastFinishedPulling="2026-04-16 18:37:39.681862272 +0000 UTC m=+436.469198522" observedRunningTime="2026-04-16 18:37:40.192615449 +0000 UTC m=+436.979951701" watchObservedRunningTime="2026-04-16 18:37:40.194381976 +0000 UTC m=+436.981718233" Apr 16 18:37:48.116815 ip-10-0-139-33 
kubenswrapper[2578]: I0416 18:37:48.116783 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4k6wm" Apr 16 18:37:51.168872 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:51.168844 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dgc5n" Apr 16 18:37:52.133375 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:37:52.133342 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-p6pbb" Apr 16 18:38:01.166638 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:01.166608 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-b286x" Apr 16 18:38:34.217469 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.217439 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-rdxs2"] Apr 16 18:38:34.226317 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.226293 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-xbnbv"] Apr 16 18:38:34.226495 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.226472 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2" Apr 16 18:38:34.229491 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.229465 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:38:34.229802 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.229785 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-rdxs2"] Apr 16 18:38:34.229909 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.229896 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xbnbv" Apr 16 18:38:34.230688 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.230668 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-gllzf\"" Apr 16 18:38:34.231086 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.231070 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:38:34.231248 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.231124 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 18:38:34.232044 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.232022 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-gtkd2\"" Apr 16 18:38:34.232143 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.232110 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 18:38:34.232834 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.232812 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-xbnbv"] Apr 16 18:38:34.354419 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.354384 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0549eba1-0b20-41bd-a752-bd101775a0b9-cert\") pod \"kserve-controller-manager-7c68cb4fc8-rdxs2\" (UID: \"0549eba1-0b20-41bd-a752-bd101775a0b9\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2" Apr 16 18:38:34.354585 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.354431 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/e204c87e-ef63-4ef3-9fb1-2fd0e2775752-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-xbnbv\" (UID: \"e204c87e-ef63-4ef3-9fb1-2fd0e2775752\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xbnbv" Apr 16 18:38:34.354585 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.354481 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd4hr\" (UniqueName: \"kubernetes.io/projected/0549eba1-0b20-41bd-a752-bd101775a0b9-kube-api-access-cd4hr\") pod \"kserve-controller-manager-7c68cb4fc8-rdxs2\" (UID: \"0549eba1-0b20-41bd-a752-bd101775a0b9\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2" Apr 16 18:38:34.354682 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.354586 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljvxb\" (UniqueName: \"kubernetes.io/projected/e204c87e-ef63-4ef3-9fb1-2fd0e2775752-kube-api-access-ljvxb\") pod \"llmisvc-controller-manager-68cc5db7c4-xbnbv\" (UID: \"e204c87e-ef63-4ef3-9fb1-2fd0e2775752\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xbnbv" Apr 16 18:38:34.455105 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.455077 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0549eba1-0b20-41bd-a752-bd101775a0b9-cert\") pod \"kserve-controller-manager-7c68cb4fc8-rdxs2\" (UID: \"0549eba1-0b20-41bd-a752-bd101775a0b9\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2" Apr 16 18:38:34.455279 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.455112 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e204c87e-ef63-4ef3-9fb1-2fd0e2775752-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-xbnbv\" (UID: \"e204c87e-ef63-4ef3-9fb1-2fd0e2775752\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xbnbv" Apr 16 
18:38:34.455279 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.455132 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cd4hr\" (UniqueName: \"kubernetes.io/projected/0549eba1-0b20-41bd-a752-bd101775a0b9-kube-api-access-cd4hr\") pod \"kserve-controller-manager-7c68cb4fc8-rdxs2\" (UID: \"0549eba1-0b20-41bd-a752-bd101775a0b9\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2" Apr 16 18:38:34.455279 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.455160 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljvxb\" (UniqueName: \"kubernetes.io/projected/e204c87e-ef63-4ef3-9fb1-2fd0e2775752-kube-api-access-ljvxb\") pod \"llmisvc-controller-manager-68cc5db7c4-xbnbv\" (UID: \"e204c87e-ef63-4ef3-9fb1-2fd0e2775752\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xbnbv" Apr 16 18:38:34.457569 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.457539 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0549eba1-0b20-41bd-a752-bd101775a0b9-cert\") pod \"kserve-controller-manager-7c68cb4fc8-rdxs2\" (UID: \"0549eba1-0b20-41bd-a752-bd101775a0b9\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2" Apr 16 18:38:34.457662 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.457598 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e204c87e-ef63-4ef3-9fb1-2fd0e2775752-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-xbnbv\" (UID: \"e204c87e-ef63-4ef3-9fb1-2fd0e2775752\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xbnbv" Apr 16 18:38:34.465124 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.465103 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljvxb\" (UniqueName: \"kubernetes.io/projected/e204c87e-ef63-4ef3-9fb1-2fd0e2775752-kube-api-access-ljvxb\") pod 
\"llmisvc-controller-manager-68cc5db7c4-xbnbv\" (UID: \"e204c87e-ef63-4ef3-9fb1-2fd0e2775752\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-xbnbv"
Apr 16 18:38:34.465242 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.465164 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd4hr\" (UniqueName: \"kubernetes.io/projected/0549eba1-0b20-41bd-a752-bd101775a0b9-kube-api-access-cd4hr\") pod \"kserve-controller-manager-7c68cb4fc8-rdxs2\" (UID: \"0549eba1-0b20-41bd-a752-bd101775a0b9\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2"
Apr 16 18:38:34.543105 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.543047 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2"
Apr 16 18:38:34.550002 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.549980 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xbnbv"
Apr 16 18:38:34.674405 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.673967 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-rdxs2"]
Apr 16 18:38:34.676688 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:38:34.676656 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0549eba1_0b20_41bd_a752_bd101775a0b9.slice/crio-d8ce13ad0f6b605c05f082bbbf3a48a760b3fedce02a30c7fbf629a768f52335 WatchSource:0}: Error finding container d8ce13ad0f6b605c05f082bbbf3a48a760b3fedce02a30c7fbf629a768f52335: Status 404 returned error can't find the container with id d8ce13ad0f6b605c05f082bbbf3a48a760b3fedce02a30c7fbf629a768f52335
Apr 16 18:38:34.691255 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:34.691201 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-xbnbv"]
Apr 16 18:38:34.693123 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:38:34.693100 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode204c87e_ef63_4ef3_9fb1_2fd0e2775752.slice/crio-a4cb2aba38d28c19ff096860d18546debb5da9390498b6123cad38be2ca67684 WatchSource:0}: Error finding container a4cb2aba38d28c19ff096860d18546debb5da9390498b6123cad38be2ca67684: Status 404 returned error can't find the container with id a4cb2aba38d28c19ff096860d18546debb5da9390498b6123cad38be2ca67684
Apr 16 18:38:35.340465 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:35.339989 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xbnbv" event={"ID":"e204c87e-ef63-4ef3-9fb1-2fd0e2775752","Type":"ContainerStarted","Data":"a4cb2aba38d28c19ff096860d18546debb5da9390498b6123cad38be2ca67684"}
Apr 16 18:38:35.342012 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:35.341981 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2" event={"ID":"0549eba1-0b20-41bd-a752-bd101775a0b9","Type":"ContainerStarted","Data":"d8ce13ad0f6b605c05f082bbbf3a48a760b3fedce02a30c7fbf629a768f52335"}
Apr 16 18:38:38.353895 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:38.353802 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xbnbv" event={"ID":"e204c87e-ef63-4ef3-9fb1-2fd0e2775752","Type":"ContainerStarted","Data":"9ae7d9c600580c04af46ba885af9551cd1041ede8896343a047454dc2ea7ece9"}
Apr 16 18:38:38.353895 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:38.353880 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xbnbv"
Apr 16 18:38:38.355108 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:38.355085 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2" event={"ID":"0549eba1-0b20-41bd-a752-bd101775a0b9","Type":"ContainerStarted","Data":"a7acdf4c276baa557fdf261ea16d53f4e82d06cbbfa1565ed5b857d7cce9e960"}
Apr 16 18:38:38.355260 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:38.355224 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2"
Apr 16 18:38:38.389075 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:38.389028 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xbnbv" podStartSLOduration=1.013047863 podStartE2EDuration="4.389014897s" podCreationTimestamp="2026-04-16 18:38:34 +0000 UTC" firstStartedPulling="2026-04-16 18:38:34.694647833 +0000 UTC m=+491.481984066" lastFinishedPulling="2026-04-16 18:38:38.07061485 +0000 UTC m=+494.857951100" observedRunningTime="2026-04-16 18:38:38.371556422 +0000 UTC m=+495.158892679" watchObservedRunningTime="2026-04-16 18:38:38.389014897 +0000 UTC m=+495.176351152"
Apr 16 18:38:38.390406 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:38:38.390372 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2" podStartSLOduration=0.991761142 podStartE2EDuration="4.390361819s" podCreationTimestamp="2026-04-16 18:38:34 +0000 UTC" firstStartedPulling="2026-04-16 18:38:34.67791325 +0000 UTC m=+491.465249484" lastFinishedPulling="2026-04-16 18:38:38.076513914 +0000 UTC m=+494.863850161" observedRunningTime="2026-04-16 18:38:38.388370266 +0000 UTC m=+495.175706513" watchObservedRunningTime="2026-04-16 18:38:38.390361819 +0000 UTC m=+495.177698074"
Apr 16 18:39:09.360765 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:09.360733 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-xbnbv"
Apr 16 18:39:09.363651 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:09.363631 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2"
Apr 16 18:39:10.696709 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:10.696632 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-rdxs2"]
Apr 16 18:39:10.697087 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:10.696863 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2" podUID="0549eba1-0b20-41bd-a752-bd101775a0b9" containerName="manager" containerID="cri-o://a7acdf4c276baa557fdf261ea16d53f4e82d06cbbfa1565ed5b857d7cce9e960" gracePeriod=10
Apr 16 18:39:10.719215 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:10.719169 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-ww5qj"]
Apr 16 18:39:10.723836 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:10.723817 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-ww5qj"
Apr 16 18:39:10.731831 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:10.731802 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-ww5qj"]
Apr 16 18:39:10.741601 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:10.741573 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1164058e-92a2-41a8-8700-0fc714f73eac-cert\") pod \"kserve-controller-manager-7c68cb4fc8-ww5qj\" (UID: \"1164058e-92a2-41a8-8700-0fc714f73eac\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-ww5qj"
Apr 16 18:39:10.741710 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:10.741622 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx998\" (UniqueName: \"kubernetes.io/projected/1164058e-92a2-41a8-8700-0fc714f73eac-kube-api-access-cx998\") pod \"kserve-controller-manager-7c68cb4fc8-ww5qj\" (UID: \"1164058e-92a2-41a8-8700-0fc714f73eac\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-ww5qj"
Apr 16 18:39:10.843055 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:10.843019 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1164058e-92a2-41a8-8700-0fc714f73eac-cert\") pod \"kserve-controller-manager-7c68cb4fc8-ww5qj\" (UID: \"1164058e-92a2-41a8-8700-0fc714f73eac\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-ww5qj"
Apr 16 18:39:10.843268 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:10.843083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cx998\" (UniqueName: \"kubernetes.io/projected/1164058e-92a2-41a8-8700-0fc714f73eac-kube-api-access-cx998\") pod \"kserve-controller-manager-7c68cb4fc8-ww5qj\" (UID: \"1164058e-92a2-41a8-8700-0fc714f73eac\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-ww5qj"
Apr 16 18:39:10.845571 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:10.845545 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1164058e-92a2-41a8-8700-0fc714f73eac-cert\") pod \"kserve-controller-manager-7c68cb4fc8-ww5qj\" (UID: \"1164058e-92a2-41a8-8700-0fc714f73eac\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-ww5qj"
Apr 16 18:39:10.851429 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:10.851405 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx998\" (UniqueName: \"kubernetes.io/projected/1164058e-92a2-41a8-8700-0fc714f73eac-kube-api-access-cx998\") pod \"kserve-controller-manager-7c68cb4fc8-ww5qj\" (UID: \"1164058e-92a2-41a8-8700-0fc714f73eac\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-ww5qj"
Apr 16 18:39:10.938008 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:10.937984 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2"
Apr 16 18:39:11.044894 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.044862 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd4hr\" (UniqueName: \"kubernetes.io/projected/0549eba1-0b20-41bd-a752-bd101775a0b9-kube-api-access-cd4hr\") pod \"0549eba1-0b20-41bd-a752-bd101775a0b9\" (UID: \"0549eba1-0b20-41bd-a752-bd101775a0b9\") "
Apr 16 18:39:11.045068 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.044944 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0549eba1-0b20-41bd-a752-bd101775a0b9-cert\") pod \"0549eba1-0b20-41bd-a752-bd101775a0b9\" (UID: \"0549eba1-0b20-41bd-a752-bd101775a0b9\") "
Apr 16 18:39:11.047054 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.047027 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0549eba1-0b20-41bd-a752-bd101775a0b9-kube-api-access-cd4hr" (OuterVolumeSpecName: "kube-api-access-cd4hr") pod "0549eba1-0b20-41bd-a752-bd101775a0b9" (UID: "0549eba1-0b20-41bd-a752-bd101775a0b9"). InnerVolumeSpecName "kube-api-access-cd4hr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:39:11.047160 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.047021 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0549eba1-0b20-41bd-a752-bd101775a0b9-cert" (OuterVolumeSpecName: "cert") pod "0549eba1-0b20-41bd-a752-bd101775a0b9" (UID: "0549eba1-0b20-41bd-a752-bd101775a0b9"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:39:11.073240 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.073211 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-ww5qj"
Apr 16 18:39:11.145567 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.145529 2578 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0549eba1-0b20-41bd-a752-bd101775a0b9-cert\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 18:39:11.145567 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.145565 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cd4hr\" (UniqueName: \"kubernetes.io/projected/0549eba1-0b20-41bd-a752-bd101775a0b9-kube-api-access-cd4hr\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 18:39:11.190339 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.190316 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-ww5qj"]
Apr 16 18:39:11.192598 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:39:11.192568 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1164058e_92a2_41a8_8700_0fc714f73eac.slice/crio-677ff13c4ab46b23709bccae027b72fe182b4657e891227f8523f0af6e987cf4 WatchSource:0}: Error finding container 677ff13c4ab46b23709bccae027b72fe182b4657e891227f8523f0af6e987cf4: Status 404 returned error can't find the container with id 677ff13c4ab46b23709bccae027b72fe182b4657e891227f8523f0af6e987cf4
Apr 16 18:39:11.475760 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.475729 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-ww5qj" event={"ID":"1164058e-92a2-41a8-8700-0fc714f73eac","Type":"ContainerStarted","Data":"677ff13c4ab46b23709bccae027b72fe182b4657e891227f8523f0af6e987cf4"}
Apr 16 18:39:11.476871 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.476847 2578 generic.go:358] "Generic (PLEG): container finished" podID="0549eba1-0b20-41bd-a752-bd101775a0b9" containerID="a7acdf4c276baa557fdf261ea16d53f4e82d06cbbfa1565ed5b857d7cce9e960" exitCode=0
Apr 16 18:39:11.476979 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.476900 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2" event={"ID":"0549eba1-0b20-41bd-a752-bd101775a0b9","Type":"ContainerDied","Data":"a7acdf4c276baa557fdf261ea16d53f4e82d06cbbfa1565ed5b857d7cce9e960"}
Apr 16 18:39:11.476979 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.476914 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2"
Apr 16 18:39:11.476979 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.476934 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-rdxs2" event={"ID":"0549eba1-0b20-41bd-a752-bd101775a0b9","Type":"ContainerDied","Data":"d8ce13ad0f6b605c05f082bbbf3a48a760b3fedce02a30c7fbf629a768f52335"}
Apr 16 18:39:11.476979 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.476949 2578 scope.go:117] "RemoveContainer" containerID="a7acdf4c276baa557fdf261ea16d53f4e82d06cbbfa1565ed5b857d7cce9e960"
Apr 16 18:39:11.486383 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.486368 2578 scope.go:117] "RemoveContainer" containerID="a7acdf4c276baa557fdf261ea16d53f4e82d06cbbfa1565ed5b857d7cce9e960"
Apr 16 18:39:11.486627 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:39:11.486609 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7acdf4c276baa557fdf261ea16d53f4e82d06cbbfa1565ed5b857d7cce9e960\": container with ID starting with a7acdf4c276baa557fdf261ea16d53f4e82d06cbbfa1565ed5b857d7cce9e960 not found: ID does not exist" containerID="a7acdf4c276baa557fdf261ea16d53f4e82d06cbbfa1565ed5b857d7cce9e960"
Apr 16 18:39:11.486690 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.486635 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7acdf4c276baa557fdf261ea16d53f4e82d06cbbfa1565ed5b857d7cce9e960"} err="failed to get container status \"a7acdf4c276baa557fdf261ea16d53f4e82d06cbbfa1565ed5b857d7cce9e960\": rpc error: code = NotFound desc = could not find container \"a7acdf4c276baa557fdf261ea16d53f4e82d06cbbfa1565ed5b857d7cce9e960\": container with ID starting with a7acdf4c276baa557fdf261ea16d53f4e82d06cbbfa1565ed5b857d7cce9e960 not found: ID does not exist"
Apr 16 18:39:11.498713 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.498690 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-rdxs2"]
Apr 16 18:39:11.503920 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.503887 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-rdxs2"]
Apr 16 18:39:11.841311 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:11.841271 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0549eba1-0b20-41bd-a752-bd101775a0b9" path="/var/lib/kubelet/pods/0549eba1-0b20-41bd-a752-bd101775a0b9/volumes"
Apr 16 18:39:12.481491 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:12.481447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-ww5qj" event={"ID":"1164058e-92a2-41a8-8700-0fc714f73eac","Type":"ContainerStarted","Data":"7148c9e12ae037fc453455ecc406f564f897f152ac906ec79be13796cbdd9322"}
Apr 16 18:39:12.481695 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:12.481582 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7c68cb4fc8-ww5qj"
Apr 16 18:39:12.497679 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:12.497630 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7c68cb4fc8-ww5qj" podStartSLOduration=1.983965608 podStartE2EDuration="2.497618447s" podCreationTimestamp="2026-04-16 18:39:10 +0000 UTC" firstStartedPulling="2026-04-16 18:39:11.193819506 +0000 UTC m=+527.981155743" lastFinishedPulling="2026-04-16 18:39:11.707472346 +0000 UTC m=+528.494808582" observedRunningTime="2026-04-16 18:39:12.496279053 +0000 UTC m=+529.283615322" watchObservedRunningTime="2026-04-16 18:39:12.497618447 +0000 UTC m=+529.284954702"
Apr 16 18:39:43.489431 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:39:43.489401 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7c68cb4fc8-ww5qj"
Apr 16 18:40:23.729566 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:23.729521 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log"
Apr 16 18:40:23.732042 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:23.732019 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log"
Apr 16 18:40:45.725937 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:45.725850 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2"]
Apr 16 18:40:45.726356 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:45.726193 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0549eba1-0b20-41bd-a752-bd101775a0b9" containerName="manager"
Apr 16 18:40:45.726356 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:45.726205 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0549eba1-0b20-41bd-a752-bd101775a0b9" containerName="manager"
Apr 16 18:40:45.726356 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:45.726285 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0549eba1-0b20-41bd-a752-bd101775a0b9" containerName="manager"
Apr 16 18:40:45.729394 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:45.729378 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2"
Apr 16 18:40:45.731836 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:45.731815 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z66mq\""
Apr 16 18:40:45.735248 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:45.735225 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2"]
Apr 16 18:40:45.837563 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:45.837501 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f05da57e-0303-4cb6-adb3-c3f6e1472c3a-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-8689cc5967-4xml2\" (UID: \"f05da57e-0303-4cb6-adb3-c3f6e1472c3a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2"
Apr 16 18:40:45.938796 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:45.938744 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f05da57e-0303-4cb6-adb3-c3f6e1472c3a-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-8689cc5967-4xml2\" (UID: \"f05da57e-0303-4cb6-adb3-c3f6e1472c3a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2"
Apr 16 18:40:45.939246 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:45.939211 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f05da57e-0303-4cb6-adb3-c3f6e1472c3a-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-8689cc5967-4xml2\" (UID: \"f05da57e-0303-4cb6-adb3-c3f6e1472c3a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2"
Apr 16 18:40:46.039950 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:46.039867 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2"
Apr 16 18:40:46.160260 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:46.160218 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2"]
Apr 16 18:40:46.163645 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:40:46.163611 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf05da57e_0303_4cb6_adb3_c3f6e1472c3a.slice/crio-aaae7cfb82e5b0f245773f31c3fba2e4fe24e556d84cf2a90c3ef0447db32046 WatchSource:0}: Error finding container aaae7cfb82e5b0f245773f31c3fba2e4fe24e556d84cf2a90c3ef0447db32046: Status 404 returned error can't find the container with id aaae7cfb82e5b0f245773f31c3fba2e4fe24e556d84cf2a90c3ef0447db32046
Apr 16 18:40:46.786974 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:46.786937 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" event={"ID":"f05da57e-0303-4cb6-adb3-c3f6e1472c3a","Type":"ContainerStarted","Data":"aaae7cfb82e5b0f245773f31c3fba2e4fe24e556d84cf2a90c3ef0447db32046"}
Apr 16 18:40:49.797892 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:49.797856 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" event={"ID":"f05da57e-0303-4cb6-adb3-c3f6e1472c3a","Type":"ContainerStarted","Data":"420d548b396093527115e9a344d5aeffae2b58744d2ff98b52d1fbd2d71a8cc5"}
Apr 16 18:40:53.812441 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:53.812409 2578 generic.go:358] "Generic (PLEG): container finished" podID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerID="420d548b396093527115e9a344d5aeffae2b58744d2ff98b52d1fbd2d71a8cc5" exitCode=0
Apr 16 18:40:53.812850 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:40:53.812480 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" event={"ID":"f05da57e-0303-4cb6-adb3-c3f6e1472c3a","Type":"ContainerDied","Data":"420d548b396093527115e9a344d5aeffae2b58744d2ff98b52d1fbd2d71a8cc5"}
Apr 16 18:41:06.870896 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:06.870812 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" event={"ID":"f05da57e-0303-4cb6-adb3-c3f6e1472c3a","Type":"ContainerStarted","Data":"5c04309cbf7140c4f1591f3b291a0613da6c01414fa51f7878fe032d17d3e9a4"}
Apr 16 18:41:09.882542 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:09.882457 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" event={"ID":"f05da57e-0303-4cb6-adb3-c3f6e1472c3a","Type":"ContainerStarted","Data":"0e636b46cfae624d9654ac1aeb56cbfbe372eac6b21afb578e1bcbc0f7638e23"}
Apr 16 18:41:09.882957 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:09.882730 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2"
Apr 16 18:41:09.884070 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:09.884045 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 16 18:41:09.899633 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:09.899592 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podStartSLOduration=1.523844424 podStartE2EDuration="24.89958161s" podCreationTimestamp="2026-04-16 18:40:45 +0000 UTC" firstStartedPulling="2026-04-16 18:40:46.165429297 +0000 UTC m=+622.952765534" lastFinishedPulling="2026-04-16 18:41:09.541166483 +0000 UTC m=+646.328502720" observedRunningTime="2026-04-16 18:41:09.89912978 +0000 UTC m=+646.686466036" watchObservedRunningTime="2026-04-16 18:41:09.89958161 +0000 UTC m=+646.686917864"
Apr 16 18:41:10.886442 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:10.886411 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2"
Apr 16 18:41:10.886929 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:10.886542 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 16 18:41:10.887438 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:10.887417 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:41:11.890009 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:11.889963 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 16 18:41:11.890474 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:11.890227 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:41:21.890526 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:21.890472 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 16 18:41:21.890914 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:21.890858 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:41:31.890396 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:31.890351 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 16 18:41:31.890966 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:31.890851 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:41:41.890764 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:41.890720 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 16 18:41:41.891216 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:41.891189 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:41:51.890610 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:51.890560 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 16 18:41:51.891077 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:41:51.891028 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:42:01.890728 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:01.890674 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 16 18:42:01.891214 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:01.891100 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:42:11.891163 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:11.891083 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2"
Apr 16 18:42:11.891602 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:11.891265 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2"
Apr 16 18:42:20.990244 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:20.990207 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2"]
Apr 16 18:42:20.990722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:20.990601 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="kserve-container" containerID="cri-o://5c04309cbf7140c4f1591f3b291a0613da6c01414fa51f7878fe032d17d3e9a4" gracePeriod=30
Apr 16 18:42:20.990722 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:20.990679 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="agent" containerID="cri-o://0e636b46cfae624d9654ac1aeb56cbfbe372eac6b21afb578e1bcbc0f7638e23" gracePeriod=30
Apr 16 18:42:21.063511 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:21.063481 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r"]
Apr 16 18:42:21.067251 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:21.067229 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r"
Apr 16 18:42:21.074708 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:21.074455 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r"]
Apr 16 18:42:21.141976 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:21.141948 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fbe71a81-66f8-43fb-b1b3-48b46f8e363f-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r\" (UID: \"fbe71a81-66f8-43fb-b1b3-48b46f8e363f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r"
Apr 16 18:42:21.243029 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:21.242954 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fbe71a81-66f8-43fb-b1b3-48b46f8e363f-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r\" (UID: \"fbe71a81-66f8-43fb-b1b3-48b46f8e363f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r"
Apr 16 18:42:21.243301 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:21.243285 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fbe71a81-66f8-43fb-b1b3-48b46f8e363f-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r\" (UID: \"fbe71a81-66f8-43fb-b1b3-48b46f8e363f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r"
Apr 16 18:42:21.378858 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:21.378824 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r"
Apr 16 18:42:21.504187 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:21.504145 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r"]
Apr 16 18:42:21.506507 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:42:21.506473 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbe71a81_66f8_43fb_b1b3_48b46f8e363f.slice/crio-042eb36267bf392445b5ba8b49abf2270713906c3b3ba21516641c5f64eed5cf WatchSource:0}: Error finding container 042eb36267bf392445b5ba8b49abf2270713906c3b3ba21516641c5f64eed5cf: Status 404 returned error can't find the container with id 042eb36267bf392445b5ba8b49abf2270713906c3b3ba21516641c5f64eed5cf
Apr 16 18:42:21.508429 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:21.508412 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:42:21.890684 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:21.890590 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 16 18:42:21.890990 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:21.890966 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:42:22.124112 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:22.124072 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" event={"ID":"fbe71a81-66f8-43fb-b1b3-48b46f8e363f","Type":"ContainerStarted","Data":"c3c878e41e8ffc8ae4e85ccd996405727d45bb4232282158bd2dad93b2070add"}
Apr 16 18:42:22.124112 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:22.124112 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" event={"ID":"fbe71a81-66f8-43fb-b1b3-48b46f8e363f","Type":"ContainerStarted","Data":"042eb36267bf392445b5ba8b49abf2270713906c3b3ba21516641c5f64eed5cf"}
Apr 16 18:42:26.138910 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:26.138876 2578 generic.go:358] "Generic (PLEG): container finished" podID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerID="c3c878e41e8ffc8ae4e85ccd996405727d45bb4232282158bd2dad93b2070add" exitCode=0
Apr 16 18:42:26.139379 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:26.138954 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" event={"ID":"fbe71a81-66f8-43fb-b1b3-48b46f8e363f","Type":"ContainerDied","Data":"c3c878e41e8ffc8ae4e85ccd996405727d45bb4232282158bd2dad93b2070add"}
Apr 16 18:42:26.140915 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:26.140891 2578 generic.go:358] "Generic (PLEG): container finished" podID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerID="5c04309cbf7140c4f1591f3b291a0613da6c01414fa51f7878fe032d17d3e9a4" exitCode=0
Apr 16 18:42:26.141017 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:26.140922 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" event={"ID":"f05da57e-0303-4cb6-adb3-c3f6e1472c3a","Type":"ContainerDied","Data":"5c04309cbf7140c4f1591f3b291a0613da6c01414fa51f7878fe032d17d3e9a4"}
Apr 16 18:42:27.146421 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:27.146378 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" event={"ID":"fbe71a81-66f8-43fb-b1b3-48b46f8e363f","Type":"ContainerStarted","Data":"4cf309191fdd8de976cae88bc4567913c97dfdcfa67308009d1a2ced04cb2ed5"}
Apr 16 18:42:27.146421 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:27.146428 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" event={"ID":"fbe71a81-66f8-43fb-b1b3-48b46f8e363f","Type":"ContainerStarted","Data":"0bddbbb1070d438ba3a042993d750446b01922f1ac7b3d4af9555ea8cc6b2466"}
Apr 16 18:42:27.146839 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:27.146749 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r"
Apr 16 18:42:27.148201 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:27.148164 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused"
Apr 16 18:42:27.163042 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:27.162999 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podStartSLOduration=6.1629856 podStartE2EDuration="6.1629856s" podCreationTimestamp="2026-04-16 18:42:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:42:27.162858763 +0000 UTC m=+723.950195019" watchObservedRunningTime="2026-04-16 18:42:27.1629856 +0000 UTC m=+723.950321854"
Apr 16 18:42:28.149780 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:28.149746 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not
ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" Apr 16 18:42:28.150264 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:28.149956 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 18:42:28.151053 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:28.151026 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:42:29.153058 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:29.153019 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 18:42:29.153443 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:29.153418 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:42:31.890126 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:31.890082 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 18:42:31.890550 ip-10-0-139-33 kubenswrapper[2578]: 
I0416 18:42:31.890404 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:42:39.153903 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:39.153862 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 18:42:39.154385 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:39.154309 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:42:41.890691 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:41.890647 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 18:42:41.891164 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:41.890784 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" Apr 16 18:42:41.891164 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:41.890980 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 
18:42:41.891164 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:41.891064 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" Apr 16 18:42:49.153992 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:49.153941 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 18:42:49.154482 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:49.154435 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:42:51.139620 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.139596 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" Apr 16 18:42:51.198665 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.198638 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f05da57e-0303-4cb6-adb3-c3f6e1472c3a-kserve-provision-location\") pod \"f05da57e-0303-4cb6-adb3-c3f6e1472c3a\" (UID: \"f05da57e-0303-4cb6-adb3-c3f6e1472c3a\") " Apr 16 18:42:51.199001 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.198978 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f05da57e-0303-4cb6-adb3-c3f6e1472c3a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f05da57e-0303-4cb6-adb3-c3f6e1472c3a" (UID: "f05da57e-0303-4cb6-adb3-c3f6e1472c3a"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:42:51.228914 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.228887 2578 generic.go:358] "Generic (PLEG): container finished" podID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerID="0e636b46cfae624d9654ac1aeb56cbfbe372eac6b21afb578e1bcbc0f7638e23" exitCode=0 Apr 16 18:42:51.229051 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.228935 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" event={"ID":"f05da57e-0303-4cb6-adb3-c3f6e1472c3a","Type":"ContainerDied","Data":"0e636b46cfae624d9654ac1aeb56cbfbe372eac6b21afb578e1bcbc0f7638e23"} Apr 16 18:42:51.229051 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.228965 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" event={"ID":"f05da57e-0303-4cb6-adb3-c3f6e1472c3a","Type":"ContainerDied","Data":"aaae7cfb82e5b0f245773f31c3fba2e4fe24e556d84cf2a90c3ef0447db32046"} Apr 16 18:42:51.229051 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.228968 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2" Apr 16 18:42:51.229051 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.228981 2578 scope.go:117] "RemoveContainer" containerID="0e636b46cfae624d9654ac1aeb56cbfbe372eac6b21afb578e1bcbc0f7638e23" Apr 16 18:42:51.237068 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.237051 2578 scope.go:117] "RemoveContainer" containerID="5c04309cbf7140c4f1591f3b291a0613da6c01414fa51f7878fe032d17d3e9a4" Apr 16 18:42:51.244081 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.244065 2578 scope.go:117] "RemoveContainer" containerID="420d548b396093527115e9a344d5aeffae2b58744d2ff98b52d1fbd2d71a8cc5" Apr 16 18:42:51.250294 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.250274 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2"] Apr 16 18:42:51.251753 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.251730 2578 scope.go:117] "RemoveContainer" containerID="0e636b46cfae624d9654ac1aeb56cbfbe372eac6b21afb578e1bcbc0f7638e23" Apr 16 18:42:51.252015 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:42:51.251996 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e636b46cfae624d9654ac1aeb56cbfbe372eac6b21afb578e1bcbc0f7638e23\": container with ID starting with 0e636b46cfae624d9654ac1aeb56cbfbe372eac6b21afb578e1bcbc0f7638e23 not found: ID does not exist" containerID="0e636b46cfae624d9654ac1aeb56cbfbe372eac6b21afb578e1bcbc0f7638e23" Apr 16 18:42:51.252096 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.252022 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e636b46cfae624d9654ac1aeb56cbfbe372eac6b21afb578e1bcbc0f7638e23"} err="failed to get container status \"0e636b46cfae624d9654ac1aeb56cbfbe372eac6b21afb578e1bcbc0f7638e23\": rpc error: code = NotFound desc = could not find container 
\"0e636b46cfae624d9654ac1aeb56cbfbe372eac6b21afb578e1bcbc0f7638e23\": container with ID starting with 0e636b46cfae624d9654ac1aeb56cbfbe372eac6b21afb578e1bcbc0f7638e23 not found: ID does not exist" Apr 16 18:42:51.252096 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.252040 2578 scope.go:117] "RemoveContainer" containerID="5c04309cbf7140c4f1591f3b291a0613da6c01414fa51f7878fe032d17d3e9a4" Apr 16 18:42:51.252316 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:42:51.252288 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c04309cbf7140c4f1591f3b291a0613da6c01414fa51f7878fe032d17d3e9a4\": container with ID starting with 5c04309cbf7140c4f1591f3b291a0613da6c01414fa51f7878fe032d17d3e9a4 not found: ID does not exist" containerID="5c04309cbf7140c4f1591f3b291a0613da6c01414fa51f7878fe032d17d3e9a4" Apr 16 18:42:51.252413 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.252323 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c04309cbf7140c4f1591f3b291a0613da6c01414fa51f7878fe032d17d3e9a4"} err="failed to get container status \"5c04309cbf7140c4f1591f3b291a0613da6c01414fa51f7878fe032d17d3e9a4\": rpc error: code = NotFound desc = could not find container \"5c04309cbf7140c4f1591f3b291a0613da6c01414fa51f7878fe032d17d3e9a4\": container with ID starting with 5c04309cbf7140c4f1591f3b291a0613da6c01414fa51f7878fe032d17d3e9a4 not found: ID does not exist" Apr 16 18:42:51.252413 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.252343 2578 scope.go:117] "RemoveContainer" containerID="420d548b396093527115e9a344d5aeffae2b58744d2ff98b52d1fbd2d71a8cc5" Apr 16 18:42:51.252635 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:42:51.252617 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"420d548b396093527115e9a344d5aeffae2b58744d2ff98b52d1fbd2d71a8cc5\": container with ID starting with 
420d548b396093527115e9a344d5aeffae2b58744d2ff98b52d1fbd2d71a8cc5 not found: ID does not exist" containerID="420d548b396093527115e9a344d5aeffae2b58744d2ff98b52d1fbd2d71a8cc5" Apr 16 18:42:51.252689 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.252640 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"420d548b396093527115e9a344d5aeffae2b58744d2ff98b52d1fbd2d71a8cc5"} err="failed to get container status \"420d548b396093527115e9a344d5aeffae2b58744d2ff98b52d1fbd2d71a8cc5\": rpc error: code = NotFound desc = could not find container \"420d548b396093527115e9a344d5aeffae2b58744d2ff98b52d1fbd2d71a8cc5\": container with ID starting with 420d548b396093527115e9a344d5aeffae2b58744d2ff98b52d1fbd2d71a8cc5 not found: ID does not exist" Apr 16 18:42:51.253527 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.253492 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-4xml2"] Apr 16 18:42:51.299636 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.299610 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f05da57e-0303-4cb6-adb3-c3f6e1472c3a-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:42:51.847043 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:51.844129 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" path="/var/lib/kubelet/pods/f05da57e-0303-4cb6-adb3-c3f6e1472c3a/volumes" Apr 16 18:42:59.153294 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:59.153247 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 
18:42:59.153853 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:42:59.153658 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:43:09.153078 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:09.153020 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 18:43:09.153589 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:09.153562 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:43:19.153333 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:19.153273 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 18:43:19.153787 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:19.153758 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:43:29.153196 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:29.153134 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 18:43:29.153588 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:29.153565 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:43:33.842654 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:33.842627 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" Apr 16 18:43:33.843134 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:33.842673 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" Apr 16 18:43:46.175595 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:46.175506 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r"] Apr 16 18:43:46.176009 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:46.175814 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="kserve-container" containerID="cri-o://0bddbbb1070d438ba3a042993d750446b01922f1ac7b3d4af9555ea8cc6b2466" gracePeriod=30 Apr 16 18:43:46.176009 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:46.175934 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" 
containerName="agent" containerID="cri-o://4cf309191fdd8de976cae88bc4567913c97dfdcfa67308009d1a2ced04cb2ed5" gracePeriod=30 Apr 16 18:43:51.432860 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:51.432830 2578 generic.go:358] "Generic (PLEG): container finished" podID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerID="0bddbbb1070d438ba3a042993d750446b01922f1ac7b3d4af9555ea8cc6b2466" exitCode=0 Apr 16 18:43:51.433254 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:51.432913 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" event={"ID":"fbe71a81-66f8-43fb-b1b3-48b46f8e363f","Type":"ContainerDied","Data":"0bddbbb1070d438ba3a042993d750446b01922f1ac7b3d4af9555ea8cc6b2466"} Apr 16 18:43:53.840650 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:53.840608 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 18:43:53.841068 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:53.840937 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:43:56.259969 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:56.259939 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb"] Apr 16 18:43:56.260453 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:56.260304 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="kserve-container" Apr 16 18:43:56.260453 ip-10-0-139-33 kubenswrapper[2578]: 
I0416 18:43:56.260316 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="kserve-container" Apr 16 18:43:56.260453 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:56.260324 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="agent" Apr 16 18:43:56.260453 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:56.260329 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="agent" Apr 16 18:43:56.260453 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:56.260347 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="storage-initializer" Apr 16 18:43:56.260453 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:56.260353 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="storage-initializer" Apr 16 18:43:56.260453 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:56.260441 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="agent" Apr 16 18:43:56.260453 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:56.260457 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f05da57e-0303-4cb6-adb3-c3f6e1472c3a" containerName="kserve-container" Apr 16 18:43:56.263756 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:56.263739 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" Apr 16 18:43:56.269738 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:56.269718 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb"] Apr 16 18:43:56.343701 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:56.343671 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f16df37c-7739-4d7b-8bf3-16add2ce4fde-kserve-provision-location\") pod \"isvc-logger-predictor-8444b4768-6ptpb\" (UID: \"f16df37c-7739-4d7b-8bf3-16add2ce4fde\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" Apr 16 18:43:56.444893 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:56.444856 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f16df37c-7739-4d7b-8bf3-16add2ce4fde-kserve-provision-location\") pod \"isvc-logger-predictor-8444b4768-6ptpb\" (UID: \"f16df37c-7739-4d7b-8bf3-16add2ce4fde\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" Apr 16 18:43:56.445278 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:56.445259 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f16df37c-7739-4d7b-8bf3-16add2ce4fde-kserve-provision-location\") pod \"isvc-logger-predictor-8444b4768-6ptpb\" (UID: \"f16df37c-7739-4d7b-8bf3-16add2ce4fde\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" Apr 16 18:43:56.575511 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:56.575415 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" Apr 16 18:43:56.693542 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:56.693487 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb"] Apr 16 18:43:56.697253 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:43:56.697222 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf16df37c_7739_4d7b_8bf3_16add2ce4fde.slice/crio-40944b2150431cb877d34e6b5ebf3da21643009b5e3daffa3ce7b9b68310db83 WatchSource:0}: Error finding container 40944b2150431cb877d34e6b5ebf3da21643009b5e3daffa3ce7b9b68310db83: Status 404 returned error can't find the container with id 40944b2150431cb877d34e6b5ebf3da21643009b5e3daffa3ce7b9b68310db83 Apr 16 18:43:57.454661 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:57.454627 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" event={"ID":"f16df37c-7739-4d7b-8bf3-16add2ce4fde","Type":"ContainerStarted","Data":"4235164f03dfe77308d2a79f8b01389e8525a6bc17c2f17928fa32f8f402d245"} Apr 16 18:43:57.454661 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:43:57.454663 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" event={"ID":"f16df37c-7739-4d7b-8bf3-16add2ce4fde","Type":"ContainerStarted","Data":"40944b2150431cb877d34e6b5ebf3da21643009b5e3daffa3ce7b9b68310db83"} Apr 16 18:44:00.465594 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:00.465562 2578 generic.go:358] "Generic (PLEG): container finished" podID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerID="4235164f03dfe77308d2a79f8b01389e8525a6bc17c2f17928fa32f8f402d245" exitCode=0 Apr 16 18:44:00.466029 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:00.465643 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" event={"ID":"f16df37c-7739-4d7b-8bf3-16add2ce4fde","Type":"ContainerDied","Data":"4235164f03dfe77308d2a79f8b01389e8525a6bc17c2f17928fa32f8f402d245"} Apr 16 18:44:01.470385 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:01.470350 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" event={"ID":"f16df37c-7739-4d7b-8bf3-16add2ce4fde","Type":"ContainerStarted","Data":"d7b39a6d3f6269047d2b4abd4658d3783f3f4aa886f9f77144186df6457bd6ba"} Apr 16 18:44:01.470385 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:01.470389 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" event={"ID":"f16df37c-7739-4d7b-8bf3-16add2ce4fde","Type":"ContainerStarted","Data":"9c0f61a3c19066fe921243982b8043c09eb5f1236a6c5830a91aaae1a6f94019"} Apr 16 18:44:01.470874 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:01.470697 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" Apr 16 18:44:01.472208 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:01.472167 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:44:01.487919 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:01.487872 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podStartSLOduration=5.487858306 podStartE2EDuration="5.487858306s" podCreationTimestamp="2026-04-16 18:43:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:44:01.486508566 +0000 
UTC m=+818.273844823" watchObservedRunningTime="2026-04-16 18:44:01.487858306 +0000 UTC m=+818.275194552" Apr 16 18:44:02.474293 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:02.474265 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" Apr 16 18:44:02.474663 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:02.474379 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:44:02.475394 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:02.475367 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:44:03.477599 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:03.477547 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:44:03.478016 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:03.477873 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:44:03.840640 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:03.840548 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" 
podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 18:44:03.840946 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:03.840920 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:44:13.478449 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:13.478400 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:44:13.478972 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:13.478930 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:44:13.840579 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:13.840469 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 18:44:13.840823 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:13.840797 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:44:13.841576 
ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:13.841553 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" Apr 16 18:44:13.841680 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:13.841611 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" Apr 16 18:44:16.312753 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.312725 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" Apr 16 18:44:16.400618 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.400583 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fbe71a81-66f8-43fb-b1b3-48b46f8e363f-kserve-provision-location\") pod \"fbe71a81-66f8-43fb-b1b3-48b46f8e363f\" (UID: \"fbe71a81-66f8-43fb-b1b3-48b46f8e363f\") " Apr 16 18:44:16.400838 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.400814 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe71a81-66f8-43fb-b1b3-48b46f8e363f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fbe71a81-66f8-43fb-b1b3-48b46f8e363f" (UID: "fbe71a81-66f8-43fb-b1b3-48b46f8e363f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:16.502109 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.502080 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fbe71a81-66f8-43fb-b1b3-48b46f8e363f-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:44:16.520756 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.520725 2578 generic.go:358] "Generic (PLEG): container finished" podID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerID="4cf309191fdd8de976cae88bc4567913c97dfdcfa67308009d1a2ced04cb2ed5" exitCode=0 Apr 16 18:44:16.520893 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.520827 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" Apr 16 18:44:16.520933 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.520817 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" event={"ID":"fbe71a81-66f8-43fb-b1b3-48b46f8e363f","Type":"ContainerDied","Data":"4cf309191fdd8de976cae88bc4567913c97dfdcfa67308009d1a2ced04cb2ed5"} Apr 16 18:44:16.520969 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.520930 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r" event={"ID":"fbe71a81-66f8-43fb-b1b3-48b46f8e363f","Type":"ContainerDied","Data":"042eb36267bf392445b5ba8b49abf2270713906c3b3ba21516641c5f64eed5cf"} Apr 16 18:44:16.520969 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.520946 2578 scope.go:117] "RemoveContainer" containerID="4cf309191fdd8de976cae88bc4567913c97dfdcfa67308009d1a2ced04cb2ed5" Apr 16 18:44:16.528838 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.528702 2578 scope.go:117] "RemoveContainer" 
containerID="0bddbbb1070d438ba3a042993d750446b01922f1ac7b3d4af9555ea8cc6b2466" Apr 16 18:44:16.535755 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.535740 2578 scope.go:117] "RemoveContainer" containerID="c3c878e41e8ffc8ae4e85ccd996405727d45bb4232282158bd2dad93b2070add" Apr 16 18:44:16.542627 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.542605 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r"] Apr 16 18:44:16.543752 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.543734 2578 scope.go:117] "RemoveContainer" containerID="4cf309191fdd8de976cae88bc4567913c97dfdcfa67308009d1a2ced04cb2ed5" Apr 16 18:44:16.544062 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:44:16.544039 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cf309191fdd8de976cae88bc4567913c97dfdcfa67308009d1a2ced04cb2ed5\": container with ID starting with 4cf309191fdd8de976cae88bc4567913c97dfdcfa67308009d1a2ced04cb2ed5 not found: ID does not exist" containerID="4cf309191fdd8de976cae88bc4567913c97dfdcfa67308009d1a2ced04cb2ed5" Apr 16 18:44:16.544193 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.544074 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cf309191fdd8de976cae88bc4567913c97dfdcfa67308009d1a2ced04cb2ed5"} err="failed to get container status \"4cf309191fdd8de976cae88bc4567913c97dfdcfa67308009d1a2ced04cb2ed5\": rpc error: code = NotFound desc = could not find container \"4cf309191fdd8de976cae88bc4567913c97dfdcfa67308009d1a2ced04cb2ed5\": container with ID starting with 4cf309191fdd8de976cae88bc4567913c97dfdcfa67308009d1a2ced04cb2ed5 not found: ID does not exist" Apr 16 18:44:16.544193 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.544103 2578 scope.go:117] "RemoveContainer" containerID="0bddbbb1070d438ba3a042993d750446b01922f1ac7b3d4af9555ea8cc6b2466" Apr 16 
18:44:16.544484 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:44:16.544456 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bddbbb1070d438ba3a042993d750446b01922f1ac7b3d4af9555ea8cc6b2466\": container with ID starting with 0bddbbb1070d438ba3a042993d750446b01922f1ac7b3d4af9555ea8cc6b2466 not found: ID does not exist" containerID="0bddbbb1070d438ba3a042993d750446b01922f1ac7b3d4af9555ea8cc6b2466" Apr 16 18:44:16.544553 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.544484 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bddbbb1070d438ba3a042993d750446b01922f1ac7b3d4af9555ea8cc6b2466"} err="failed to get container status \"0bddbbb1070d438ba3a042993d750446b01922f1ac7b3d4af9555ea8cc6b2466\": rpc error: code = NotFound desc = could not find container \"0bddbbb1070d438ba3a042993d750446b01922f1ac7b3d4af9555ea8cc6b2466\": container with ID starting with 0bddbbb1070d438ba3a042993d750446b01922f1ac7b3d4af9555ea8cc6b2466 not found: ID does not exist" Apr 16 18:44:16.544553 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.544508 2578 scope.go:117] "RemoveContainer" containerID="c3c878e41e8ffc8ae4e85ccd996405727d45bb4232282158bd2dad93b2070add" Apr 16 18:44:16.544869 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:44:16.544822 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3c878e41e8ffc8ae4e85ccd996405727d45bb4232282158bd2dad93b2070add\": container with ID starting with c3c878e41e8ffc8ae4e85ccd996405727d45bb4232282158bd2dad93b2070add not found: ID does not exist" containerID="c3c878e41e8ffc8ae4e85ccd996405727d45bb4232282158bd2dad93b2070add" Apr 16 18:44:16.544869 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.544856 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c3c878e41e8ffc8ae4e85ccd996405727d45bb4232282158bd2dad93b2070add"} err="failed to get container status \"c3c878e41e8ffc8ae4e85ccd996405727d45bb4232282158bd2dad93b2070add\": rpc error: code = NotFound desc = could not find container \"c3c878e41e8ffc8ae4e85ccd996405727d45bb4232282158bd2dad93b2070add\": container with ID starting with c3c878e41e8ffc8ae4e85ccd996405727d45bb4232282158bd2dad93b2070add not found: ID does not exist" Apr 16 18:44:16.546087 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:16.546069 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-gpl7r"] Apr 16 18:44:17.841694 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:17.841660 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" path="/var/lib/kubelet/pods/fbe71a81-66f8-43fb-b1b3-48b46f8e363f/volumes" Apr 16 18:44:23.477871 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:23.477818 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:44:23.478327 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:23.478299 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:44:33.478430 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:33.478381 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: 
connection refused" Apr 16 18:44:33.478823 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:33.478783 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:44:43.477597 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:43.477549 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:44:43.478140 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:43.478112 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:44:51.399640 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.399605 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7999ffd77-fmmjm"] Apr 16 18:44:51.400087 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.399960 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="agent" Apr 16 18:44:51.400087 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.399970 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="agent" Apr 16 18:44:51.400087 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.399981 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="kserve-container" Apr 16 18:44:51.400087 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.399987 2578 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="kserve-container" Apr 16 18:44:51.400087 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.399997 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="storage-initializer" Apr 16 18:44:51.400087 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.400003 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="storage-initializer" Apr 16 18:44:51.400087 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.400051 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="agent" Apr 16 18:44:51.400087 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.400060 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fbe71a81-66f8-43fb-b1b3-48b46f8e363f" containerName="kserve-container" Apr 16 18:44:51.403056 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.403032 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.411821 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.411797 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7999ffd77-fmmjm"] Apr 16 18:44:51.486429 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.486398 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-console-config\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.486575 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.486438 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gcw7\" (UniqueName: \"kubernetes.io/projected/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-kube-api-access-8gcw7\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.486575 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.486454 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-service-ca\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.486575 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.486553 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-console-serving-cert\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.486692 
ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.486629 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-console-oauth-config\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.486692 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.486649 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-trusted-ca-bundle\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.486755 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.486697 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-oauth-serving-cert\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.587153 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.587120 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gcw7\" (UniqueName: \"kubernetes.io/projected/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-kube-api-access-8gcw7\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.587153 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.587158 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-service-ca\") pod \"console-7999ffd77-fmmjm\" (UID: 
\"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.587371 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.587205 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-console-serving-cert\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.587371 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.587253 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-console-oauth-config\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.587371 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.587270 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-trusted-ca-bundle\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.587371 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.587298 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-oauth-serving-cert\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.587557 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.587446 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-console-config\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.587922 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.587889 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-service-ca\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.588041 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.587987 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-oauth-serving-cert\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.588104 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.588070 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-console-config\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.588161 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.588144 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-trusted-ca-bundle\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.589701 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.589671 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-console-serving-cert\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.589807 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.589706 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-console-oauth-config\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.595287 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.595269 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gcw7\" (UniqueName: \"kubernetes.io/projected/02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74-kube-api-access-8gcw7\") pod \"console-7999ffd77-fmmjm\" (UID: \"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74\") " pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:51.713443 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:51.713364 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:44:52.038409 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:52.038376 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7999ffd77-fmmjm"] Apr 16 18:44:52.042631 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:44:52.042601 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02bdcfc3_3fd9_47ff_ab10_d2223dfc8e74.slice/crio-b8187c3a108294b44d1e10b6e2dd6657fedff60827d2bc2d02d562be33d891d9 WatchSource:0}: Error finding container b8187c3a108294b44d1e10b6e2dd6657fedff60827d2bc2d02d562be33d891d9: Status 404 returned error can't find the container with id b8187c3a108294b44d1e10b6e2dd6657fedff60827d2bc2d02d562be33d891d9 Apr 16 18:44:52.643400 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:52.643364 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7999ffd77-fmmjm" event={"ID":"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74","Type":"ContainerStarted","Data":"ed32e133b4ec87535a3c5261febc6668fa644eff9855dff717d24422c7e3fc85"} Apr 16 18:44:52.643400 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:52.643399 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7999ffd77-fmmjm" event={"ID":"02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74","Type":"ContainerStarted","Data":"b8187c3a108294b44d1e10b6e2dd6657fedff60827d2bc2d02d562be33d891d9"} Apr 16 18:44:52.663346 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:52.663305 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7999ffd77-fmmjm" podStartSLOduration=1.6632927899999999 podStartE2EDuration="1.66329279s" podCreationTimestamp="2026-04-16 18:44:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:44:52.661484187 +0000 UTC m=+869.448820438" 
watchObservedRunningTime="2026-04-16 18:44:52.66329279 +0000 UTC m=+869.450629045" Apr 16 18:44:53.478158 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:53.478113 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:44:53.478622 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:44:53.478588 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:45:01.713761 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:01.713725 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:45:01.714281 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:01.713796 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:45:01.718649 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:01.718632 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:45:02.681272 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:02.681241 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7999ffd77-fmmjm" Apr 16 18:45:02.725659 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:02.725610 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b49dcbf9f-npgkc"] Apr 16 18:45:03.477586 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:03.477542 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" 
podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:45:03.478024 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:03.478001 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:45:08.838249 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:08.838217 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" Apr 16 18:45:08.838618 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:08.838268 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" Apr 16 18:45:21.501394 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:21.501279 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb"] Apr 16 18:45:21.501904 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:21.501652 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="kserve-container" containerID="cri-o://9c0f61a3c19066fe921243982b8043c09eb5f1236a6c5830a91aaae1a6f94019" gracePeriod=30 Apr 16 18:45:21.501904 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:21.501732 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="agent" containerID="cri-o://d7b39a6d3f6269047d2b4abd4658d3783f3f4aa886f9f77144186df6457bd6ba" gracePeriod=30 Apr 16 18:45:21.519172 ip-10-0-139-33 kubenswrapper[2578]: I0416 
18:45:21.519121 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4"] Apr 16 18:45:21.522732 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:21.522712 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" Apr 16 18:45:21.528592 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:21.528574 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4"] Apr 16 18:45:21.635249 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:21.635218 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9807eb33-454f-4ec2-908c-ff10f75fdfb9-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bd79f4555-fg9p4\" (UID: \"9807eb33-454f-4ec2-908c-ff10f75fdfb9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" Apr 16 18:45:21.736052 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:21.736020 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9807eb33-454f-4ec2-908c-ff10f75fdfb9-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bd79f4555-fg9p4\" (UID: \"9807eb33-454f-4ec2-908c-ff10f75fdfb9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" Apr 16 18:45:21.736416 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:21.736396 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9807eb33-454f-4ec2-908c-ff10f75fdfb9-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bd79f4555-fg9p4\" (UID: \"9807eb33-454f-4ec2-908c-ff10f75fdfb9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" Apr 16 18:45:21.834206 ip-10-0-139-33 
kubenswrapper[2578]: I0416 18:45:21.834125 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" Apr 16 18:45:21.955355 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:21.955318 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4"] Apr 16 18:45:21.958078 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:45:21.958036 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9807eb33_454f_4ec2_908c_ff10f75fdfb9.slice/crio-2b1ba95909a9667c4e4d4f550cb39854281505284dfdaa42e1fce445791c9521 WatchSource:0}: Error finding container 2b1ba95909a9667c4e4d4f550cb39854281505284dfdaa42e1fce445791c9521: Status 404 returned error can't find the container with id 2b1ba95909a9667c4e4d4f550cb39854281505284dfdaa42e1fce445791c9521 Apr 16 18:45:22.743085 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:22.743044 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" event={"ID":"9807eb33-454f-4ec2-908c-ff10f75fdfb9","Type":"ContainerStarted","Data":"5f4b6a49f592266c65279068662b6fbef3ddca3a7776b5141d6193e9ccf26bd2"} Apr 16 18:45:22.743085 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:22.743087 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" event={"ID":"9807eb33-454f-4ec2-908c-ff10f75fdfb9","Type":"ContainerStarted","Data":"2b1ba95909a9667c4e4d4f550cb39854281505284dfdaa42e1fce445791c9521"} Apr 16 18:45:23.753168 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:23.753140 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 18:45:23.756127 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:23.756107 2578 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 18:45:25.753146 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:25.753114 2578 generic.go:358] "Generic (PLEG): container finished" podID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerID="5f4b6a49f592266c65279068662b6fbef3ddca3a7776b5141d6193e9ccf26bd2" exitCode=0 Apr 16 18:45:25.753577 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:25.753207 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" event={"ID":"9807eb33-454f-4ec2-908c-ff10f75fdfb9","Type":"ContainerDied","Data":"5f4b6a49f592266c65279068662b6fbef3ddca3a7776b5141d6193e9ccf26bd2"} Apr 16 18:45:25.755292 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:25.755265 2578 generic.go:358] "Generic (PLEG): container finished" podID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerID="9c0f61a3c19066fe921243982b8043c09eb5f1236a6c5830a91aaae1a6f94019" exitCode=0 Apr 16 18:45:25.755400 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:25.755317 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" event={"ID":"f16df37c-7739-4d7b-8bf3-16add2ce4fde","Type":"ContainerDied","Data":"9c0f61a3c19066fe921243982b8043c09eb5f1236a6c5830a91aaae1a6f94019"} Apr 16 18:45:27.750080 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:27.750012 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b49dcbf9f-npgkc" podUID="a1b0ae9c-d0d2-465b-8b06-9b5710035c19" containerName="console" containerID="cri-o://26e3d10b9b4e2e14d5150bdf5c4f1e395d985dd24eac51481af2250897ac19e2" gracePeriod=15 Apr 16 18:45:28.308148 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.308122 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-7b49dcbf9f-npgkc_a1b0ae9c-d0d2-465b-8b06-9b5710035c19/console/0.log" Apr 16 18:45:28.308317 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.308224 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:45:28.396129 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.396041 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-oauth-serving-cert\") pod \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " Apr 16 18:45:28.396129 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.396114 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-oauth-config\") pod \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " Apr 16 18:45:28.396377 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.396147 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-service-ca\") pod \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " Apr 16 18:45:28.396377 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.396229 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-trusted-ca-bundle\") pod \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " Apr 16 18:45:28.396377 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.396279 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-config\") pod \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " Apr 16 18:45:28.396377 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.396306 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-serving-cert\") pod \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " Apr 16 18:45:28.396377 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.396348 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg4nk\" (UniqueName: \"kubernetes.io/projected/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-kube-api-access-dg4nk\") pod \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\" (UID: \"a1b0ae9c-d0d2-465b-8b06-9b5710035c19\") " Apr 16 18:45:28.396867 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.396502 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a1b0ae9c-d0d2-465b-8b06-9b5710035c19" (UID: "a1b0ae9c-d0d2-465b-8b06-9b5710035c19"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:45:28.396867 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.396670 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-config" (OuterVolumeSpecName: "console-config") pod "a1b0ae9c-d0d2-465b-8b06-9b5710035c19" (UID: "a1b0ae9c-d0d2-465b-8b06-9b5710035c19"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:45:28.396867 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.396696 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a1b0ae9c-d0d2-465b-8b06-9b5710035c19" (UID: "a1b0ae9c-d0d2-465b-8b06-9b5710035c19"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:45:28.396867 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.396840 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-service-ca" (OuterVolumeSpecName: "service-ca") pod "a1b0ae9c-d0d2-465b-8b06-9b5710035c19" (UID: "a1b0ae9c-d0d2-465b-8b06-9b5710035c19"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:45:28.399159 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.399129 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-kube-api-access-dg4nk" (OuterVolumeSpecName: "kube-api-access-dg4nk") pod "a1b0ae9c-d0d2-465b-8b06-9b5710035c19" (UID: "a1b0ae9c-d0d2-465b-8b06-9b5710035c19"). InnerVolumeSpecName "kube-api-access-dg4nk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:45:28.399159 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.399144 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a1b0ae9c-d0d2-465b-8b06-9b5710035c19" (UID: "a1b0ae9c-d0d2-465b-8b06-9b5710035c19"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:45:28.399350 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.399286 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a1b0ae9c-d0d2-465b-8b06-9b5710035c19" (UID: "a1b0ae9c-d0d2-465b-8b06-9b5710035c19"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:45:28.497507 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.497474 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-oauth-config\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:45:28.497507 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.497505 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-service-ca\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:45:28.497507 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.497514 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-trusted-ca-bundle\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:45:28.497769 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.497542 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-config\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:45:28.497769 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.497555 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-console-serving-cert\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:45:28.497769 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.497563 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dg4nk\" (UniqueName: \"kubernetes.io/projected/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-kube-api-access-dg4nk\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:45:28.497769 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.497578 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1b0ae9c-d0d2-465b-8b06-9b5710035c19-oauth-serving-cert\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:45:28.769294 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.769267 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b49dcbf9f-npgkc_a1b0ae9c-d0d2-465b-8b06-9b5710035c19/console/0.log" Apr 16 18:45:28.769831 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.769318 2578 generic.go:358] "Generic (PLEG): container finished" podID="a1b0ae9c-d0d2-465b-8b06-9b5710035c19" containerID="26e3d10b9b4e2e14d5150bdf5c4f1e395d985dd24eac51481af2250897ac19e2" exitCode=2 Apr 16 18:45:28.769831 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.769371 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b49dcbf9f-npgkc" event={"ID":"a1b0ae9c-d0d2-465b-8b06-9b5710035c19","Type":"ContainerDied","Data":"26e3d10b9b4e2e14d5150bdf5c4f1e395d985dd24eac51481af2250897ac19e2"} Apr 16 18:45:28.769831 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.769401 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b49dcbf9f-npgkc" Apr 16 18:45:28.769831 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.769419 2578 scope.go:117] "RemoveContainer" containerID="26e3d10b9b4e2e14d5150bdf5c4f1e395d985dd24eac51481af2250897ac19e2" Apr 16 18:45:28.769831 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.769406 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b49dcbf9f-npgkc" event={"ID":"a1b0ae9c-d0d2-465b-8b06-9b5710035c19","Type":"ContainerDied","Data":"b4a2b09a71b38f62cf3f602bdef46d06c16f38d84616207f47e0e50d7a675cb0"} Apr 16 18:45:28.780712 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.780694 2578 scope.go:117] "RemoveContainer" containerID="26e3d10b9b4e2e14d5150bdf5c4f1e395d985dd24eac51481af2250897ac19e2" Apr 16 18:45:28.781049 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:45:28.781022 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26e3d10b9b4e2e14d5150bdf5c4f1e395d985dd24eac51481af2250897ac19e2\": container with ID starting with 26e3d10b9b4e2e14d5150bdf5c4f1e395d985dd24eac51481af2250897ac19e2 not found: ID does not exist" containerID="26e3d10b9b4e2e14d5150bdf5c4f1e395d985dd24eac51481af2250897ac19e2" Apr 16 18:45:28.781132 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.781052 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26e3d10b9b4e2e14d5150bdf5c4f1e395d985dd24eac51481af2250897ac19e2"} err="failed to get container status \"26e3d10b9b4e2e14d5150bdf5c4f1e395d985dd24eac51481af2250897ac19e2\": rpc error: code = NotFound desc = could not find container \"26e3d10b9b4e2e14d5150bdf5c4f1e395d985dd24eac51481af2250897ac19e2\": container with ID starting with 26e3d10b9b4e2e14d5150bdf5c4f1e395d985dd24eac51481af2250897ac19e2 not found: ID does not exist" Apr 16 18:45:28.794081 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.794056 2578 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-7b49dcbf9f-npgkc"] Apr 16 18:45:28.797422 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.797399 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b49dcbf9f-npgkc"] Apr 16 18:45:28.837851 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.837810 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:45:28.838225 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:28.838168 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:45:29.841905 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:29.841868 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1b0ae9c-d0d2-465b-8b06-9b5710035c19" path="/var/lib/kubelet/pods/a1b0ae9c-d0d2-465b-8b06-9b5710035c19/volumes" Apr 16 18:45:32.787093 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:32.787062 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" event={"ID":"9807eb33-454f-4ec2-908c-ff10f75fdfb9","Type":"ContainerStarted","Data":"afea749dac3c14f606f279fcd346a4c2e028fb437ec960c8cd56337d5fdb1bc6"} Apr 16 18:45:32.787510 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:32.787388 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" Apr 16 18:45:32.788794 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:32.788767 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:45:32.803556 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:32.803512 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" podStartSLOduration=5.606319299 podStartE2EDuration="11.803498309s" podCreationTimestamp="2026-04-16 18:45:21 +0000 UTC" firstStartedPulling="2026-04-16 18:45:25.754591884 +0000 UTC m=+902.541928118" lastFinishedPulling="2026-04-16 18:45:31.951770891 +0000 UTC m=+908.739107128" observedRunningTime="2026-04-16 18:45:32.802891366 +0000 UTC m=+909.590227626" watchObservedRunningTime="2026-04-16 18:45:32.803498309 +0000 UTC m=+909.590834564" Apr 16 18:45:33.795758 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:33.795719 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:45:38.838307 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:38.838263 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:45:38.838814 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:38.838635 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:45:43.795767 
ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:43.795725 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:45:48.838092 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:48.838044 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:45:48.838501 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:48.838249 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" Apr 16 18:45:48.838501 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:48.838337 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:45:48.838501 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:48.838439 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" Apr 16 18:45:51.690312 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.690290 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" Apr 16 18:45:51.788522 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.788452 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f16df37c-7739-4d7b-8bf3-16add2ce4fde-kserve-provision-location\") pod \"f16df37c-7739-4d7b-8bf3-16add2ce4fde\" (UID: \"f16df37c-7739-4d7b-8bf3-16add2ce4fde\") " Apr 16 18:45:51.788765 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.788743 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f16df37c-7739-4d7b-8bf3-16add2ce4fde-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f16df37c-7739-4d7b-8bf3-16add2ce4fde" (UID: "f16df37c-7739-4d7b-8bf3-16add2ce4fde"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:45:51.853788 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.853760 2578 generic.go:358] "Generic (PLEG): container finished" podID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerID="d7b39a6d3f6269047d2b4abd4658d3783f3f4aa886f9f77144186df6457bd6ba" exitCode=0 Apr 16 18:45:51.853892 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.853793 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" event={"ID":"f16df37c-7739-4d7b-8bf3-16add2ce4fde","Type":"ContainerDied","Data":"d7b39a6d3f6269047d2b4abd4658d3783f3f4aa886f9f77144186df6457bd6ba"} Apr 16 18:45:51.853892 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.853812 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" event={"ID":"f16df37c-7739-4d7b-8bf3-16add2ce4fde","Type":"ContainerDied","Data":"40944b2150431cb877d34e6b5ebf3da21643009b5e3daffa3ce7b9b68310db83"} Apr 16 18:45:51.853892 ip-10-0-139-33 
kubenswrapper[2578]: I0416 18:45:51.853828 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb" Apr 16 18:45:51.853990 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.853830 2578 scope.go:117] "RemoveContainer" containerID="d7b39a6d3f6269047d2b4abd4658d3783f3f4aa886f9f77144186df6457bd6ba" Apr 16 18:45:51.861659 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.861644 2578 scope.go:117] "RemoveContainer" containerID="9c0f61a3c19066fe921243982b8043c09eb5f1236a6c5830a91aaae1a6f94019" Apr 16 18:45:51.868189 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.868156 2578 scope.go:117] "RemoveContainer" containerID="4235164f03dfe77308d2a79f8b01389e8525a6bc17c2f17928fa32f8f402d245" Apr 16 18:45:51.872140 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.872122 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb"] Apr 16 18:45:51.875172 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.875152 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-6ptpb"] Apr 16 18:45:51.875941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.875906 2578 scope.go:117] "RemoveContainer" containerID="d7b39a6d3f6269047d2b4abd4658d3783f3f4aa886f9f77144186df6457bd6ba" Apr 16 18:45:51.876209 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:45:51.876160 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7b39a6d3f6269047d2b4abd4658d3783f3f4aa886f9f77144186df6457bd6ba\": container with ID starting with d7b39a6d3f6269047d2b4abd4658d3783f3f4aa886f9f77144186df6457bd6ba not found: ID does not exist" containerID="d7b39a6d3f6269047d2b4abd4658d3783f3f4aa886f9f77144186df6457bd6ba" Apr 16 18:45:51.876268 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.876221 2578 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"d7b39a6d3f6269047d2b4abd4658d3783f3f4aa886f9f77144186df6457bd6ba"} err="failed to get container status \"d7b39a6d3f6269047d2b4abd4658d3783f3f4aa886f9f77144186df6457bd6ba\": rpc error: code = NotFound desc = could not find container \"d7b39a6d3f6269047d2b4abd4658d3783f3f4aa886f9f77144186df6457bd6ba\": container with ID starting with d7b39a6d3f6269047d2b4abd4658d3783f3f4aa886f9f77144186df6457bd6ba not found: ID does not exist" Apr 16 18:45:51.876268 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.876245 2578 scope.go:117] "RemoveContainer" containerID="9c0f61a3c19066fe921243982b8043c09eb5f1236a6c5830a91aaae1a6f94019" Apr 16 18:45:51.876490 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:45:51.876473 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c0f61a3c19066fe921243982b8043c09eb5f1236a6c5830a91aaae1a6f94019\": container with ID starting with 9c0f61a3c19066fe921243982b8043c09eb5f1236a6c5830a91aaae1a6f94019 not found: ID does not exist" containerID="9c0f61a3c19066fe921243982b8043c09eb5f1236a6c5830a91aaae1a6f94019" Apr 16 18:45:51.876553 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.876495 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c0f61a3c19066fe921243982b8043c09eb5f1236a6c5830a91aaae1a6f94019"} err="failed to get container status \"9c0f61a3c19066fe921243982b8043c09eb5f1236a6c5830a91aaae1a6f94019\": rpc error: code = NotFound desc = could not find container \"9c0f61a3c19066fe921243982b8043c09eb5f1236a6c5830a91aaae1a6f94019\": container with ID starting with 9c0f61a3c19066fe921243982b8043c09eb5f1236a6c5830a91aaae1a6f94019 not found: ID does not exist" Apr 16 18:45:51.876553 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.876510 2578 scope.go:117] "RemoveContainer" containerID="4235164f03dfe77308d2a79f8b01389e8525a6bc17c2f17928fa32f8f402d245" Apr 16 18:45:51.876736 ip-10-0-139-33 
kubenswrapper[2578]: E0416 18:45:51.876718 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4235164f03dfe77308d2a79f8b01389e8525a6bc17c2f17928fa32f8f402d245\": container with ID starting with 4235164f03dfe77308d2a79f8b01389e8525a6bc17c2f17928fa32f8f402d245 not found: ID does not exist" containerID="4235164f03dfe77308d2a79f8b01389e8525a6bc17c2f17928fa32f8f402d245" Apr 16 18:45:51.876795 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.876746 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4235164f03dfe77308d2a79f8b01389e8525a6bc17c2f17928fa32f8f402d245"} err="failed to get container status \"4235164f03dfe77308d2a79f8b01389e8525a6bc17c2f17928fa32f8f402d245\": rpc error: code = NotFound desc = could not find container \"4235164f03dfe77308d2a79f8b01389e8525a6bc17c2f17928fa32f8f402d245\": container with ID starting with 4235164f03dfe77308d2a79f8b01389e8525a6bc17c2f17928fa32f8f402d245 not found: ID does not exist" Apr 16 18:45:51.889310 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:51.889290 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f16df37c-7739-4d7b-8bf3-16add2ce4fde-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:45:53.796129 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:53.796084 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:45:53.841172 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:45:53.841139 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" 
path="/var/lib/kubelet/pods/f16df37c-7739-4d7b-8bf3-16add2ce4fde/volumes" Apr 16 18:46:03.796560 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:46:03.796517 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:46:13.796068 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:46:13.796024 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:46:23.796219 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:46:23.796167 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:46:33.796446 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:46:33.796399 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:46:34.837301 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:46:34.837255 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:46:44.837458 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:46:44.837361 2578 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:46:54.839025 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:46:54.838992 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" Apr 16 18:47:01.690077 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.690042 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4"] Apr 16 18:47:01.690467 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.690309 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerName="kserve-container" containerID="cri-o://afea749dac3c14f606f279fcd346a4c2e028fb437ec960c8cd56337d5fdb1bc6" gracePeriod=30 Apr 16 18:47:01.754475 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.754437 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd"] Apr 16 18:47:01.754824 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.754812 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1b0ae9c-d0d2-465b-8b06-9b5710035c19" containerName="console" Apr 16 18:47:01.754865 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.754825 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b0ae9c-d0d2-465b-8b06-9b5710035c19" containerName="console" Apr 16 18:47:01.754865 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.754836 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="kserve-container" Apr 16 
18:47:01.754865 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.754841 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="kserve-container" Apr 16 18:47:01.754865 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.754850 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="storage-initializer" Apr 16 18:47:01.754865 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.754855 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="storage-initializer" Apr 16 18:47:01.754865 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.754861 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="agent" Apr 16 18:47:01.754865 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.754866 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="agent" Apr 16 18:47:01.755084 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.754911 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="agent" Apr 16 18:47:01.755084 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.754919 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1b0ae9c-d0d2-465b-8b06-9b5710035c19" containerName="console" Apr 16 18:47:01.755084 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.754927 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f16df37c-7739-4d7b-8bf3-16add2ce4fde" containerName="kserve-container" Apr 16 18:47:01.757956 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.757938 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" Apr 16 18:47:01.766351 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.766323 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd"] Apr 16 18:47:01.846521 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.846485 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5997b834-6cac-49a5-a59e-acea81a43da9-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd\" (UID: \"5997b834-6cac-49a5-a59e-acea81a43da9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" Apr 16 18:47:01.947060 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.946972 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5997b834-6cac-49a5-a59e-acea81a43da9-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd\" (UID: \"5997b834-6cac-49a5-a59e-acea81a43da9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" Apr 16 18:47:01.947379 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:01.947359 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5997b834-6cac-49a5-a59e-acea81a43da9-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd\" (UID: \"5997b834-6cac-49a5-a59e-acea81a43da9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" Apr 16 18:47:02.069296 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:02.069261 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" Apr 16 18:47:02.192192 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:02.192148 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd"] Apr 16 18:47:02.195137 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:47:02.195109 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5997b834_6cac_49a5_a59e_acea81a43da9.slice/crio-46fc7329ee0511a36b5b5319488511fd3d88da6ce824d617b9862c278f4c6b91 WatchSource:0}: Error finding container 46fc7329ee0511a36b5b5319488511fd3d88da6ce824d617b9862c278f4c6b91: Status 404 returned error can't find the container with id 46fc7329ee0511a36b5b5319488511fd3d88da6ce824d617b9862c278f4c6b91 Apr 16 18:47:03.094949 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:03.094916 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" event={"ID":"5997b834-6cac-49a5-a59e-acea81a43da9","Type":"ContainerStarted","Data":"e42f65e6b56a921024b8698af754599a7f902fc8a52ff7b41916ad0b46873a07"} Apr 16 18:47:03.095390 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:03.094955 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" event={"ID":"5997b834-6cac-49a5-a59e-acea81a43da9","Type":"ContainerStarted","Data":"46fc7329ee0511a36b5b5319488511fd3d88da6ce824d617b9862c278f4c6b91"} Apr 16 18:47:04.837537 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:04.837475 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:47:06.106024 ip-10-0-139-33 
kubenswrapper[2578]: I0416 18:47:06.105993 2578 generic.go:358] "Generic (PLEG): container finished" podID="5997b834-6cac-49a5-a59e-acea81a43da9" containerID="e42f65e6b56a921024b8698af754599a7f902fc8a52ff7b41916ad0b46873a07" exitCode=0 Apr 16 18:47:06.106389 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:06.106077 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" event={"ID":"5997b834-6cac-49a5-a59e-acea81a43da9","Type":"ContainerDied","Data":"e42f65e6b56a921024b8698af754599a7f902fc8a52ff7b41916ad0b46873a07"} Apr 16 18:47:06.340583 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:06.340559 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" Apr 16 18:47:06.388793 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:06.388758 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9807eb33-454f-4ec2-908c-ff10f75fdfb9-kserve-provision-location\") pod \"9807eb33-454f-4ec2-908c-ff10f75fdfb9\" (UID: \"9807eb33-454f-4ec2-908c-ff10f75fdfb9\") " Apr 16 18:47:06.389108 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:06.389079 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9807eb33-454f-4ec2-908c-ff10f75fdfb9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9807eb33-454f-4ec2-908c-ff10f75fdfb9" (UID: "9807eb33-454f-4ec2-908c-ff10f75fdfb9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:47:06.489330 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:06.489296 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9807eb33-454f-4ec2-908c-ff10f75fdfb9-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:47:07.110735 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:07.110703 2578 generic.go:358] "Generic (PLEG): container finished" podID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerID="afea749dac3c14f606f279fcd346a4c2e028fb437ec960c8cd56337d5fdb1bc6" exitCode=0 Apr 16 18:47:07.111154 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:07.110774 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" event={"ID":"9807eb33-454f-4ec2-908c-ff10f75fdfb9","Type":"ContainerDied","Data":"afea749dac3c14f606f279fcd346a4c2e028fb437ec960c8cd56337d5fdb1bc6"} Apr 16 18:47:07.111154 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:07.110779 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" Apr 16 18:47:07.111154 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:07.110812 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4" event={"ID":"9807eb33-454f-4ec2-908c-ff10f75fdfb9","Type":"ContainerDied","Data":"2b1ba95909a9667c4e4d4f550cb39854281505284dfdaa42e1fce445791c9521"} Apr 16 18:47:07.111154 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:07.110832 2578 scope.go:117] "RemoveContainer" containerID="afea749dac3c14f606f279fcd346a4c2e028fb437ec960c8cd56337d5fdb1bc6" Apr 16 18:47:07.112628 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:07.112604 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" event={"ID":"5997b834-6cac-49a5-a59e-acea81a43da9","Type":"ContainerStarted","Data":"1fec60ccefdc5f388d2688b31d140a52fd8773eca007b9fc35b841e4b63b7c2b"} Apr 16 18:47:07.112876 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:07.112853 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" Apr 16 18:47:07.114315 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:07.114291 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" podUID="5997b834-6cac-49a5-a59e-acea81a43da9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:47:07.119120 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:07.119100 2578 scope.go:117] "RemoveContainer" containerID="5f4b6a49f592266c65279068662b6fbef3ddca3a7776b5141d6193e9ccf26bd2" Apr 16 18:47:07.125920 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:07.125906 2578 scope.go:117] "RemoveContainer" 
containerID="afea749dac3c14f606f279fcd346a4c2e028fb437ec960c8cd56337d5fdb1bc6" Apr 16 18:47:07.126158 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:47:07.126139 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afea749dac3c14f606f279fcd346a4c2e028fb437ec960c8cd56337d5fdb1bc6\": container with ID starting with afea749dac3c14f606f279fcd346a4c2e028fb437ec960c8cd56337d5fdb1bc6 not found: ID does not exist" containerID="afea749dac3c14f606f279fcd346a4c2e028fb437ec960c8cd56337d5fdb1bc6" Apr 16 18:47:07.126358 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:07.126169 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afea749dac3c14f606f279fcd346a4c2e028fb437ec960c8cd56337d5fdb1bc6"} err="failed to get container status \"afea749dac3c14f606f279fcd346a4c2e028fb437ec960c8cd56337d5fdb1bc6\": rpc error: code = NotFound desc = could not find container \"afea749dac3c14f606f279fcd346a4c2e028fb437ec960c8cd56337d5fdb1bc6\": container with ID starting with afea749dac3c14f606f279fcd346a4c2e028fb437ec960c8cd56337d5fdb1bc6 not found: ID does not exist" Apr 16 18:47:07.126358 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:07.126210 2578 scope.go:117] "RemoveContainer" containerID="5f4b6a49f592266c65279068662b6fbef3ddca3a7776b5141d6193e9ccf26bd2" Apr 16 18:47:07.126443 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:47:07.126424 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f4b6a49f592266c65279068662b6fbef3ddca3a7776b5141d6193e9ccf26bd2\": container with ID starting with 5f4b6a49f592266c65279068662b6fbef3ddca3a7776b5141d6193e9ccf26bd2 not found: ID does not exist" containerID="5f4b6a49f592266c65279068662b6fbef3ddca3a7776b5141d6193e9ccf26bd2" Apr 16 18:47:07.126482 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:07.126447 2578 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"5f4b6a49f592266c65279068662b6fbef3ddca3a7776b5141d6193e9ccf26bd2"} err="failed to get container status \"5f4b6a49f592266c65279068662b6fbef3ddca3a7776b5141d6193e9ccf26bd2\": rpc error: code = NotFound desc = could not find container \"5f4b6a49f592266c65279068662b6fbef3ddca3a7776b5141d6193e9ccf26bd2\": container with ID starting with 5f4b6a49f592266c65279068662b6fbef3ddca3a7776b5141d6193e9ccf26bd2 not found: ID does not exist" Apr 16 18:47:07.131773 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:07.131735 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" podStartSLOduration=6.131724289 podStartE2EDuration="6.131724289s" podCreationTimestamp="2026-04-16 18:47:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:47:07.130472258 +0000 UTC m=+1003.917808516" watchObservedRunningTime="2026-04-16 18:47:07.131724289 +0000 UTC m=+1003.919060543" Apr 16 18:47:07.142450 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:07.142429 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4"] Apr 16 18:47:07.146226 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:07.146206 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-fg9p4"] Apr 16 18:47:07.841854 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:07.841811 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" path="/var/lib/kubelet/pods/9807eb33-454f-4ec2-908c-ff10f75fdfb9/volumes" Apr 16 18:47:08.117777 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:08.117689 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" 
podUID="5997b834-6cac-49a5-a59e-acea81a43da9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:47:18.118031 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:18.117987 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" podUID="5997b834-6cac-49a5-a59e-acea81a43da9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:47:28.118267 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:28.118226 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" podUID="5997b834-6cac-49a5-a59e-acea81a43da9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:47:38.117849 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:38.117808 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" podUID="5997b834-6cac-49a5-a59e-acea81a43da9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:47:48.117886 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:48.117840 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" podUID="5997b834-6cac-49a5-a59e-acea81a43da9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:47:58.117804 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:47:58.117762 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" podUID="5997b834-6cac-49a5-a59e-acea81a43da9" containerName="kserve-container" probeResult="failure" output="dial 
tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:48:08.117727 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:08.117682 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" podUID="5997b834-6cac-49a5-a59e-acea81a43da9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:48:15.837595 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:15.837503 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" podUID="5997b834-6cac-49a5-a59e-acea81a43da9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:48:25.841621 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:25.841591 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" Apr 16 18:48:32.138306 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:32.138265 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd"] Apr 16 18:48:32.138763 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:32.138663 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" podUID="5997b834-6cac-49a5-a59e-acea81a43da9" containerName="kserve-container" containerID="cri-o://1fec60ccefdc5f388d2688b31d140a52fd8773eca007b9fc35b841e4b63b7c2b" gracePeriod=30 Apr 16 18:48:32.214619 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:32.214582 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk"] Apr 16 18:48:32.214995 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:32.214980 2578 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerName="kserve-container" Apr 16 18:48:32.215052 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:32.214997 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerName="kserve-container" Apr 16 18:48:32.215052 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:32.215025 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerName="storage-initializer" Apr 16 18:48:32.215052 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:32.215032 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerName="storage-initializer" Apr 16 18:48:32.215145 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:32.215089 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9807eb33-454f-4ec2-908c-ff10f75fdfb9" containerName="kserve-container" Apr 16 18:48:32.218099 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:32.218082 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk" Apr 16 18:48:32.226099 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:32.225801 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk"] Apr 16 18:48:32.407672 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:32.407577 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ebf21e7-25bb-4dde-b9cb-a686136d91d8-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk\" (UID: \"0ebf21e7-25bb-4dde-b9cb-a686136d91d8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk" Apr 16 18:48:32.508945 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:32.508904 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ebf21e7-25bb-4dde-b9cb-a686136d91d8-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk\" (UID: \"0ebf21e7-25bb-4dde-b9cb-a686136d91d8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk" Apr 16 18:48:32.509324 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:32.509301 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ebf21e7-25bb-4dde-b9cb-a686136d91d8-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk\" (UID: \"0ebf21e7-25bb-4dde-b9cb-a686136d91d8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk" Apr 16 18:48:32.529960 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:32.529935 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk" Apr 16 18:48:32.654674 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:32.654642 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk"] Apr 16 18:48:32.656597 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:48:32.656570 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ebf21e7_25bb_4dde_b9cb_a686136d91d8.slice/crio-01bbe11b07d29c24e7af142c15cc34bbc2057d7cd7d95af3b5505e7111d7407b WatchSource:0}: Error finding container 01bbe11b07d29c24e7af142c15cc34bbc2057d7cd7d95af3b5505e7111d7407b: Status 404 returned error can't find the container with id 01bbe11b07d29c24e7af142c15cc34bbc2057d7cd7d95af3b5505e7111d7407b Apr 16 18:48:32.658623 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:32.658606 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:48:33.401226 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:33.401193 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk" event={"ID":"0ebf21e7-25bb-4dde-b9cb-a686136d91d8","Type":"ContainerStarted","Data":"9525b65d76eb9b0c84ca6b927e592f2923b787e3915e3b55737ae58899ac8288"} Apr 16 18:48:33.401226 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:33.401229 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk" event={"ID":"0ebf21e7-25bb-4dde-b9cb-a686136d91d8","Type":"ContainerStarted","Data":"01bbe11b07d29c24e7af142c15cc34bbc2057d7cd7d95af3b5505e7111d7407b"} Apr 16 18:48:35.838166 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:35.838122 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" 
podUID="5997b834-6cac-49a5-a59e-acea81a43da9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:48:36.785979 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:36.785956 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" Apr 16 18:48:36.946432 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:36.946340 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5997b834-6cac-49a5-a59e-acea81a43da9-kserve-provision-location\") pod \"5997b834-6cac-49a5-a59e-acea81a43da9\" (UID: \"5997b834-6cac-49a5-a59e-acea81a43da9\") " Apr 16 18:48:36.946782 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:36.946614 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5997b834-6cac-49a5-a59e-acea81a43da9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5997b834-6cac-49a5-a59e-acea81a43da9" (UID: "5997b834-6cac-49a5-a59e-acea81a43da9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:48:37.047275 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:37.047243 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5997b834-6cac-49a5-a59e-acea81a43da9-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:48:37.414969 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:37.414938 2578 generic.go:358] "Generic (PLEG): container finished" podID="0ebf21e7-25bb-4dde-b9cb-a686136d91d8" containerID="9525b65d76eb9b0c84ca6b927e592f2923b787e3915e3b55737ae58899ac8288" exitCode=0 Apr 16 18:48:37.415132 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:37.415009 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk" event={"ID":"0ebf21e7-25bb-4dde-b9cb-a686136d91d8","Type":"ContainerDied","Data":"9525b65d76eb9b0c84ca6b927e592f2923b787e3915e3b55737ae58899ac8288"} Apr 16 18:48:37.416397 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:37.416341 2578 generic.go:358] "Generic (PLEG): container finished" podID="5997b834-6cac-49a5-a59e-acea81a43da9" containerID="1fec60ccefdc5f388d2688b31d140a52fd8773eca007b9fc35b841e4b63b7c2b" exitCode=0 Apr 16 18:48:37.416397 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:37.416374 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" event={"ID":"5997b834-6cac-49a5-a59e-acea81a43da9","Type":"ContainerDied","Data":"1fec60ccefdc5f388d2688b31d140a52fd8773eca007b9fc35b841e4b63b7c2b"} Apr 16 18:48:37.416397 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:37.416391 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" 
event={"ID":"5997b834-6cac-49a5-a59e-acea81a43da9","Type":"ContainerDied","Data":"46fc7329ee0511a36b5b5319488511fd3d88da6ce824d617b9862c278f4c6b91"} Apr 16 18:48:37.416540 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:37.416405 2578 scope.go:117] "RemoveContainer" containerID="1fec60ccefdc5f388d2688b31d140a52fd8773eca007b9fc35b841e4b63b7c2b" Apr 16 18:48:37.416540 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:37.416409 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd" Apr 16 18:48:37.424341 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:37.424323 2578 scope.go:117] "RemoveContainer" containerID="e42f65e6b56a921024b8698af754599a7f902fc8a52ff7b41916ad0b46873a07" Apr 16 18:48:37.432058 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:37.432042 2578 scope.go:117] "RemoveContainer" containerID="1fec60ccefdc5f388d2688b31d140a52fd8773eca007b9fc35b841e4b63b7c2b" Apr 16 18:48:37.432417 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:48:37.432388 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fec60ccefdc5f388d2688b31d140a52fd8773eca007b9fc35b841e4b63b7c2b\": container with ID starting with 1fec60ccefdc5f388d2688b31d140a52fd8773eca007b9fc35b841e4b63b7c2b not found: ID does not exist" containerID="1fec60ccefdc5f388d2688b31d140a52fd8773eca007b9fc35b841e4b63b7c2b" Apr 16 18:48:37.432497 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:37.432425 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fec60ccefdc5f388d2688b31d140a52fd8773eca007b9fc35b841e4b63b7c2b"} err="failed to get container status \"1fec60ccefdc5f388d2688b31d140a52fd8773eca007b9fc35b841e4b63b7c2b\": rpc error: code = NotFound desc = could not find container \"1fec60ccefdc5f388d2688b31d140a52fd8773eca007b9fc35b841e4b63b7c2b\": container with ID starting with 
1fec60ccefdc5f388d2688b31d140a52fd8773eca007b9fc35b841e4b63b7c2b not found: ID does not exist" Apr 16 18:48:37.432497 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:37.432447 2578 scope.go:117] "RemoveContainer" containerID="e42f65e6b56a921024b8698af754599a7f902fc8a52ff7b41916ad0b46873a07" Apr 16 18:48:37.432798 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:48:37.432772 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e42f65e6b56a921024b8698af754599a7f902fc8a52ff7b41916ad0b46873a07\": container with ID starting with e42f65e6b56a921024b8698af754599a7f902fc8a52ff7b41916ad0b46873a07 not found: ID does not exist" containerID="e42f65e6b56a921024b8698af754599a7f902fc8a52ff7b41916ad0b46873a07" Apr 16 18:48:37.432861 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:37.432805 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e42f65e6b56a921024b8698af754599a7f902fc8a52ff7b41916ad0b46873a07"} err="failed to get container status \"e42f65e6b56a921024b8698af754599a7f902fc8a52ff7b41916ad0b46873a07\": rpc error: code = NotFound desc = could not find container \"e42f65e6b56a921024b8698af754599a7f902fc8a52ff7b41916ad0b46873a07\": container with ID starting with e42f65e6b56a921024b8698af754599a7f902fc8a52ff7b41916ad0b46873a07 not found: ID does not exist" Apr 16 18:48:37.442378 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:37.442356 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd"] Apr 16 18:48:37.443957 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:37.443939 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-sx9jd"] Apr 16 18:48:37.841675 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:48:37.841645 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5997b834-6cac-49a5-a59e-acea81a43da9" path="/var/lib/kubelet/pods/5997b834-6cac-49a5-a59e-acea81a43da9/volumes"
Apr 16 18:50:47.450751 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:50:47.450726 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log"
Apr 16 18:50:47.451258 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:50:47.450774 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log"
Apr 16 18:50:48.909781 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:50:48.909750 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk" event={"ID":"0ebf21e7-25bb-4dde-b9cb-a686136d91d8","Type":"ContainerStarted","Data":"66c7a5998a9e6afe8ca3a78b3d973d84f929ec967005e0f5a1afac081ee265ad"}
Apr 16 18:50:48.910167 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:50:48.909814 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk"
Apr 16 18:50:48.932341 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:50:48.932264 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk" podStartSLOduration=6.400543131 podStartE2EDuration="2m16.932248982s" podCreationTimestamp="2026-04-16 18:48:32 +0000 UTC" firstStartedPulling="2026-04-16 18:48:37.416041942 +0000 UTC m=+1094.203378175" lastFinishedPulling="2026-04-16 18:50:47.947747778 +0000 UTC m=+1224.735084026" observedRunningTime="2026-04-16 18:50:48.932142604 +0000 UTC m=+1225.719478862" watchObservedRunningTime="2026-04-16 18:50:48.932248982 +0000 UTC m=+1225.719585236"
Apr 16 18:51:19.918352 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:19.918273 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk"
Apr 16 18:51:22.399510 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:22.399474 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk"]
Apr 16 18:51:22.399951 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:22.399781 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk" podUID="0ebf21e7-25bb-4dde-b9cb-a686136d91d8" containerName="kserve-container" containerID="cri-o://66c7a5998a9e6afe8ca3a78b3d973d84f929ec967005e0f5a1afac081ee265ad" gracePeriod=30
Apr 16 18:51:22.492047 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:22.492011 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w"]
Apr 16 18:51:22.492420 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:22.492407 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5997b834-6cac-49a5-a59e-acea81a43da9" containerName="storage-initializer"
Apr 16 18:51:22.492469 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:22.492422 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5997b834-6cac-49a5-a59e-acea81a43da9" containerName="storage-initializer"
Apr 16 18:51:22.492469 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:22.492439 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5997b834-6cac-49a5-a59e-acea81a43da9" containerName="kserve-container"
Apr 16 18:51:22.492469 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:22.492444 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5997b834-6cac-49a5-a59e-acea81a43da9" containerName="kserve-container"
Apr 16 18:51:22.492624 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:22.492504 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5997b834-6cac-49a5-a59e-acea81a43da9" containerName="kserve-container"
Apr 16 18:51:22.519702 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:22.519661 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w"]
Apr 16 18:51:22.519859 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:22.519790 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w"
Apr 16 18:51:22.613581 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:22.613551 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a75c4191-bd0f-44d2-abc3-8ccc2a4113c8-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w\" (UID: \"a75c4191-bd0f-44d2-abc3-8ccc2a4113c8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w"
Apr 16 18:51:22.715038 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:22.714941 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a75c4191-bd0f-44d2-abc3-8ccc2a4113c8-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w\" (UID: \"a75c4191-bd0f-44d2-abc3-8ccc2a4113c8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w"
Apr 16 18:51:22.715365 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:22.715343 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a75c4191-bd0f-44d2-abc3-8ccc2a4113c8-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w\" (UID: \"a75c4191-bd0f-44d2-abc3-8ccc2a4113c8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w"
Apr 16 18:51:22.830804 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:22.830768 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w"
Apr 16 18:51:22.952593 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:22.952550 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w"]
Apr 16 18:51:22.955157 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:51:22.955123 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda75c4191_bd0f_44d2_abc3_8ccc2a4113c8.slice/crio-d2853ef0e86daf4634c5a4bacf414b9375b89abc4971e53a2c0ef382d3e6f933 WatchSource:0}: Error finding container d2853ef0e86daf4634c5a4bacf414b9375b89abc4971e53a2c0ef382d3e6f933: Status 404 returned error can't find the container with id d2853ef0e86daf4634c5a4bacf414b9375b89abc4971e53a2c0ef382d3e6f933
Apr 16 18:51:23.016736 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:23.016700 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w" event={"ID":"a75c4191-bd0f-44d2-abc3-8ccc2a4113c8","Type":"ContainerStarted","Data":"d2853ef0e86daf4634c5a4bacf414b9375b89abc4971e53a2c0ef382d3e6f933"}
Apr 16 18:51:23.566935 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:23.566908 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk"
Apr 16 18:51:23.622455 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:23.622368 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ebf21e7-25bb-4dde-b9cb-a686136d91d8-kserve-provision-location\") pod \"0ebf21e7-25bb-4dde-b9cb-a686136d91d8\" (UID: \"0ebf21e7-25bb-4dde-b9cb-a686136d91d8\") "
Apr 16 18:51:23.622808 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:23.622777 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ebf21e7-25bb-4dde-b9cb-a686136d91d8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0ebf21e7-25bb-4dde-b9cb-a686136d91d8" (UID: "0ebf21e7-25bb-4dde-b9cb-a686136d91d8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:51:23.723022 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:23.722990 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ebf21e7-25bb-4dde-b9cb-a686136d91d8-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 18:51:24.021785 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:24.021744 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w" event={"ID":"a75c4191-bd0f-44d2-abc3-8ccc2a4113c8","Type":"ContainerStarted","Data":"503e8088120a46986f2c09bb2331365bc948fe3179b9db0fe8fdbcf9a805274f"}
Apr 16 18:51:24.023075 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:24.023051 2578 generic.go:358] "Generic (PLEG): container finished" podID="0ebf21e7-25bb-4dde-b9cb-a686136d91d8" containerID="66c7a5998a9e6afe8ca3a78b3d973d84f929ec967005e0f5a1afac081ee265ad" exitCode=0
Apr 16 18:51:24.023165 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:24.023095 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk" event={"ID":"0ebf21e7-25bb-4dde-b9cb-a686136d91d8","Type":"ContainerDied","Data":"66c7a5998a9e6afe8ca3a78b3d973d84f929ec967005e0f5a1afac081ee265ad"}
Apr 16 18:51:24.023165 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:24.023107 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk"
Apr 16 18:51:24.023165 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:24.023125 2578 scope.go:117] "RemoveContainer" containerID="66c7a5998a9e6afe8ca3a78b3d973d84f929ec967005e0f5a1afac081ee265ad"
Apr 16 18:51:24.023294 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:24.023113 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk" event={"ID":"0ebf21e7-25bb-4dde-b9cb-a686136d91d8","Type":"ContainerDied","Data":"01bbe11b07d29c24e7af142c15cc34bbc2057d7cd7d95af3b5505e7111d7407b"}
Apr 16 18:51:24.030916 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:24.030900 2578 scope.go:117] "RemoveContainer" containerID="9525b65d76eb9b0c84ca6b927e592f2923b787e3915e3b55737ae58899ac8288"
Apr 16 18:51:24.038028 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:24.038007 2578 scope.go:117] "RemoveContainer" containerID="66c7a5998a9e6afe8ca3a78b3d973d84f929ec967005e0f5a1afac081ee265ad"
Apr 16 18:51:24.038330 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:51:24.038304 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66c7a5998a9e6afe8ca3a78b3d973d84f929ec967005e0f5a1afac081ee265ad\": container with ID starting with 66c7a5998a9e6afe8ca3a78b3d973d84f929ec967005e0f5a1afac081ee265ad not found: ID does not exist" containerID="66c7a5998a9e6afe8ca3a78b3d973d84f929ec967005e0f5a1afac081ee265ad"
Apr 16 18:51:24.038431 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:24.038341 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66c7a5998a9e6afe8ca3a78b3d973d84f929ec967005e0f5a1afac081ee265ad"} err="failed to get container status \"66c7a5998a9e6afe8ca3a78b3d973d84f929ec967005e0f5a1afac081ee265ad\": rpc error: code = NotFound desc = could not find container \"66c7a5998a9e6afe8ca3a78b3d973d84f929ec967005e0f5a1afac081ee265ad\": container with ID starting with 66c7a5998a9e6afe8ca3a78b3d973d84f929ec967005e0f5a1afac081ee265ad not found: ID does not exist"
Apr 16 18:51:24.038431 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:24.038362 2578 scope.go:117] "RemoveContainer" containerID="9525b65d76eb9b0c84ca6b927e592f2923b787e3915e3b55737ae58899ac8288"
Apr 16 18:51:24.038696 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:51:24.038656 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9525b65d76eb9b0c84ca6b927e592f2923b787e3915e3b55737ae58899ac8288\": container with ID starting with 9525b65d76eb9b0c84ca6b927e592f2923b787e3915e3b55737ae58899ac8288 not found: ID does not exist" containerID="9525b65d76eb9b0c84ca6b927e592f2923b787e3915e3b55737ae58899ac8288"
Apr 16 18:51:24.038835 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:24.038708 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9525b65d76eb9b0c84ca6b927e592f2923b787e3915e3b55737ae58899ac8288"} err="failed to get container status \"9525b65d76eb9b0c84ca6b927e592f2923b787e3915e3b55737ae58899ac8288\": rpc error: code = NotFound desc = could not find container \"9525b65d76eb9b0c84ca6b927e592f2923b787e3915e3b55737ae58899ac8288\": container with ID starting with 9525b65d76eb9b0c84ca6b927e592f2923b787e3915e3b55737ae58899ac8288 not found: ID does not exist"
Apr 16 18:51:24.051878 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:24.051832 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk"]
Apr 16 18:51:24.053900 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:24.053875 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-qh7fk"]
Apr 16 18:51:25.841589 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:25.841548 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ebf21e7-25bb-4dde-b9cb-a686136d91d8" path="/var/lib/kubelet/pods/0ebf21e7-25bb-4dde-b9cb-a686136d91d8/volumes"
Apr 16 18:51:27.037449 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:27.037393 2578 generic.go:358] "Generic (PLEG): container finished" podID="a75c4191-bd0f-44d2-abc3-8ccc2a4113c8" containerID="503e8088120a46986f2c09bb2331365bc948fe3179b9db0fe8fdbcf9a805274f" exitCode=0
Apr 16 18:51:27.037817 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:27.037466 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w" event={"ID":"a75c4191-bd0f-44d2-abc3-8ccc2a4113c8","Type":"ContainerDied","Data":"503e8088120a46986f2c09bb2331365bc948fe3179b9db0fe8fdbcf9a805274f"}
Apr 16 18:51:28.042782 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:28.042746 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w" event={"ID":"a75c4191-bd0f-44d2-abc3-8ccc2a4113c8","Type":"ContainerStarted","Data":"945d939cc4b23acf7368d412608a8f4ac266aa69ace921f275229fb93d753b6b"}
Apr 16 18:51:28.043275 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:28.043104 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w"
Apr 16 18:51:28.044621 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:28.044598 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w" podUID="a75c4191-bd0f-44d2-abc3-8ccc2a4113c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 16 18:51:28.059606 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:28.059555 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w" podStartSLOduration=6.059537167 podStartE2EDuration="6.059537167s" podCreationTimestamp="2026-04-16 18:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:51:28.059129965 +0000 UTC m=+1264.846466221" watchObservedRunningTime="2026-04-16 18:51:28.059537167 +0000 UTC m=+1264.846873430"
Apr 16 18:51:29.047392 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:29.047352 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w" podUID="a75c4191-bd0f-44d2-abc3-8ccc2a4113c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 16 18:51:39.049368 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:39.049336 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w"
Apr 16 18:51:42.543389 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:42.543355 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w"]
Apr 16 18:51:42.543731 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:42.543607 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w" podUID="a75c4191-bd0f-44d2-abc3-8ccc2a4113c8" containerName="kserve-container" containerID="cri-o://945d939cc4b23acf7368d412608a8f4ac266aa69ace921f275229fb93d753b6b" gracePeriod=30
Apr 16 18:51:42.604388 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:42.604351 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7"]
Apr 16 18:51:42.604723 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:42.604711 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ebf21e7-25bb-4dde-b9cb-a686136d91d8" containerName="storage-initializer"
Apr 16 18:51:42.604767 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:42.604725 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebf21e7-25bb-4dde-b9cb-a686136d91d8" containerName="storage-initializer"
Apr 16 18:51:42.604767 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:42.604751 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ebf21e7-25bb-4dde-b9cb-a686136d91d8" containerName="kserve-container"
Apr 16 18:51:42.604767 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:42.604757 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebf21e7-25bb-4dde-b9cb-a686136d91d8" containerName="kserve-container"
Apr 16 18:51:42.604856 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:42.604819 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ebf21e7-25bb-4dde-b9cb-a686136d91d8" containerName="kserve-container"
Apr 16 18:51:42.608337 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:42.608317 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7"
Apr 16 18:51:42.614383 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:42.614354 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7"]
Apr 16 18:51:42.689608 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:42.689577 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31399e59-2746-4638-8d18-6e148545825e-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7\" (UID: \"31399e59-2746-4638-8d18-6e148545825e\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7"
Apr 16 18:51:42.790434 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:42.790391 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31399e59-2746-4638-8d18-6e148545825e-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7\" (UID: \"31399e59-2746-4638-8d18-6e148545825e\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7"
Apr 16 18:51:42.790797 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:42.790774 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31399e59-2746-4638-8d18-6e148545825e-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7\" (UID: \"31399e59-2746-4638-8d18-6e148545825e\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7"
Apr 16 18:51:42.920060 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:42.919970 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7"
Apr 16 18:51:43.047292 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:43.047201 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7"]
Apr 16 18:51:43.094715 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:43.094682 2578 generic.go:358] "Generic (PLEG): container finished" podID="a75c4191-bd0f-44d2-abc3-8ccc2a4113c8" containerID="945d939cc4b23acf7368d412608a8f4ac266aa69ace921f275229fb93d753b6b" exitCode=0
Apr 16 18:51:43.094868 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:43.094774 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w" event={"ID":"a75c4191-bd0f-44d2-abc3-8ccc2a4113c8","Type":"ContainerDied","Data":"945d939cc4b23acf7368d412608a8f4ac266aa69ace921f275229fb93d753b6b"}
Apr 16 18:51:43.112551 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:51:43.112515 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31399e59_2746_4638_8d18_6e148545825e.slice/crio-5d027145b3975125997756a4700644fb7d4fd9b4304cdb9b2c43b8b79b53ebc3 WatchSource:0}: Error finding container 5d027145b3975125997756a4700644fb7d4fd9b4304cdb9b2c43b8b79b53ebc3: Status 404 returned error can't find the container with id 5d027145b3975125997756a4700644fb7d4fd9b4304cdb9b2c43b8b79b53ebc3
Apr 16 18:51:43.201571 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:43.201546 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w"
Apr 16 18:51:43.295044 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:43.294997 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a75c4191-bd0f-44d2-abc3-8ccc2a4113c8-kserve-provision-location\") pod \"a75c4191-bd0f-44d2-abc3-8ccc2a4113c8\" (UID: \"a75c4191-bd0f-44d2-abc3-8ccc2a4113c8\") "
Apr 16 18:51:43.295365 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:43.295336 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a75c4191-bd0f-44d2-abc3-8ccc2a4113c8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a75c4191-bd0f-44d2-abc3-8ccc2a4113c8" (UID: "a75c4191-bd0f-44d2-abc3-8ccc2a4113c8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:51:43.395887 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:43.395842 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a75c4191-bd0f-44d2-abc3-8ccc2a4113c8-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 18:51:44.099513 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:44.099432 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w" event={"ID":"a75c4191-bd0f-44d2-abc3-8ccc2a4113c8","Type":"ContainerDied","Data":"d2853ef0e86daf4634c5a4bacf414b9375b89abc4971e53a2c0ef382d3e6f933"}
Apr 16 18:51:44.099513 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:44.099459 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w"
Apr 16 18:51:44.099513 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:44.099477 2578 scope.go:117] "RemoveContainer" containerID="945d939cc4b23acf7368d412608a8f4ac266aa69ace921f275229fb93d753b6b"
Apr 16 18:51:44.100840 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:44.100810 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7" event={"ID":"31399e59-2746-4638-8d18-6e148545825e","Type":"ContainerStarted","Data":"219d4d235f862f87cdb6b6fdb19efcfff294b9c6f713d5eca939a675fcff147a"}
Apr 16 18:51:44.100959 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:44.100846 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7" event={"ID":"31399e59-2746-4638-8d18-6e148545825e","Type":"ContainerStarted","Data":"5d027145b3975125997756a4700644fb7d4fd9b4304cdb9b2c43b8b79b53ebc3"}
Apr 16 18:51:44.107335 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:44.107318 2578 scope.go:117] "RemoveContainer" containerID="503e8088120a46986f2c09bb2331365bc948fe3179b9db0fe8fdbcf9a805274f"
Apr 16 18:51:44.132307 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:44.132270 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w"]
Apr 16 18:51:44.133787 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:44.133764 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-2kl7w"]
Apr 16 18:51:45.840792 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:45.840761 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75c4191-bd0f-44d2-abc3-8ccc2a4113c8" path="/var/lib/kubelet/pods/a75c4191-bd0f-44d2-abc3-8ccc2a4113c8/volumes"
Apr 16 18:51:47.112141 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:47.112045 2578 generic.go:358] "Generic (PLEG): container finished" podID="31399e59-2746-4638-8d18-6e148545825e" containerID="219d4d235f862f87cdb6b6fdb19efcfff294b9c6f713d5eca939a675fcff147a" exitCode=0
Apr 16 18:51:47.112141 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:47.112123 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7" event={"ID":"31399e59-2746-4638-8d18-6e148545825e","Type":"ContainerDied","Data":"219d4d235f862f87cdb6b6fdb19efcfff294b9c6f713d5eca939a675fcff147a"}
Apr 16 18:51:48.117931 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:48.117887 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7" event={"ID":"31399e59-2746-4638-8d18-6e148545825e","Type":"ContainerStarted","Data":"be29136a653d307e85284059240dc011e6902d3045e36956380fb31be3d319eb"}
Apr 16 18:51:48.118435 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:48.118097 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7"
Apr 16 18:51:48.136066 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:51:48.136011 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7" podStartSLOduration=6.135994999 podStartE2EDuration="6.135994999s" podCreationTimestamp="2026-04-16 18:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:51:48.134653151 +0000 UTC m=+1284.921989407" watchObservedRunningTime="2026-04-16 18:51:48.135994999 +0000 UTC m=+1284.923331256"
Apr 16 18:52:19.126475 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:19.126441 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7"
Apr 16 18:52:22.724217 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:22.723600 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7"]
Apr 16 18:52:22.724217 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:22.724003 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7" podUID="31399e59-2746-4638-8d18-6e148545825e" containerName="kserve-container" containerID="cri-o://be29136a653d307e85284059240dc011e6902d3045e36956380fb31be3d319eb" gracePeriod=30
Apr 16 18:52:22.771322 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:22.771283 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7"]
Apr 16 18:52:22.771681 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:22.771667 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a75c4191-bd0f-44d2-abc3-8ccc2a4113c8" containerName="storage-initializer"
Apr 16 18:52:22.771726 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:22.771683 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75c4191-bd0f-44d2-abc3-8ccc2a4113c8" containerName="storage-initializer"
Apr 16 18:52:22.771726 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:22.771703 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a75c4191-bd0f-44d2-abc3-8ccc2a4113c8" containerName="kserve-container"
Apr 16 18:52:22.771726 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:22.771708 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75c4191-bd0f-44d2-abc3-8ccc2a4113c8" containerName="kserve-container"
Apr 16 18:52:22.771819 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:22.771769 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a75c4191-bd0f-44d2-abc3-8ccc2a4113c8" containerName="kserve-container"
Apr 16 18:52:22.774994 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:22.774978 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7"
Apr 16 18:52:22.784065 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:22.784037 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7"]
Apr 16 18:52:22.821349 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:22.821316 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01020069-28e9-4b31-957d-6b2555ceaef1-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-64dbc547c5-z25m7\" (UID: \"01020069-28e9-4b31-957d-6b2555ceaef1\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7"
Apr 16 18:52:22.922369 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:22.922325 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01020069-28e9-4b31-957d-6b2555ceaef1-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-64dbc547c5-z25m7\" (UID: \"01020069-28e9-4b31-957d-6b2555ceaef1\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7"
Apr 16 18:52:22.922842 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:22.922818 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01020069-28e9-4b31-957d-6b2555ceaef1-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-64dbc547c5-z25m7\" (UID: \"01020069-28e9-4b31-957d-6b2555ceaef1\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7"
Apr 16 18:52:23.086539 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:23.086431 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7"
Apr 16 18:52:23.212346 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:23.212321 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7"]
Apr 16 18:52:23.215123 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:52:23.215087 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01020069_28e9_4b31_957d_6b2555ceaef1.slice/crio-4e90847610ce15c5fc631074cb8f8f4e6eb4ebfb85798d397b22cb73050f6876 WatchSource:0}: Error finding container 4e90847610ce15c5fc631074cb8f8f4e6eb4ebfb85798d397b22cb73050f6876: Status 404 returned error can't find the container with id 4e90847610ce15c5fc631074cb8f8f4e6eb4ebfb85798d397b22cb73050f6876
Apr 16 18:52:23.238396 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:23.238368 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" event={"ID":"01020069-28e9-4b31-957d-6b2555ceaef1","Type":"ContainerStarted","Data":"4e90847610ce15c5fc631074cb8f8f4e6eb4ebfb85798d397b22cb73050f6876"}
Apr 16 18:52:23.954005 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:23.953983 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7"
Apr 16 18:52:24.032051 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:24.031983 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31399e59-2746-4638-8d18-6e148545825e-kserve-provision-location\") pod \"31399e59-2746-4638-8d18-6e148545825e\" (UID: \"31399e59-2746-4638-8d18-6e148545825e\") "
Apr 16 18:52:24.032367 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:24.032345 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31399e59-2746-4638-8d18-6e148545825e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "31399e59-2746-4638-8d18-6e148545825e" (UID: "31399e59-2746-4638-8d18-6e148545825e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:52:24.133225 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:24.133195 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31399e59-2746-4638-8d18-6e148545825e-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 18:52:24.243306 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:24.243272 2578 generic.go:358] "Generic (PLEG): container finished" podID="31399e59-2746-4638-8d18-6e148545825e" containerID="be29136a653d307e85284059240dc011e6902d3045e36956380fb31be3d319eb" exitCode=0
Apr 16 18:52:24.243462 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:24.243320 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7" event={"ID":"31399e59-2746-4638-8d18-6e148545825e","Type":"ContainerDied","Data":"be29136a653d307e85284059240dc011e6902d3045e36956380fb31be3d319eb"}
Apr 16 18:52:24.243462 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:24.243347 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7"
Apr 16 18:52:24.243462 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:24.243364 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7" event={"ID":"31399e59-2746-4638-8d18-6e148545825e","Type":"ContainerDied","Data":"5d027145b3975125997756a4700644fb7d4fd9b4304cdb9b2c43b8b79b53ebc3"}
Apr 16 18:52:24.243462 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:24.243386 2578 scope.go:117] "RemoveContainer" containerID="be29136a653d307e85284059240dc011e6902d3045e36956380fb31be3d319eb"
Apr 16 18:52:24.244749 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:24.244722 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" event={"ID":"01020069-28e9-4b31-957d-6b2555ceaef1","Type":"ContainerStarted","Data":"8a1a68e9770df905be96720141219b790527e71376a2722bbad86f811fefef38"}
Apr 16 18:52:24.251643 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:24.251629 2578 scope.go:117] "RemoveContainer" containerID="219d4d235f862f87cdb6b6fdb19efcfff294b9c6f713d5eca939a675fcff147a"
Apr 16 18:52:24.258327 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:24.258305 2578 scope.go:117] "RemoveContainer" containerID="be29136a653d307e85284059240dc011e6902d3045e36956380fb31be3d319eb"
Apr 16 18:52:24.258557 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:52:24.258539 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be29136a653d307e85284059240dc011e6902d3045e36956380fb31be3d319eb\": container with ID starting with be29136a653d307e85284059240dc011e6902d3045e36956380fb31be3d319eb not found: ID does not exist" containerID="be29136a653d307e85284059240dc011e6902d3045e36956380fb31be3d319eb"
Apr 16 18:52:24.258601 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:24.258568 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be29136a653d307e85284059240dc011e6902d3045e36956380fb31be3d319eb"} err="failed to get container status \"be29136a653d307e85284059240dc011e6902d3045e36956380fb31be3d319eb\": rpc error: code = NotFound desc = could not find container \"be29136a653d307e85284059240dc011e6902d3045e36956380fb31be3d319eb\": container with ID starting with be29136a653d307e85284059240dc011e6902d3045e36956380fb31be3d319eb not found: ID does not exist"
Apr 16 18:52:24.258601 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:24.258584 2578 scope.go:117] "RemoveContainer" containerID="219d4d235f862f87cdb6b6fdb19efcfff294b9c6f713d5eca939a675fcff147a"
Apr 16 18:52:24.258793 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:52:24.258777 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"219d4d235f862f87cdb6b6fdb19efcfff294b9c6f713d5eca939a675fcff147a\": container with ID starting with 219d4d235f862f87cdb6b6fdb19efcfff294b9c6f713d5eca939a675fcff147a not found: ID does not exist" containerID="219d4d235f862f87cdb6b6fdb19efcfff294b9c6f713d5eca939a675fcff147a"
Apr 16 18:52:24.258831 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:24.258797 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"219d4d235f862f87cdb6b6fdb19efcfff294b9c6f713d5eca939a675fcff147a"} err="failed to get container status \"219d4d235f862f87cdb6b6fdb19efcfff294b9c6f713d5eca939a675fcff147a\": rpc error: code = NotFound desc = could not find container \"219d4d235f862f87cdb6b6fdb19efcfff294b9c6f713d5eca939a675fcff147a\": container with ID starting with 219d4d235f862f87cdb6b6fdb19efcfff294b9c6f713d5eca939a675fcff147a not found: ID does not exist"
Apr 16 18:52:24.277056 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:24.277030 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7"]
Apr 16 18:52:24.280410 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:24.280388 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-m7qc7"]
Apr 16 18:52:25.846349 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:25.846315 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31399e59-2746-4638-8d18-6e148545825e" path="/var/lib/kubelet/pods/31399e59-2746-4638-8d18-6e148545825e/volumes"
Apr 16 18:52:27.257051 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:27.257017 2578 generic.go:358] "Generic (PLEG): container finished" podID="01020069-28e9-4b31-957d-6b2555ceaef1" containerID="8a1a68e9770df905be96720141219b790527e71376a2722bbad86f811fefef38" exitCode=0
Apr 16 18:52:27.257481 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:27.257065 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" event={"ID":"01020069-28e9-4b31-957d-6b2555ceaef1","Type":"ContainerDied","Data":"8a1a68e9770df905be96720141219b790527e71376a2722bbad86f811fefef38"}
Apr 16 18:52:28.263015 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:28.262975 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" event={"ID":"01020069-28e9-4b31-957d-6b2555ceaef1","Type":"ContainerStarted","Data":"9f9bc9f483aa103b6064eca87486303bf950eecce6784d0bcc6a2014daee1fb2"}
Apr 16 18:52:30.272889 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:30.272802 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" event={"ID":"01020069-28e9-4b31-957d-6b2555ceaef1","Type":"ContainerStarted","Data":"86b593b61e3383a726777e9c5b9ccf694410566e324d1fc221c8360271c1c9d8"}
Apr 16 18:52:30.273256 ip-10-0-139-33 kubenswrapper[2578]: I0416
18:52:30.272950 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" Apr 16 18:52:30.273256 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:30.273061 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" Apr 16 18:52:30.292364 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:52:30.292320 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" podStartSLOduration=5.632359461 podStartE2EDuration="8.29230584s" podCreationTimestamp="2026-04-16 18:52:22 +0000 UTC" firstStartedPulling="2026-04-16 18:52:27.318258371 +0000 UTC m=+1324.105594606" lastFinishedPulling="2026-04-16 18:52:29.978204745 +0000 UTC m=+1326.765540985" observedRunningTime="2026-04-16 18:52:30.290578771 +0000 UTC m=+1327.077915042" watchObservedRunningTime="2026-04-16 18:52:30.29230584 +0000 UTC m=+1327.079642145" Apr 16 18:53:01.280591 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:01.280497 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" Apr 16 18:53:31.281518 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:31.281463 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" Apr 16 18:53:32.917494 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:32.917456 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7"] Apr 16 18:53:32.917897 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:32.917835 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" podUID="01020069-28e9-4b31-957d-6b2555ceaef1" 
containerName="kserve-container" containerID="cri-o://9f9bc9f483aa103b6064eca87486303bf950eecce6784d0bcc6a2014daee1fb2" gracePeriod=30 Apr 16 18:53:32.917970 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:32.917871 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" podUID="01020069-28e9-4b31-957d-6b2555ceaef1" containerName="kserve-agent" containerID="cri-o://86b593b61e3383a726777e9c5b9ccf694410566e324d1fc221c8360271c1c9d8" gracePeriod=30 Apr 16 18:53:32.974532 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:32.974500 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8"] Apr 16 18:53:32.974870 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:32.974855 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31399e59-2746-4638-8d18-6e148545825e" containerName="storage-initializer" Apr 16 18:53:32.974923 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:32.974872 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="31399e59-2746-4638-8d18-6e148545825e" containerName="storage-initializer" Apr 16 18:53:32.974923 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:32.974887 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31399e59-2746-4638-8d18-6e148545825e" containerName="kserve-container" Apr 16 18:53:32.974923 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:32.974893 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="31399e59-2746-4638-8d18-6e148545825e" containerName="kserve-container" Apr 16 18:53:32.975028 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:32.974954 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="31399e59-2746-4638-8d18-6e148545825e" containerName="kserve-container" Apr 16 18:53:32.979086 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:32.979066 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" Apr 16 18:53:32.985440 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:32.985417 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8"] Apr 16 18:53:33.101319 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:33.101284 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e2191b1-7d18-4641-8fd7-e74c9a8beb8b-kserve-provision-location\") pod \"isvc-paddle-predictor-c57db76c5-7knj8\" (UID: \"7e2191b1-7d18-4641-8fd7-e74c9a8beb8b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" Apr 16 18:53:33.202671 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:33.202572 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e2191b1-7d18-4641-8fd7-e74c9a8beb8b-kserve-provision-location\") pod \"isvc-paddle-predictor-c57db76c5-7knj8\" (UID: \"7e2191b1-7d18-4641-8fd7-e74c9a8beb8b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" Apr 16 18:53:33.202991 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:33.202966 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e2191b1-7d18-4641-8fd7-e74c9a8beb8b-kserve-provision-location\") pod \"isvc-paddle-predictor-c57db76c5-7knj8\" (UID: \"7e2191b1-7d18-4641-8fd7-e74c9a8beb8b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" Apr 16 18:53:33.289315 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:33.289283 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" Apr 16 18:53:33.411903 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:33.411877 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8"] Apr 16 18:53:33.414386 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:53:33.414356 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e2191b1_7d18_4641_8fd7_e74c9a8beb8b.slice/crio-2ad6fc080873acb28b0410fb597b720bf6374442957d7edf0e4c70d5081f99e6 WatchSource:0}: Error finding container 2ad6fc080873acb28b0410fb597b720bf6374442957d7edf0e4c70d5081f99e6: Status 404 returned error can't find the container with id 2ad6fc080873acb28b0410fb597b720bf6374442957d7edf0e4c70d5081f99e6 Apr 16 18:53:33.416165 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:33.416145 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:53:33.491412 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:33.491377 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" event={"ID":"7e2191b1-7d18-4641-8fd7-e74c9a8beb8b","Type":"ContainerStarted","Data":"8b9d5e44f64749c5a989f7d438858d3345647634a2cf8c5643bb5a2f59d9cd59"} Apr 16 18:53:33.491574 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:33.491420 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" event={"ID":"7e2191b1-7d18-4641-8fd7-e74c9a8beb8b","Type":"ContainerStarted","Data":"2ad6fc080873acb28b0410fb597b720bf6374442957d7edf0e4c70d5081f99e6"} Apr 16 18:53:35.499093 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:35.499056 2578 generic.go:358] "Generic (PLEG): container finished" podID="01020069-28e9-4b31-957d-6b2555ceaef1" containerID="9f9bc9f483aa103b6064eca87486303bf950eecce6784d0bcc6a2014daee1fb2" 
exitCode=0 Apr 16 18:53:35.499482 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:35.499119 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" event={"ID":"01020069-28e9-4b31-957d-6b2555ceaef1","Type":"ContainerDied","Data":"9f9bc9f483aa103b6064eca87486303bf950eecce6784d0bcc6a2014daee1fb2"} Apr 16 18:53:38.512507 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:38.512475 2578 generic.go:358] "Generic (PLEG): container finished" podID="7e2191b1-7d18-4641-8fd7-e74c9a8beb8b" containerID="8b9d5e44f64749c5a989f7d438858d3345647634a2cf8c5643bb5a2f59d9cd59" exitCode=0 Apr 16 18:53:38.512876 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:38.512502 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" event={"ID":"7e2191b1-7d18-4641-8fd7-e74c9a8beb8b","Type":"ContainerDied","Data":"8b9d5e44f64749c5a989f7d438858d3345647634a2cf8c5643bb5a2f59d9cd59"} Apr 16 18:53:41.277854 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:41.277807 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" podUID="01020069-28e9-4b31-957d-6b2555ceaef1" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.42:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 18:53:50.567931 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:50.567895 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" event={"ID":"7e2191b1-7d18-4641-8fd7-e74c9a8beb8b","Type":"ContainerStarted","Data":"81261e112aeda9b63fbaaad75b961d0baf36a54bcd0f87d6c90d0ee8e449681d"} Apr 16 18:53:50.568357 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:50.568198 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" Apr 16 
18:53:50.569572 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:50.569545 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" podUID="7e2191b1-7d18-4641-8fd7-e74c9a8beb8b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:53:50.585234 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:50.585162 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" podStartSLOduration=7.599577335 podStartE2EDuration="18.585147207s" podCreationTimestamp="2026-04-16 18:53:32 +0000 UTC" firstStartedPulling="2026-04-16 18:53:38.513828315 +0000 UTC m=+1395.301164548" lastFinishedPulling="2026-04-16 18:53:49.499398185 +0000 UTC m=+1406.286734420" observedRunningTime="2026-04-16 18:53:50.58248461 +0000 UTC m=+1407.369820869" watchObservedRunningTime="2026-04-16 18:53:50.585147207 +0000 UTC m=+1407.372483464" Apr 16 18:53:51.277831 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:51.277785 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" podUID="01020069-28e9-4b31-957d-6b2555ceaef1" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.42:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 18:53:51.571830 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:53:51.571742 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" podUID="7e2191b1-7d18-4641-8fd7-e74c9a8beb8b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:54:01.277941 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:01.277894 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" podUID="01020069-28e9-4b31-957d-6b2555ceaef1" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.42:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 18:54:01.278371 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:01.278024 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" Apr 16 18:54:01.572379 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:01.572286 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" podUID="7e2191b1-7d18-4641-8fd7-e74c9a8beb8b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:54:03.562891 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.562868 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" Apr 16 18:54:03.614580 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.614543 2578 generic.go:358] "Generic (PLEG): container finished" podID="01020069-28e9-4b31-957d-6b2555ceaef1" containerID="86b593b61e3383a726777e9c5b9ccf694410566e324d1fc221c8360271c1c9d8" exitCode=0 Apr 16 18:54:03.614761 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.614627 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" Apr 16 18:54:03.614761 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.614635 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" event={"ID":"01020069-28e9-4b31-957d-6b2555ceaef1","Type":"ContainerDied","Data":"86b593b61e3383a726777e9c5b9ccf694410566e324d1fc221c8360271c1c9d8"} Apr 16 18:54:03.614761 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.614665 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7" event={"ID":"01020069-28e9-4b31-957d-6b2555ceaef1","Type":"ContainerDied","Data":"4e90847610ce15c5fc631074cb8f8f4e6eb4ebfb85798d397b22cb73050f6876"} Apr 16 18:54:03.614761 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.614695 2578 scope.go:117] "RemoveContainer" containerID="86b593b61e3383a726777e9c5b9ccf694410566e324d1fc221c8360271c1c9d8" Apr 16 18:54:03.622155 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.622125 2578 scope.go:117] "RemoveContainer" containerID="9f9bc9f483aa103b6064eca87486303bf950eecce6784d0bcc6a2014daee1fb2" Apr 16 18:54:03.628913 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.628898 2578 scope.go:117] "RemoveContainer" containerID="8a1a68e9770df905be96720141219b790527e71376a2722bbad86f811fefef38" Apr 16 18:54:03.635694 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.635677 2578 scope.go:117] "RemoveContainer" containerID="86b593b61e3383a726777e9c5b9ccf694410566e324d1fc221c8360271c1c9d8" Apr 16 18:54:03.635945 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:54:03.635926 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86b593b61e3383a726777e9c5b9ccf694410566e324d1fc221c8360271c1c9d8\": container with ID starting with 86b593b61e3383a726777e9c5b9ccf694410566e324d1fc221c8360271c1c9d8 not found: ID does not exist" 
containerID="86b593b61e3383a726777e9c5b9ccf694410566e324d1fc221c8360271c1c9d8" Apr 16 18:54:03.636036 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.635953 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b593b61e3383a726777e9c5b9ccf694410566e324d1fc221c8360271c1c9d8"} err="failed to get container status \"86b593b61e3383a726777e9c5b9ccf694410566e324d1fc221c8360271c1c9d8\": rpc error: code = NotFound desc = could not find container \"86b593b61e3383a726777e9c5b9ccf694410566e324d1fc221c8360271c1c9d8\": container with ID starting with 86b593b61e3383a726777e9c5b9ccf694410566e324d1fc221c8360271c1c9d8 not found: ID does not exist" Apr 16 18:54:03.636036 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.635971 2578 scope.go:117] "RemoveContainer" containerID="9f9bc9f483aa103b6064eca87486303bf950eecce6784d0bcc6a2014daee1fb2" Apr 16 18:54:03.636225 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:54:03.636208 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f9bc9f483aa103b6064eca87486303bf950eecce6784d0bcc6a2014daee1fb2\": container with ID starting with 9f9bc9f483aa103b6064eca87486303bf950eecce6784d0bcc6a2014daee1fb2 not found: ID does not exist" containerID="9f9bc9f483aa103b6064eca87486303bf950eecce6784d0bcc6a2014daee1fb2" Apr 16 18:54:03.636268 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.636232 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f9bc9f483aa103b6064eca87486303bf950eecce6784d0bcc6a2014daee1fb2"} err="failed to get container status \"9f9bc9f483aa103b6064eca87486303bf950eecce6784d0bcc6a2014daee1fb2\": rpc error: code = NotFound desc = could not find container \"9f9bc9f483aa103b6064eca87486303bf950eecce6784d0bcc6a2014daee1fb2\": container with ID starting with 9f9bc9f483aa103b6064eca87486303bf950eecce6784d0bcc6a2014daee1fb2 not found: ID does not exist" Apr 16 
18:54:03.636268 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.636248 2578 scope.go:117] "RemoveContainer" containerID="8a1a68e9770df905be96720141219b790527e71376a2722bbad86f811fefef38" Apr 16 18:54:03.636480 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:54:03.636464 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a1a68e9770df905be96720141219b790527e71376a2722bbad86f811fefef38\": container with ID starting with 8a1a68e9770df905be96720141219b790527e71376a2722bbad86f811fefef38 not found: ID does not exist" containerID="8a1a68e9770df905be96720141219b790527e71376a2722bbad86f811fefef38" Apr 16 18:54:03.636525 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.636482 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a1a68e9770df905be96720141219b790527e71376a2722bbad86f811fefef38"} err="failed to get container status \"8a1a68e9770df905be96720141219b790527e71376a2722bbad86f811fefef38\": rpc error: code = NotFound desc = could not find container \"8a1a68e9770df905be96720141219b790527e71376a2722bbad86f811fefef38\": container with ID starting with 8a1a68e9770df905be96720141219b790527e71376a2722bbad86f811fefef38 not found: ID does not exist" Apr 16 18:54:03.655392 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.655374 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01020069-28e9-4b31-957d-6b2555ceaef1-kserve-provision-location\") pod \"01020069-28e9-4b31-957d-6b2555ceaef1\" (UID: \"01020069-28e9-4b31-957d-6b2555ceaef1\") " Apr 16 18:54:03.655647 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.655627 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01020069-28e9-4b31-957d-6b2555ceaef1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"01020069-28e9-4b31-957d-6b2555ceaef1" (UID: "01020069-28e9-4b31-957d-6b2555ceaef1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:54:03.756613 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.756590 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01020069-28e9-4b31-957d-6b2555ceaef1-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:54:03.931344 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.931312 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7"] Apr 16 18:54:03.937086 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:03.937063 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-z25m7"] Apr 16 18:54:05.840826 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:05.840792 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01020069-28e9-4b31-957d-6b2555ceaef1" path="/var/lib/kubelet/pods/01020069-28e9-4b31-957d-6b2555ceaef1/volumes" Apr 16 18:54:11.572700 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:11.572606 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" podUID="7e2191b1-7d18-4641-8fd7-e74c9a8beb8b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:54:21.572662 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:21.572616 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" podUID="7e2191b1-7d18-4641-8fd7-e74c9a8beb8b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:54:31.573381 ip-10-0-139-33 kubenswrapper[2578]: I0416 
18:54:31.573346 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" Apr 16 18:54:34.522208 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:34.522144 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8"] Apr 16 18:54:34.522656 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:34.522466 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" podUID="7e2191b1-7d18-4641-8fd7-e74c9a8beb8b" containerName="kserve-container" containerID="cri-o://81261e112aeda9b63fbaaad75b961d0baf36a54bcd0f87d6c90d0ee8e449681d" gracePeriod=30 Apr 16 18:54:34.595560 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:34.595530 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6"] Apr 16 18:54:34.595898 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:34.595882 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01020069-28e9-4b31-957d-6b2555ceaef1" containerName="kserve-container" Apr 16 18:54:34.595898 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:34.595899 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="01020069-28e9-4b31-957d-6b2555ceaef1" containerName="kserve-container" Apr 16 18:54:34.595979 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:34.595915 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01020069-28e9-4b31-957d-6b2555ceaef1" containerName="storage-initializer" Apr 16 18:54:34.595979 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:34.595921 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="01020069-28e9-4b31-957d-6b2555ceaef1" containerName="storage-initializer" Apr 16 18:54:34.595979 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:34.595934 2578 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="01020069-28e9-4b31-957d-6b2555ceaef1" containerName="kserve-agent" Apr 16 18:54:34.595979 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:34.595939 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="01020069-28e9-4b31-957d-6b2555ceaef1" containerName="kserve-agent" Apr 16 18:54:34.596100 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:34.595991 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="01020069-28e9-4b31-957d-6b2555ceaef1" containerName="kserve-container" Apr 16 18:54:34.596100 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:34.596002 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="01020069-28e9-4b31-957d-6b2555ceaef1" containerName="kserve-agent" Apr 16 18:54:34.604738 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:34.604709 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" Apr 16 18:54:34.609450 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:34.609421 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6"] Apr 16 18:54:34.723052 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:34.723019 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86208719-5f92-44d2-a9bb-a533bd42147e-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-8c7997d9b-pp6d6\" (UID: \"86208719-5f92-44d2-a9bb-a533bd42147e\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" Apr 16 18:54:34.824096 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:34.824007 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86208719-5f92-44d2-a9bb-a533bd42147e-kserve-provision-location\") pod 
\"isvc-paddle-runtime-predictor-8c7997d9b-pp6d6\" (UID: \"86208719-5f92-44d2-a9bb-a533bd42147e\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" Apr 16 18:54:34.824409 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:34.824389 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86208719-5f92-44d2-a9bb-a533bd42147e-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-8c7997d9b-pp6d6\" (UID: \"86208719-5f92-44d2-a9bb-a533bd42147e\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" Apr 16 18:54:34.915586 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:34.915549 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" Apr 16 18:54:35.039250 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:35.039205 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6"] Apr 16 18:54:35.042634 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:54:35.042604 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86208719_5f92_44d2_a9bb_a533bd42147e.slice/crio-8481b3f3249771b2de5428dafc44ffac85d9fb405fb17799a9904330a0743470 WatchSource:0}: Error finding container 8481b3f3249771b2de5428dafc44ffac85d9fb405fb17799a9904330a0743470: Status 404 returned error can't find the container with id 8481b3f3249771b2de5428dafc44ffac85d9fb405fb17799a9904330a0743470 Apr 16 18:54:35.730906 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:35.730863 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" event={"ID":"86208719-5f92-44d2-a9bb-a533bd42147e","Type":"ContainerStarted","Data":"c96322827e7638759a6f6a0543d7d1756c42ae85e62bc8c76be624f3b86a8902"} Apr 16 
18:54:35.731327 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:35.730913 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" event={"ID":"86208719-5f92-44d2-a9bb-a533bd42147e","Type":"ContainerStarted","Data":"8481b3f3249771b2de5428dafc44ffac85d9fb405fb17799a9904330a0743470"} Apr 16 18:54:37.367745 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:37.367722 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" Apr 16 18:54:37.549727 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:37.549694 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e2191b1-7d18-4641-8fd7-e74c9a8beb8b-kserve-provision-location\") pod \"7e2191b1-7d18-4641-8fd7-e74c9a8beb8b\" (UID: \"7e2191b1-7d18-4641-8fd7-e74c9a8beb8b\") " Apr 16 18:54:37.559555 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:37.559523 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e2191b1-7d18-4641-8fd7-e74c9a8beb8b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7e2191b1-7d18-4641-8fd7-e74c9a8beb8b" (UID: "7e2191b1-7d18-4641-8fd7-e74c9a8beb8b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:54:37.650795 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:37.650770 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e2191b1-7d18-4641-8fd7-e74c9a8beb8b-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:54:37.739568 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:37.739537 2578 generic.go:358] "Generic (PLEG): container finished" podID="7e2191b1-7d18-4641-8fd7-e74c9a8beb8b" containerID="81261e112aeda9b63fbaaad75b961d0baf36a54bcd0f87d6c90d0ee8e449681d" exitCode=0 Apr 16 18:54:37.739568 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:37.739573 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" event={"ID":"7e2191b1-7d18-4641-8fd7-e74c9a8beb8b","Type":"ContainerDied","Data":"81261e112aeda9b63fbaaad75b961d0baf36a54bcd0f87d6c90d0ee8e449681d"} Apr 16 18:54:37.739766 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:37.739593 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" event={"ID":"7e2191b1-7d18-4641-8fd7-e74c9a8beb8b","Type":"ContainerDied","Data":"2ad6fc080873acb28b0410fb597b720bf6374442957d7edf0e4c70d5081f99e6"} Apr 16 18:54:37.739766 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:37.739599 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8" Apr 16 18:54:37.739766 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:37.739608 2578 scope.go:117] "RemoveContainer" containerID="81261e112aeda9b63fbaaad75b961d0baf36a54bcd0f87d6c90d0ee8e449681d" Apr 16 18:54:37.747597 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:37.747575 2578 scope.go:117] "RemoveContainer" containerID="8b9d5e44f64749c5a989f7d438858d3345647634a2cf8c5643bb5a2f59d9cd59" Apr 16 18:54:37.754737 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:37.754723 2578 scope.go:117] "RemoveContainer" containerID="81261e112aeda9b63fbaaad75b961d0baf36a54bcd0f87d6c90d0ee8e449681d" Apr 16 18:54:37.754974 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:54:37.754958 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81261e112aeda9b63fbaaad75b961d0baf36a54bcd0f87d6c90d0ee8e449681d\": container with ID starting with 81261e112aeda9b63fbaaad75b961d0baf36a54bcd0f87d6c90d0ee8e449681d not found: ID does not exist" containerID="81261e112aeda9b63fbaaad75b961d0baf36a54bcd0f87d6c90d0ee8e449681d" Apr 16 18:54:37.755044 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:37.754979 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81261e112aeda9b63fbaaad75b961d0baf36a54bcd0f87d6c90d0ee8e449681d"} err="failed to get container status \"81261e112aeda9b63fbaaad75b961d0baf36a54bcd0f87d6c90d0ee8e449681d\": rpc error: code = NotFound desc = could not find container \"81261e112aeda9b63fbaaad75b961d0baf36a54bcd0f87d6c90d0ee8e449681d\": container with ID starting with 81261e112aeda9b63fbaaad75b961d0baf36a54bcd0f87d6c90d0ee8e449681d not found: ID does not exist" Apr 16 18:54:37.755044 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:37.755013 2578 scope.go:117] "RemoveContainer" containerID="8b9d5e44f64749c5a989f7d438858d3345647634a2cf8c5643bb5a2f59d9cd59" Apr 16 18:54:37.755222 
ip-10-0-139-33 kubenswrapper[2578]: E0416 18:54:37.755208 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b9d5e44f64749c5a989f7d438858d3345647634a2cf8c5643bb5a2f59d9cd59\": container with ID starting with 8b9d5e44f64749c5a989f7d438858d3345647634a2cf8c5643bb5a2f59d9cd59 not found: ID does not exist" containerID="8b9d5e44f64749c5a989f7d438858d3345647634a2cf8c5643bb5a2f59d9cd59" Apr 16 18:54:37.755277 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:37.755225 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b9d5e44f64749c5a989f7d438858d3345647634a2cf8c5643bb5a2f59d9cd59"} err="failed to get container status \"8b9d5e44f64749c5a989f7d438858d3345647634a2cf8c5643bb5a2f59d9cd59\": rpc error: code = NotFound desc = could not find container \"8b9d5e44f64749c5a989f7d438858d3345647634a2cf8c5643bb5a2f59d9cd59\": container with ID starting with 8b9d5e44f64749c5a989f7d438858d3345647634a2cf8c5643bb5a2f59d9cd59 not found: ID does not exist" Apr 16 18:54:37.761396 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:37.761372 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8"] Apr 16 18:54:37.765632 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:37.765612 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-7knj8"] Apr 16 18:54:37.841881 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:37.841786 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e2191b1-7d18-4641-8fd7-e74c9a8beb8b" path="/var/lib/kubelet/pods/7e2191b1-7d18-4641-8fd7-e74c9a8beb8b/volumes" Apr 16 18:54:39.748945 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:39.748913 2578 generic.go:358] "Generic (PLEG): container finished" podID="86208719-5f92-44d2-a9bb-a533bd42147e" 
containerID="c96322827e7638759a6f6a0543d7d1756c42ae85e62bc8c76be624f3b86a8902" exitCode=0 Apr 16 18:54:39.749324 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:39.748981 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" event={"ID":"86208719-5f92-44d2-a9bb-a533bd42147e","Type":"ContainerDied","Data":"c96322827e7638759a6f6a0543d7d1756c42ae85e62bc8c76be624f3b86a8902"} Apr 16 18:54:40.753573 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:40.753540 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" event={"ID":"86208719-5f92-44d2-a9bb-a533bd42147e","Type":"ContainerStarted","Data":"797dfad1ec3d57c22e8e0490911487d9009c53e0ce1afa0f9120566c22f9e994"} Apr 16 18:54:40.753974 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:40.753816 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" Apr 16 18:54:40.755075 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:40.755049 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" podUID="86208719-5f92-44d2-a9bb-a533bd42147e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 18:54:40.771669 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:40.771632 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" podStartSLOduration=6.771620585 podStartE2EDuration="6.771620585s" podCreationTimestamp="2026-04-16 18:54:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:54:40.770395524 +0000 UTC m=+1457.557731780" watchObservedRunningTime="2026-04-16 18:54:40.771620585 +0000 
UTC m=+1457.558956839" Apr 16 18:54:41.756837 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:41.756795 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" podUID="86208719-5f92-44d2-a9bb-a533bd42147e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 18:54:51.757264 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:54:51.757225 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" podUID="86208719-5f92-44d2-a9bb-a533bd42147e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 18:55:01.756781 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:01.756740 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" podUID="86208719-5f92-44d2-a9bb-a533bd42147e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 18:55:11.756810 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:11.756762 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" podUID="86208719-5f92-44d2-a9bb-a533bd42147e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 18:55:21.758353 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:21.758322 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" Apr 16 18:55:26.119319 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:26.119285 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6"] Apr 16 18:55:26.120063 ip-10-0-139-33 
kubenswrapper[2578]: I0416 18:55:26.120032 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" podUID="86208719-5f92-44d2-a9bb-a533bd42147e" containerName="kserve-container" containerID="cri-o://797dfad1ec3d57c22e8e0490911487d9009c53e0ce1afa0f9120566c22f9e994" gracePeriod=30 Apr 16 18:55:28.766234 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:28.766207 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" Apr 16 18:55:28.865999 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:28.865917 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86208719-5f92-44d2-a9bb-a533bd42147e-kserve-provision-location\") pod \"86208719-5f92-44d2-a9bb-a533bd42147e\" (UID: \"86208719-5f92-44d2-a9bb-a533bd42147e\") " Apr 16 18:55:28.875977 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:28.875949 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86208719-5f92-44d2-a9bb-a533bd42147e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "86208719-5f92-44d2-a9bb-a533bd42147e" (UID: "86208719-5f92-44d2-a9bb-a533bd42147e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:55:28.916498 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:28.916471 2578 generic.go:358] "Generic (PLEG): container finished" podID="86208719-5f92-44d2-a9bb-a533bd42147e" containerID="797dfad1ec3d57c22e8e0490911487d9009c53e0ce1afa0f9120566c22f9e994" exitCode=0 Apr 16 18:55:28.916659 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:28.916538 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" event={"ID":"86208719-5f92-44d2-a9bb-a533bd42147e","Type":"ContainerDied","Data":"797dfad1ec3d57c22e8e0490911487d9009c53e0ce1afa0f9120566c22f9e994"} Apr 16 18:55:28.916659 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:28.916555 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" Apr 16 18:55:28.916659 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:28.916583 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6" event={"ID":"86208719-5f92-44d2-a9bb-a533bd42147e","Type":"ContainerDied","Data":"8481b3f3249771b2de5428dafc44ffac85d9fb405fb17799a9904330a0743470"} Apr 16 18:55:28.916659 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:28.916604 2578 scope.go:117] "RemoveContainer" containerID="797dfad1ec3d57c22e8e0490911487d9009c53e0ce1afa0f9120566c22f9e994" Apr 16 18:55:28.924876 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:28.924862 2578 scope.go:117] "RemoveContainer" containerID="c96322827e7638759a6f6a0543d7d1756c42ae85e62bc8c76be624f3b86a8902" Apr 16 18:55:28.931700 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:28.931684 2578 scope.go:117] "RemoveContainer" containerID="797dfad1ec3d57c22e8e0490911487d9009c53e0ce1afa0f9120566c22f9e994" Apr 16 18:55:28.931927 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:55:28.931908 2578 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"797dfad1ec3d57c22e8e0490911487d9009c53e0ce1afa0f9120566c22f9e994\": container with ID starting with 797dfad1ec3d57c22e8e0490911487d9009c53e0ce1afa0f9120566c22f9e994 not found: ID does not exist" containerID="797dfad1ec3d57c22e8e0490911487d9009c53e0ce1afa0f9120566c22f9e994" Apr 16 18:55:28.931971 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:28.931936 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"797dfad1ec3d57c22e8e0490911487d9009c53e0ce1afa0f9120566c22f9e994"} err="failed to get container status \"797dfad1ec3d57c22e8e0490911487d9009c53e0ce1afa0f9120566c22f9e994\": rpc error: code = NotFound desc = could not find container \"797dfad1ec3d57c22e8e0490911487d9009c53e0ce1afa0f9120566c22f9e994\": container with ID starting with 797dfad1ec3d57c22e8e0490911487d9009c53e0ce1afa0f9120566c22f9e994 not found: ID does not exist" Apr 16 18:55:28.931971 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:28.931954 2578 scope.go:117] "RemoveContainer" containerID="c96322827e7638759a6f6a0543d7d1756c42ae85e62bc8c76be624f3b86a8902" Apr 16 18:55:28.932158 ip-10-0-139-33 kubenswrapper[2578]: E0416 18:55:28.932144 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c96322827e7638759a6f6a0543d7d1756c42ae85e62bc8c76be624f3b86a8902\": container with ID starting with c96322827e7638759a6f6a0543d7d1756c42ae85e62bc8c76be624f3b86a8902 not found: ID does not exist" containerID="c96322827e7638759a6f6a0543d7d1756c42ae85e62bc8c76be624f3b86a8902" Apr 16 18:55:28.932291 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:28.932164 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c96322827e7638759a6f6a0543d7d1756c42ae85e62bc8c76be624f3b86a8902"} err="failed to get container status 
\"c96322827e7638759a6f6a0543d7d1756c42ae85e62bc8c76be624f3b86a8902\": rpc error: code = NotFound desc = could not find container \"c96322827e7638759a6f6a0543d7d1756c42ae85e62bc8c76be624f3b86a8902\": container with ID starting with c96322827e7638759a6f6a0543d7d1756c42ae85e62bc8c76be624f3b86a8902 not found: ID does not exist" Apr 16 18:55:28.937773 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:28.937752 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6"] Apr 16 18:55:28.940294 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:28.940273 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-pp6d6"] Apr 16 18:55:28.967635 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:28.967611 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86208719-5f92-44d2-a9bb-a533bd42147e-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 18:55:29.840949 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:29.840916 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86208719-5f92-44d2-a9bb-a533bd42147e" path="/var/lib/kubelet/pods/86208719-5f92-44d2-a9bb-a533bd42147e/volumes" Apr 16 18:55:47.477894 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:47.477807 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 18:55:47.479837 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:55:47.479814 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 18:58:39.033867 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.033828 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs"] Apr 16 18:58:39.034397 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.034233 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e2191b1-7d18-4641-8fd7-e74c9a8beb8b" containerName="storage-initializer" Apr 16 18:58:39.034397 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.034253 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2191b1-7d18-4641-8fd7-e74c9a8beb8b" containerName="storage-initializer" Apr 16 18:58:39.034397 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.034270 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86208719-5f92-44d2-a9bb-a533bd42147e" containerName="storage-initializer" Apr 16 18:58:39.034397 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.034278 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="86208719-5f92-44d2-a9bb-a533bd42147e" containerName="storage-initializer" Apr 16 18:58:39.034397 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.034303 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e2191b1-7d18-4641-8fd7-e74c9a8beb8b" containerName="kserve-container" Apr 16 18:58:39.034397 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.034312 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2191b1-7d18-4641-8fd7-e74c9a8beb8b" containerName="kserve-container" Apr 16 18:58:39.034397 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.034326 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86208719-5f92-44d2-a9bb-a533bd42147e" containerName="kserve-container" Apr 16 18:58:39.034397 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.034333 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="86208719-5f92-44d2-a9bb-a533bd42147e" containerName="kserve-container" Apr 16 18:58:39.034397 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.034402 2578 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="86208719-5f92-44d2-a9bb-a533bd42147e" containerName="kserve-container" Apr 16 18:58:39.034756 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.034413 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e2191b1-7d18-4641-8fd7-e74c9a8beb8b" containerName="kserve-container" Apr 16 18:58:39.037556 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.037525 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" Apr 16 18:58:39.040099 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.040079 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z66mq\"" Apr 16 18:58:39.042148 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.042121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/884ef371-2661-43aa-b8ac-2620e08a6b75-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs\" (UID: \"884ef371-2661-43aa-b8ac-2620e08a6b75\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" Apr 16 18:58:39.044401 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.044379 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs"] Apr 16 18:58:39.142646 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.142610 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/884ef371-2661-43aa-b8ac-2620e08a6b75-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs\" (UID: \"884ef371-2661-43aa-b8ac-2620e08a6b75\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" Apr 16 18:58:39.142999 ip-10-0-139-33 kubenswrapper[2578]: 
I0416 18:58:39.142976 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/884ef371-2661-43aa-b8ac-2620e08a6b75-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs\" (UID: \"884ef371-2661-43aa-b8ac-2620e08a6b75\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" Apr 16 18:58:39.348746 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.348645 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" Apr 16 18:58:39.471517 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.471489 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs"] Apr 16 18:58:39.474474 ip-10-0-139-33 kubenswrapper[2578]: W0416 18:58:39.474442 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod884ef371_2661_43aa_b8ac_2620e08a6b75.slice/crio-8bbb3670a642da77cca4b964b16e00e19375dbbc67da3ecd5348999543cfbe01 WatchSource:0}: Error finding container 8bbb3670a642da77cca4b964b16e00e19375dbbc67da3ecd5348999543cfbe01: Status 404 returned error can't find the container with id 8bbb3670a642da77cca4b964b16e00e19375dbbc67da3ecd5348999543cfbe01 Apr 16 18:58:39.476477 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.476461 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:58:39.528133 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:39.528099 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" event={"ID":"884ef371-2661-43aa-b8ac-2620e08a6b75","Type":"ContainerStarted","Data":"8bbb3670a642da77cca4b964b16e00e19375dbbc67da3ecd5348999543cfbe01"} Apr 16 18:58:40.532911 ip-10-0-139-33 kubenswrapper[2578]: I0416 
18:58:40.532830 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" event={"ID":"884ef371-2661-43aa-b8ac-2620e08a6b75","Type":"ContainerStarted","Data":"2644f30c36d024dcba7b0a9b1cd7f53fd664bb43925056965e040554674fa5a3"} Apr 16 18:58:43.544583 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:43.544551 2578 generic.go:358] "Generic (PLEG): container finished" podID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerID="2644f30c36d024dcba7b0a9b1cd7f53fd664bb43925056965e040554674fa5a3" exitCode=0 Apr 16 18:58:43.544960 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:43.544613 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" event={"ID":"884ef371-2661-43aa-b8ac-2620e08a6b75","Type":"ContainerDied","Data":"2644f30c36d024dcba7b0a9b1cd7f53fd664bb43925056965e040554674fa5a3"} Apr 16 18:58:50.576486 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:50.576396 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" event={"ID":"884ef371-2661-43aa-b8ac-2620e08a6b75","Type":"ContainerStarted","Data":"6a416f0b1f069021ba0710143e166e818d64332b2eede9773605c5bebdb62dbe"} Apr 16 18:58:50.576899 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:50.576697 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" Apr 16 18:58:50.578262 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:50.578234 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" podUID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:58:50.592778 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:50.592730 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" podStartSLOduration=4.814276563 podStartE2EDuration="11.592719036s" podCreationTimestamp="2026-04-16 18:58:39 +0000 UTC" firstStartedPulling="2026-04-16 18:58:43.545817255 +0000 UTC m=+1700.333153488" lastFinishedPulling="2026-04-16 18:58:50.324259728 +0000 UTC m=+1707.111595961" observedRunningTime="2026-04-16 18:58:50.591007518 +0000 UTC m=+1707.378343774" watchObservedRunningTime="2026-04-16 18:58:50.592719036 +0000 UTC m=+1707.380055291" Apr 16 18:58:51.580988 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:58:51.580952 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" podUID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:59:01.581648 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:59:01.581606 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" podUID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:59:11.581523 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:59:11.581454 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" podUID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:59:21.581012 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:59:21.580967 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" podUID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:59:31.581573 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:59:31.581520 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" podUID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:59:41.581328 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:59:41.581287 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" podUID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:59:51.581765 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:59:51.581722 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" podUID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:59:56.837189 ip-10-0-139-33 kubenswrapper[2578]: I0416 18:59:56.837139 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" podUID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 19:00:06.837394 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:06.837348 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" podUID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 19:00:16.838637 ip-10-0-139-33 
kubenswrapper[2578]: I0416 19:00:16.838553 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" Apr 16 19:00:20.632634 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:20.632599 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs"] Apr 16 19:00:20.633005 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:20.632863 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" podUID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerName="kserve-container" containerID="cri-o://6a416f0b1f069021ba0710143e166e818d64332b2eede9773605c5bebdb62dbe" gracePeriod=30 Apr 16 19:00:24.177411 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:24.177389 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" Apr 16 19:00:24.322255 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:24.322145 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/884ef371-2661-43aa-b8ac-2620e08a6b75-kserve-provision-location\") pod \"884ef371-2661-43aa-b8ac-2620e08a6b75\" (UID: \"884ef371-2661-43aa-b8ac-2620e08a6b75\") " Apr 16 19:00:24.322469 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:24.322445 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/884ef371-2661-43aa-b8ac-2620e08a6b75-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "884ef371-2661-43aa-b8ac-2620e08a6b75" (UID: "884ef371-2661-43aa-b8ac-2620e08a6b75"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:00:24.422829 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:24.422793 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/884ef371-2661-43aa-b8ac-2620e08a6b75-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:00:24.881527 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:24.881491 2578 generic.go:358] "Generic (PLEG): container finished" podID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerID="6a416f0b1f069021ba0710143e166e818d64332b2eede9773605c5bebdb62dbe" exitCode=0 Apr 16 19:00:24.881698 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:24.881552 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" Apr 16 19:00:24.881698 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:24.881570 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" event={"ID":"884ef371-2661-43aa-b8ac-2620e08a6b75","Type":"ContainerDied","Data":"6a416f0b1f069021ba0710143e166e818d64332b2eede9773605c5bebdb62dbe"} Apr 16 19:00:24.881698 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:24.881607 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs" event={"ID":"884ef371-2661-43aa-b8ac-2620e08a6b75","Type":"ContainerDied","Data":"8bbb3670a642da77cca4b964b16e00e19375dbbc67da3ecd5348999543cfbe01"} Apr 16 19:00:24.881698 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:24.881623 2578 scope.go:117] "RemoveContainer" containerID="6a416f0b1f069021ba0710143e166e818d64332b2eede9773605c5bebdb62dbe" Apr 16 19:00:24.889882 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:24.889864 2578 scope.go:117] "RemoveContainer" containerID="2644f30c36d024dcba7b0a9b1cd7f53fd664bb43925056965e040554674fa5a3" 
Apr 16 19:00:24.896780 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:24.896763 2578 scope.go:117] "RemoveContainer" containerID="6a416f0b1f069021ba0710143e166e818d64332b2eede9773605c5bebdb62dbe" Apr 16 19:00:24.897044 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:00:24.897025 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a416f0b1f069021ba0710143e166e818d64332b2eede9773605c5bebdb62dbe\": container with ID starting with 6a416f0b1f069021ba0710143e166e818d64332b2eede9773605c5bebdb62dbe not found: ID does not exist" containerID="6a416f0b1f069021ba0710143e166e818d64332b2eede9773605c5bebdb62dbe" Apr 16 19:00:24.897140 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:24.897050 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a416f0b1f069021ba0710143e166e818d64332b2eede9773605c5bebdb62dbe"} err="failed to get container status \"6a416f0b1f069021ba0710143e166e818d64332b2eede9773605c5bebdb62dbe\": rpc error: code = NotFound desc = could not find container \"6a416f0b1f069021ba0710143e166e818d64332b2eede9773605c5bebdb62dbe\": container with ID starting with 6a416f0b1f069021ba0710143e166e818d64332b2eede9773605c5bebdb62dbe not found: ID does not exist" Apr 16 19:00:24.897140 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:24.897068 2578 scope.go:117] "RemoveContainer" containerID="2644f30c36d024dcba7b0a9b1cd7f53fd664bb43925056965e040554674fa5a3" Apr 16 19:00:24.897348 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:00:24.897328 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2644f30c36d024dcba7b0a9b1cd7f53fd664bb43925056965e040554674fa5a3\": container with ID starting with 2644f30c36d024dcba7b0a9b1cd7f53fd664bb43925056965e040554674fa5a3 not found: ID does not exist" containerID="2644f30c36d024dcba7b0a9b1cd7f53fd664bb43925056965e040554674fa5a3" Apr 16 19:00:24.897418 
ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:24.897354 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2644f30c36d024dcba7b0a9b1cd7f53fd664bb43925056965e040554674fa5a3"} err="failed to get container status \"2644f30c36d024dcba7b0a9b1cd7f53fd664bb43925056965e040554674fa5a3\": rpc error: code = NotFound desc = could not find container \"2644f30c36d024dcba7b0a9b1cd7f53fd664bb43925056965e040554674fa5a3\": container with ID starting with 2644f30c36d024dcba7b0a9b1cd7f53fd664bb43925056965e040554674fa5a3 not found: ID does not exist" Apr 16 19:00:24.902353 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:24.902329 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs"] Apr 16 19:00:24.903728 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:24.903709 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-gjlqs"] Apr 16 19:00:25.841689 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:25.841655 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="884ef371-2661-43aa-b8ac-2620e08a6b75" path="/var/lib/kubelet/pods/884ef371-2661-43aa-b8ac-2620e08a6b75/volumes" Apr 16 19:00:47.500663 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:47.500631 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 19:00:47.503434 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:00:47.503409 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 19:01:52.218867 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:52.214169 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7"] Apr 16 19:01:52.218867 
ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:52.214992 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerName="storage-initializer" Apr 16 19:01:52.218867 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:52.215011 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerName="storage-initializer" Apr 16 19:01:52.218867 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:52.215026 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerName="kserve-container" Apr 16 19:01:52.218867 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:52.215036 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerName="kserve-container" Apr 16 19:01:52.218867 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:52.215225 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="884ef371-2661-43aa-b8ac-2620e08a6b75" containerName="kserve-container" Apr 16 19:01:52.219567 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:52.219520 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" Apr 16 19:01:52.222551 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:52.222525 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z66mq\"" Apr 16 19:01:52.224119 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:52.224094 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7"] Apr 16 19:01:52.336347 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:52.336304 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d141a9ba-91a0-479c-b2db-fd36e7c38e51-kserve-provision-location\") pod \"isvc-primary-a68e7d-predictor-655664fd94-frkx7\" (UID: \"d141a9ba-91a0-479c-b2db-fd36e7c38e51\") " pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" Apr 16 19:01:52.437063 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:52.437028 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d141a9ba-91a0-479c-b2db-fd36e7c38e51-kserve-provision-location\") pod \"isvc-primary-a68e7d-predictor-655664fd94-frkx7\" (UID: \"d141a9ba-91a0-479c-b2db-fd36e7c38e51\") " pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" Apr 16 19:01:52.437450 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:52.437422 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d141a9ba-91a0-479c-b2db-fd36e7c38e51-kserve-provision-location\") pod \"isvc-primary-a68e7d-predictor-655664fd94-frkx7\" (UID: \"d141a9ba-91a0-479c-b2db-fd36e7c38e51\") " pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" Apr 16 19:01:52.531116 ip-10-0-139-33 
kubenswrapper[2578]: I0416 19:01:52.531036 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" Apr 16 19:01:52.651886 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:52.651862 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7"] Apr 16 19:01:52.654512 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:01:52.654482 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd141a9ba_91a0_479c_b2db_fd36e7c38e51.slice/crio-4a4251cea0c1f48b4bb91aa01ba115dceeaee0aa4b6d5c09ac482abe0bef5378 WatchSource:0}: Error finding container 4a4251cea0c1f48b4bb91aa01ba115dceeaee0aa4b6d5c09ac482abe0bef5378: Status 404 returned error can't find the container with id 4a4251cea0c1f48b4bb91aa01ba115dceeaee0aa4b6d5c09ac482abe0bef5378 Apr 16 19:01:53.178252 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:53.178148 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" event={"ID":"d141a9ba-91a0-479c-b2db-fd36e7c38e51","Type":"ContainerStarted","Data":"7793f5b393aaf29b2ce8b50f27b70d9e5d00a66ba6506f35c4e782419e2ba94b"} Apr 16 19:01:53.178252 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:53.178200 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" event={"ID":"d141a9ba-91a0-479c-b2db-fd36e7c38e51","Type":"ContainerStarted","Data":"4a4251cea0c1f48b4bb91aa01ba115dceeaee0aa4b6d5c09ac482abe0bef5378"} Apr 16 19:01:57.192192 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:57.192150 2578 generic.go:358] "Generic (PLEG): container finished" podID="d141a9ba-91a0-479c-b2db-fd36e7c38e51" containerID="7793f5b393aaf29b2ce8b50f27b70d9e5d00a66ba6506f35c4e782419e2ba94b" exitCode=0 Apr 16 19:01:57.192562 ip-10-0-139-33 
kubenswrapper[2578]: I0416 19:01:57.192223 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" event={"ID":"d141a9ba-91a0-479c-b2db-fd36e7c38e51","Type":"ContainerDied","Data":"7793f5b393aaf29b2ce8b50f27b70d9e5d00a66ba6506f35c4e782419e2ba94b"} Apr 16 19:01:58.196833 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:58.196799 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" event={"ID":"d141a9ba-91a0-479c-b2db-fd36e7c38e51","Type":"ContainerStarted","Data":"ca9e347c89ff45a5a853047bfe1ef013f2e1c39203eddcea8625208c4aede9c0"} Apr 16 19:01:58.197372 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:58.197164 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" Apr 16 19:01:58.198320 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:58.198294 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" podUID="d141a9ba-91a0-479c-b2db-fd36e7c38e51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 19:01:58.215420 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:58.215379 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" podStartSLOduration=6.215367416 podStartE2EDuration="6.215367416s" podCreationTimestamp="2026-04-16 19:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:01:58.214267447 +0000 UTC m=+1895.001603704" watchObservedRunningTime="2026-04-16 19:01:58.215367416 +0000 UTC m=+1895.002703672" Apr 16 19:01:59.200701 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:01:59.200663 2578 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" podUID="d141a9ba-91a0-479c-b2db-fd36e7c38e51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 19:02:09.201589 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:02:09.201542 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" podUID="d141a9ba-91a0-479c-b2db-fd36e7c38e51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 19:02:19.201146 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:02:19.201095 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" podUID="d141a9ba-91a0-479c-b2db-fd36e7c38e51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 19:02:29.201630 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:02:29.201586 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" podUID="d141a9ba-91a0-479c-b2db-fd36e7c38e51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 19:02:39.200893 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:02:39.200854 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" podUID="d141a9ba-91a0-479c-b2db-fd36e7c38e51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 19:02:49.201613 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:02:49.201575 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" 
podUID="d141a9ba-91a0-479c-b2db-fd36e7c38e51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 19:02:59.200674 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:02:59.200630 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" podUID="d141a9ba-91a0-479c-b2db-fd36e7c38e51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 19:03:06.838342 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:06.838304 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" Apr 16 19:03:12.380632 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:12.380551 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2"] Apr 16 19:03:12.384015 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:12.383999 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2" Apr 16 19:03:12.386910 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:12.386884 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 19:03:12.387033 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:12.386911 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-a68e7d\"" Apr 16 19:03:12.387033 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:12.386911 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-a68e7d-dockercfg-t787b\"" Apr 16 19:03:12.390626 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:12.390607 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2"] Apr 16 19:03:12.522895 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:12.522862 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/316749c8-6e39-443e-875c-99e98f785c1f-cabundle-cert\") pod \"isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2\" (UID: \"316749c8-6e39-443e-875c-99e98f785c1f\") " pod="kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2" Apr 16 19:03:12.522895 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:12.522903 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/316749c8-6e39-443e-875c-99e98f785c1f-kserve-provision-location\") pod \"isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2\" (UID: \"316749c8-6e39-443e-875c-99e98f785c1f\") " pod="kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2" Apr 16 19:03:12.624220 ip-10-0-139-33 kubenswrapper[2578]: I0416 
19:03:12.624165 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/316749c8-6e39-443e-875c-99e98f785c1f-cabundle-cert\") pod \"isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2\" (UID: \"316749c8-6e39-443e-875c-99e98f785c1f\") " pod="kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2" Apr 16 19:03:12.624389 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:12.624230 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/316749c8-6e39-443e-875c-99e98f785c1f-kserve-provision-location\") pod \"isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2\" (UID: \"316749c8-6e39-443e-875c-99e98f785c1f\") " pod="kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2" Apr 16 19:03:12.624603 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:12.624587 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/316749c8-6e39-443e-875c-99e98f785c1f-kserve-provision-location\") pod \"isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2\" (UID: \"316749c8-6e39-443e-875c-99e98f785c1f\") " pod="kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2" Apr 16 19:03:12.624771 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:12.624752 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/316749c8-6e39-443e-875c-99e98f785c1f-cabundle-cert\") pod \"isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2\" (UID: \"316749c8-6e39-443e-875c-99e98f785c1f\") " pod="kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2" Apr 16 19:03:12.695671 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:12.695590 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2" Apr 16 19:03:12.811913 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:12.811867 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2"] Apr 16 19:03:12.813712 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:03:12.813687 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod316749c8_6e39_443e_875c_99e98f785c1f.slice/crio-3b4841040e005e7a5d998fcac9ec3e81fefe89034c869bb1d25dc7452ffe92d6 WatchSource:0}: Error finding container 3b4841040e005e7a5d998fcac9ec3e81fefe89034c869bb1d25dc7452ffe92d6: Status 404 returned error can't find the container with id 3b4841040e005e7a5d998fcac9ec3e81fefe89034c869bb1d25dc7452ffe92d6 Apr 16 19:03:13.453565 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:13.453520 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2" event={"ID":"316749c8-6e39-443e-875c-99e98f785c1f","Type":"ContainerStarted","Data":"2bb8270016c7bc3115ccaa87813eacf41b2b124dde1bcd83e5e7e4eb55e8caf0"} Apr 16 19:03:13.453565 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:13.453569 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2" event={"ID":"316749c8-6e39-443e-875c-99e98f785c1f","Type":"ContainerStarted","Data":"3b4841040e005e7a5d998fcac9ec3e81fefe89034c869bb1d25dc7452ffe92d6"} Apr 16 19:03:17.468342 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:17.468308 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2_316749c8-6e39-443e-875c-99e98f785c1f/storage-initializer/0.log" Apr 16 19:03:17.468731 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:17.468347 2578 generic.go:358] "Generic (PLEG): container 
finished" podID="316749c8-6e39-443e-875c-99e98f785c1f" containerID="2bb8270016c7bc3115ccaa87813eacf41b2b124dde1bcd83e5e7e4eb55e8caf0" exitCode=1 Apr 16 19:03:17.468731 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:17.468432 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2" event={"ID":"316749c8-6e39-443e-875c-99e98f785c1f","Type":"ContainerDied","Data":"2bb8270016c7bc3115ccaa87813eacf41b2b124dde1bcd83e5e7e4eb55e8caf0"} Apr 16 19:03:18.473655 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:18.473627 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2_316749c8-6e39-443e-875c-99e98f785c1f/storage-initializer/0.log" Apr 16 19:03:18.474031 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:18.473735 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2" event={"ID":"316749c8-6e39-443e-875c-99e98f785c1f","Type":"ContainerStarted","Data":"4899187b750035e57e778a5e961b89b883c4c808c810f7cdc754f78f056d650b"} Apr 16 19:03:25.495950 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:25.495923 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2_316749c8-6e39-443e-875c-99e98f785c1f/storage-initializer/1.log" Apr 16 19:03:25.496371 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:25.496289 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2_316749c8-6e39-443e-875c-99e98f785c1f/storage-initializer/0.log" Apr 16 19:03:25.496371 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:25.496318 2578 generic.go:358] "Generic (PLEG): container finished" podID="316749c8-6e39-443e-875c-99e98f785c1f" containerID="4899187b750035e57e778a5e961b89b883c4c808c810f7cdc754f78f056d650b" exitCode=1 Apr 16 19:03:25.496476 
ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:25.496380 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2" event={"ID":"316749c8-6e39-443e-875c-99e98f785c1f","Type":"ContainerDied","Data":"4899187b750035e57e778a5e961b89b883c4c808c810f7cdc754f78f056d650b"} Apr 16 19:03:25.496476 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:25.496410 2578 scope.go:117] "RemoveContainer" containerID="2bb8270016c7bc3115ccaa87813eacf41b2b124dde1bcd83e5e7e4eb55e8caf0" Apr 16 19:03:25.496812 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:25.496784 2578 scope.go:117] "RemoveContainer" containerID="2bb8270016c7bc3115ccaa87813eacf41b2b124dde1bcd83e5e7e4eb55e8caf0" Apr 16 19:03:25.506716 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:03:25.506686 2578 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2_kserve-ci-e2e-test_316749c8-6e39-443e-875c-99e98f785c1f_0 in pod sandbox 3b4841040e005e7a5d998fcac9ec3e81fefe89034c869bb1d25dc7452ffe92d6 from index: no such id: '2bb8270016c7bc3115ccaa87813eacf41b2b124dde1bcd83e5e7e4eb55e8caf0'" containerID="2bb8270016c7bc3115ccaa87813eacf41b2b124dde1bcd83e5e7e4eb55e8caf0" Apr 16 19:03:25.506799 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:25.506726 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bb8270016c7bc3115ccaa87813eacf41b2b124dde1bcd83e5e7e4eb55e8caf0"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2_kserve-ci-e2e-test_316749c8-6e39-443e-875c-99e98f785c1f_0 in pod sandbox 3b4841040e005e7a5d998fcac9ec3e81fefe89034c869bb1d25dc7452ffe92d6 from index: no such id: '2bb8270016c7bc3115ccaa87813eacf41b2b124dde1bcd83e5e7e4eb55e8caf0'" Apr 16 19:03:25.506928 ip-10-0-139-33 kubenswrapper[2578]: 
E0416 19:03:25.506910 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2_kserve-ci-e2e-test(316749c8-6e39-443e-875c-99e98f785c1f)\"" pod="kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2" podUID="316749c8-6e39-443e-875c-99e98f785c1f" Apr 16 19:03:26.500804 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:26.500775 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2_316749c8-6e39-443e-875c-99e98f785c1f/storage-initializer/1.log" Apr 16 19:03:28.483007 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.482969 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2"] Apr 16 19:03:28.570853 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.570817 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7"] Apr 16 19:03:28.571430 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.571395 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" podUID="d141a9ba-91a0-479c-b2db-fd36e7c38e51" containerName="kserve-container" containerID="cri-o://ca9e347c89ff45a5a853047bfe1ef013f2e1c39203eddcea8625208c4aede9c0" gracePeriod=30 Apr 16 19:03:28.632271 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.632251 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2_316749c8-6e39-443e-875c-99e98f785c1f/storage-initializer/1.log" Apr 16 19:03:28.632389 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.632312 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2" Apr 16 19:03:28.645737 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.645710 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92"] Apr 16 19:03:28.646079 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.646067 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="316749c8-6e39-443e-875c-99e98f785c1f" containerName="storage-initializer" Apr 16 19:03:28.646116 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.646082 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="316749c8-6e39-443e-875c-99e98f785c1f" containerName="storage-initializer" Apr 16 19:03:28.646116 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.646089 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="316749c8-6e39-443e-875c-99e98f785c1f" containerName="storage-initializer" Apr 16 19:03:28.646116 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.646095 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="316749c8-6e39-443e-875c-99e98f785c1f" containerName="storage-initializer" Apr 16 19:03:28.646243 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.646151 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="316749c8-6e39-443e-875c-99e98f785c1f" containerName="storage-initializer" Apr 16 19:03:28.646243 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.646161 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="316749c8-6e39-443e-875c-99e98f785c1f" containerName="storage-initializer" Apr 16 19:03:28.650394 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.650372 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92" Apr 16 19:03:28.653014 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.652994 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-b3b032\"" Apr 16 19:03:28.653567 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.653552 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-b3b032-dockercfg-jvpsx\"" Apr 16 19:03:28.660392 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.660370 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92"] Apr 16 19:03:28.764121 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.764025 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/316749c8-6e39-443e-875c-99e98f785c1f-kserve-provision-location\") pod \"316749c8-6e39-443e-875c-99e98f785c1f\" (UID: \"316749c8-6e39-443e-875c-99e98f785c1f\") " Apr 16 19:03:28.764308 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.764147 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/316749c8-6e39-443e-875c-99e98f785c1f-cabundle-cert\") pod \"316749c8-6e39-443e-875c-99e98f785c1f\" (UID: \"316749c8-6e39-443e-875c-99e98f785c1f\") " Apr 16 19:03:28.764386 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.764304 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42f4da98-5207-44d5-a92d-837aea5a667d-kserve-provision-location\") pod \"isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92\" (UID: \"42f4da98-5207-44d5-a92d-837aea5a667d\") " 
pod="kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92" Apr 16 19:03:28.764386 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.764323 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/316749c8-6e39-443e-875c-99e98f785c1f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "316749c8-6e39-443e-875c-99e98f785c1f" (UID: "316749c8-6e39-443e-875c-99e98f785c1f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:03:28.764492 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.764476 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/42f4da98-5207-44d5-a92d-837aea5a667d-cabundle-cert\") pod \"isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92\" (UID: \"42f4da98-5207-44d5-a92d-837aea5a667d\") " pod="kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92" Apr 16 19:03:28.764534 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.764485 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/316749c8-6e39-443e-875c-99e98f785c1f-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "316749c8-6e39-443e-875c-99e98f785c1f" (UID: "316749c8-6e39-443e-875c-99e98f785c1f"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:03:28.764576 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.764544 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/316749c8-6e39-443e-875c-99e98f785c1f-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:03:28.764576 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.764563 2578 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/316749c8-6e39-443e-875c-99e98f785c1f-cabundle-cert\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:03:28.865253 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.865219 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/42f4da98-5207-44d5-a92d-837aea5a667d-cabundle-cert\") pod \"isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92\" (UID: \"42f4da98-5207-44d5-a92d-837aea5a667d\") " pod="kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92" Apr 16 19:03:28.865469 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.865267 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42f4da98-5207-44d5-a92d-837aea5a667d-kserve-provision-location\") pod \"isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92\" (UID: \"42f4da98-5207-44d5-a92d-837aea5a667d\") " pod="kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92" Apr 16 19:03:28.865684 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.865668 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42f4da98-5207-44d5-a92d-837aea5a667d-kserve-provision-location\") pod \"isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92\" (UID: 
\"42f4da98-5207-44d5-a92d-837aea5a667d\") " pod="kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92" Apr 16 19:03:28.865862 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.865845 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/42f4da98-5207-44d5-a92d-837aea5a667d-cabundle-cert\") pod \"isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92\" (UID: \"42f4da98-5207-44d5-a92d-837aea5a667d\") " pod="kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92" Apr 16 19:03:28.961543 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:28.961512 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92" Apr 16 19:03:29.096028 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:29.095988 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92"] Apr 16 19:03:29.103471 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:03:29.103440 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42f4da98_5207_44d5_a92d_837aea5a667d.slice/crio-89abc105bc399a46e93e711621b6af04be8a08b977ffc66f0c030a5c9d430b56 WatchSource:0}: Error finding container 89abc105bc399a46e93e711621b6af04be8a08b977ffc66f0c030a5c9d430b56: Status 404 returned error can't find the container with id 89abc105bc399a46e93e711621b6af04be8a08b977ffc66f0c030a5c9d430b56 Apr 16 19:03:29.511862 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:29.511831 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2_316749c8-6e39-443e-875c-99e98f785c1f/storage-initializer/1.log" Apr 16 19:03:29.512374 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:29.511969 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2" event={"ID":"316749c8-6e39-443e-875c-99e98f785c1f","Type":"ContainerDied","Data":"3b4841040e005e7a5d998fcac9ec3e81fefe89034c869bb1d25dc7452ffe92d6"} Apr 16 19:03:29.512374 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:29.511980 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2" Apr 16 19:03:29.512374 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:29.512021 2578 scope.go:117] "RemoveContainer" containerID="4899187b750035e57e778a5e961b89b883c4c808c810f7cdc754f78f056d650b" Apr 16 19:03:29.513436 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:29.513401 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92" event={"ID":"42f4da98-5207-44d5-a92d-837aea5a667d","Type":"ContainerStarted","Data":"cb2ef4a78ff3979f9e6f1826a703b6b8341d0bf107d89de7bd4ece9e97eb7c08"} Apr 16 19:03:29.513614 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:29.513443 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92" event={"ID":"42f4da98-5207-44d5-a92d-837aea5a667d","Type":"ContainerStarted","Data":"89abc105bc399a46e93e711621b6af04be8a08b977ffc66f0c030a5c9d430b56"} Apr 16 19:03:29.585560 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:29.585522 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2"] Apr 16 19:03:29.591364 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:29.591333 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a68e7d-predictor-7f7d945c6c-r7zr2"] Apr 16 19:03:29.841260 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:29.841162 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="316749c8-6e39-443e-875c-99e98f785c1f" 
path="/var/lib/kubelet/pods/316749c8-6e39-443e-875c-99e98f785c1f/volumes" Apr 16 19:03:33.129946 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:33.129890 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" Apr 16 19:03:33.307535 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:33.307501 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d141a9ba-91a0-479c-b2db-fd36e7c38e51-kserve-provision-location\") pod \"d141a9ba-91a0-479c-b2db-fd36e7c38e51\" (UID: \"d141a9ba-91a0-479c-b2db-fd36e7c38e51\") " Apr 16 19:03:33.307832 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:33.307810 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d141a9ba-91a0-479c-b2db-fd36e7c38e51-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d141a9ba-91a0-479c-b2db-fd36e7c38e51" (UID: "d141a9ba-91a0-479c-b2db-fd36e7c38e51"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:03:33.408386 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:33.408302 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d141a9ba-91a0-479c-b2db-fd36e7c38e51-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:03:33.529678 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:33.529646 2578 generic.go:358] "Generic (PLEG): container finished" podID="d141a9ba-91a0-479c-b2db-fd36e7c38e51" containerID="ca9e347c89ff45a5a853047bfe1ef013f2e1c39203eddcea8625208c4aede9c0" exitCode=0 Apr 16 19:03:33.529853 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:33.529701 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" event={"ID":"d141a9ba-91a0-479c-b2db-fd36e7c38e51","Type":"ContainerDied","Data":"ca9e347c89ff45a5a853047bfe1ef013f2e1c39203eddcea8625208c4aede9c0"} Apr 16 19:03:33.529853 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:33.529724 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" event={"ID":"d141a9ba-91a0-479c-b2db-fd36e7c38e51","Type":"ContainerDied","Data":"4a4251cea0c1f48b4bb91aa01ba115dceeaee0aa4b6d5c09ac482abe0bef5378"} Apr 16 19:03:33.529853 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:33.529731 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7" Apr 16 19:03:33.529853 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:33.529740 2578 scope.go:117] "RemoveContainer" containerID="ca9e347c89ff45a5a853047bfe1ef013f2e1c39203eddcea8625208c4aede9c0" Apr 16 19:03:33.538339 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:33.538318 2578 scope.go:117] "RemoveContainer" containerID="7793f5b393aaf29b2ce8b50f27b70d9e5d00a66ba6506f35c4e782419e2ba94b" Apr 16 19:03:33.546973 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:33.546955 2578 scope.go:117] "RemoveContainer" containerID="ca9e347c89ff45a5a853047bfe1ef013f2e1c39203eddcea8625208c4aede9c0" Apr 16 19:03:33.547237 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:03:33.547214 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca9e347c89ff45a5a853047bfe1ef013f2e1c39203eddcea8625208c4aede9c0\": container with ID starting with ca9e347c89ff45a5a853047bfe1ef013f2e1c39203eddcea8625208c4aede9c0 not found: ID does not exist" containerID="ca9e347c89ff45a5a853047bfe1ef013f2e1c39203eddcea8625208c4aede9c0" Apr 16 19:03:33.547306 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:33.547248 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9e347c89ff45a5a853047bfe1ef013f2e1c39203eddcea8625208c4aede9c0"} err="failed to get container status \"ca9e347c89ff45a5a853047bfe1ef013f2e1c39203eddcea8625208c4aede9c0\": rpc error: code = NotFound desc = could not find container \"ca9e347c89ff45a5a853047bfe1ef013f2e1c39203eddcea8625208c4aede9c0\": container with ID starting with ca9e347c89ff45a5a853047bfe1ef013f2e1c39203eddcea8625208c4aede9c0 not found: ID does not exist" Apr 16 19:03:33.547306 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:33.547271 2578 scope.go:117] "RemoveContainer" containerID="7793f5b393aaf29b2ce8b50f27b70d9e5d00a66ba6506f35c4e782419e2ba94b" Apr 16 
19:03:33.547507 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:03:33.547491 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7793f5b393aaf29b2ce8b50f27b70d9e5d00a66ba6506f35c4e782419e2ba94b\": container with ID starting with 7793f5b393aaf29b2ce8b50f27b70d9e5d00a66ba6506f35c4e782419e2ba94b not found: ID does not exist" containerID="7793f5b393aaf29b2ce8b50f27b70d9e5d00a66ba6506f35c4e782419e2ba94b" Apr 16 19:03:33.547548 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:33.547514 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7793f5b393aaf29b2ce8b50f27b70d9e5d00a66ba6506f35c4e782419e2ba94b"} err="failed to get container status \"7793f5b393aaf29b2ce8b50f27b70d9e5d00a66ba6506f35c4e782419e2ba94b\": rpc error: code = NotFound desc = could not find container \"7793f5b393aaf29b2ce8b50f27b70d9e5d00a66ba6506f35c4e782419e2ba94b\": container with ID starting with 7793f5b393aaf29b2ce8b50f27b70d9e5d00a66ba6506f35c4e782419e2ba94b not found: ID does not exist" Apr 16 19:03:33.550947 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:33.550928 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7"] Apr 16 19:03:33.556010 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:33.555991 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a68e7d-predictor-655664fd94-frkx7"] Apr 16 19:03:33.841561 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:33.841527 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d141a9ba-91a0-479c-b2db-fd36e7c38e51" path="/var/lib/kubelet/pods/d141a9ba-91a0-479c-b2db-fd36e7c38e51/volumes" Apr 16 19:03:34.534284 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:34.534257 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92_42f4da98-5207-44d5-a92d-837aea5a667d/storage-initializer/0.log" Apr 16 19:03:34.534627 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:34.534297 2578 generic.go:358] "Generic (PLEG): container finished" podID="42f4da98-5207-44d5-a92d-837aea5a667d" containerID="cb2ef4a78ff3979f9e6f1826a703b6b8341d0bf107d89de7bd4ece9e97eb7c08" exitCode=1 Apr 16 19:03:34.534627 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:34.534377 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92" event={"ID":"42f4da98-5207-44d5-a92d-837aea5a667d","Type":"ContainerDied","Data":"cb2ef4a78ff3979f9e6f1826a703b6b8341d0bf107d89de7bd4ece9e97eb7c08"} Apr 16 19:03:35.539885 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:35.539859 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92_42f4da98-5207-44d5-a92d-837aea5a667d/storage-initializer/0.log" Apr 16 19:03:35.540276 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:35.539972 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92" event={"ID":"42f4da98-5207-44d5-a92d-837aea5a667d","Type":"ContainerStarted","Data":"ed70a1f10d6fab533fc2d6a0e6320a6fbf2d9eefe7cfaee6d79b89394e8e39a5"} Apr 16 19:03:38.601911 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:38.601875 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92"] Apr 16 19:03:38.602376 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:38.602119 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92" podUID="42f4da98-5207-44d5-a92d-837aea5a667d" containerName="storage-initializer" 
containerID="cri-o://ed70a1f10d6fab533fc2d6a0e6320a6fbf2d9eefe7cfaee6d79b89394e8e39a5" gracePeriod=30 Apr 16 19:03:40.141771 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.141745 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92_42f4da98-5207-44d5-a92d-837aea5a667d/storage-initializer/1.log" Apr 16 19:03:40.142115 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.142100 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92_42f4da98-5207-44d5-a92d-837aea5a667d/storage-initializer/0.log" Apr 16 19:03:40.142192 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.142164 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92" Apr 16 19:03:40.161790 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.161759 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/42f4da98-5207-44d5-a92d-837aea5a667d-cabundle-cert\") pod \"42f4da98-5207-44d5-a92d-837aea5a667d\" (UID: \"42f4da98-5207-44d5-a92d-837aea5a667d\") " Apr 16 19:03:40.161918 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.161806 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42f4da98-5207-44d5-a92d-837aea5a667d-kserve-provision-location\") pod \"42f4da98-5207-44d5-a92d-837aea5a667d\" (UID: \"42f4da98-5207-44d5-a92d-837aea5a667d\") " Apr 16 19:03:40.162040 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.162019 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42f4da98-5207-44d5-a92d-837aea5a667d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "42f4da98-5207-44d5-a92d-837aea5a667d" 
(UID: "42f4da98-5207-44d5-a92d-837aea5a667d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:03:40.162125 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.162105 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42f4da98-5207-44d5-a92d-837aea5a667d-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "42f4da98-5207-44d5-a92d-837aea5a667d" (UID: "42f4da98-5207-44d5-a92d-837aea5a667d"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:03:40.263289 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.263207 2578 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/42f4da98-5207-44d5-a92d-837aea5a667d-cabundle-cert\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:03:40.263289 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.263239 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42f4da98-5207-44d5-a92d-837aea5a667d-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:03:40.558326 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.558240 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92_42f4da98-5207-44d5-a92d-837aea5a667d/storage-initializer/1.log" Apr 16 19:03:40.558611 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.558593 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92_42f4da98-5207-44d5-a92d-837aea5a667d/storage-initializer/0.log" Apr 16 19:03:40.558673 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.558633 2578 generic.go:358] "Generic (PLEG): container finished" podID="42f4da98-5207-44d5-a92d-837aea5a667d" 
containerID="ed70a1f10d6fab533fc2d6a0e6320a6fbf2d9eefe7cfaee6d79b89394e8e39a5" exitCode=1 Apr 16 19:03:40.558731 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.558717 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92" Apr 16 19:03:40.558791 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.558722 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92" event={"ID":"42f4da98-5207-44d5-a92d-837aea5a667d","Type":"ContainerDied","Data":"ed70a1f10d6fab533fc2d6a0e6320a6fbf2d9eefe7cfaee6d79b89394e8e39a5"} Apr 16 19:03:40.558791 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.558766 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92" event={"ID":"42f4da98-5207-44d5-a92d-837aea5a667d","Type":"ContainerDied","Data":"89abc105bc399a46e93e711621b6af04be8a08b977ffc66f0c030a5c9d430b56"} Apr 16 19:03:40.558886 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.558791 2578 scope.go:117] "RemoveContainer" containerID="ed70a1f10d6fab533fc2d6a0e6320a6fbf2d9eefe7cfaee6d79b89394e8e39a5" Apr 16 19:03:40.566856 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.566837 2578 scope.go:117] "RemoveContainer" containerID="cb2ef4a78ff3979f9e6f1826a703b6b8341d0bf107d89de7bd4ece9e97eb7c08" Apr 16 19:03:40.574142 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.574119 2578 scope.go:117] "RemoveContainer" containerID="ed70a1f10d6fab533fc2d6a0e6320a6fbf2d9eefe7cfaee6d79b89394e8e39a5" Apr 16 19:03:40.574414 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:03:40.574396 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed70a1f10d6fab533fc2d6a0e6320a6fbf2d9eefe7cfaee6d79b89394e8e39a5\": container with ID starting with 
ed70a1f10d6fab533fc2d6a0e6320a6fbf2d9eefe7cfaee6d79b89394e8e39a5 not found: ID does not exist" containerID="ed70a1f10d6fab533fc2d6a0e6320a6fbf2d9eefe7cfaee6d79b89394e8e39a5" Apr 16 19:03:40.574477 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.574424 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed70a1f10d6fab533fc2d6a0e6320a6fbf2d9eefe7cfaee6d79b89394e8e39a5"} err="failed to get container status \"ed70a1f10d6fab533fc2d6a0e6320a6fbf2d9eefe7cfaee6d79b89394e8e39a5\": rpc error: code = NotFound desc = could not find container \"ed70a1f10d6fab533fc2d6a0e6320a6fbf2d9eefe7cfaee6d79b89394e8e39a5\": container with ID starting with ed70a1f10d6fab533fc2d6a0e6320a6fbf2d9eefe7cfaee6d79b89394e8e39a5 not found: ID does not exist" Apr 16 19:03:40.574477 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.574440 2578 scope.go:117] "RemoveContainer" containerID="cb2ef4a78ff3979f9e6f1826a703b6b8341d0bf107d89de7bd4ece9e97eb7c08" Apr 16 19:03:40.574680 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:03:40.574662 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb2ef4a78ff3979f9e6f1826a703b6b8341d0bf107d89de7bd4ece9e97eb7c08\": container with ID starting with cb2ef4a78ff3979f9e6f1826a703b6b8341d0bf107d89de7bd4ece9e97eb7c08 not found: ID does not exist" containerID="cb2ef4a78ff3979f9e6f1826a703b6b8341d0bf107d89de7bd4ece9e97eb7c08" Apr 16 19:03:40.574731 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.574683 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb2ef4a78ff3979f9e6f1826a703b6b8341d0bf107d89de7bd4ece9e97eb7c08"} err="failed to get container status \"cb2ef4a78ff3979f9e6f1826a703b6b8341d0bf107d89de7bd4ece9e97eb7c08\": rpc error: code = NotFound desc = could not find container \"cb2ef4a78ff3979f9e6f1826a703b6b8341d0bf107d89de7bd4ece9e97eb7c08\": container with ID starting with 
cb2ef4a78ff3979f9e6f1826a703b6b8341d0bf107d89de7bd4ece9e97eb7c08 not found: ID does not exist" Apr 16 19:03:40.592558 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.592534 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92"] Apr 16 19:03:40.595499 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:40.595477 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b3b032-predictor-774ccfdcd7-g6z92"] Apr 16 19:03:41.841560 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:03:41.841530 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42f4da98-5207-44d5-a92d-837aea5a667d" path="/var/lib/kubelet/pods/42f4da98-5207-44d5-a92d-837aea5a667d/volumes" Apr 16 19:05:47.525850 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:05:47.525817 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 19:05:47.529588 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:05:47.529564 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 19:06:49.172895 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.172861 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g"] Apr 16 19:06:49.173378 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.173203 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42f4da98-5207-44d5-a92d-837aea5a667d" containerName="storage-initializer" Apr 16 19:06:49.173378 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.173216 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="42f4da98-5207-44d5-a92d-837aea5a667d" containerName="storage-initializer" Apr 16 19:06:49.173378 ip-10-0-139-33 
kubenswrapper[2578]: I0416 19:06:49.173230 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d141a9ba-91a0-479c-b2db-fd36e7c38e51" containerName="storage-initializer" Apr 16 19:06:49.173378 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.173236 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d141a9ba-91a0-479c-b2db-fd36e7c38e51" containerName="storage-initializer" Apr 16 19:06:49.173378 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.173243 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d141a9ba-91a0-479c-b2db-fd36e7c38e51" containerName="kserve-container" Apr 16 19:06:49.173378 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.173249 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d141a9ba-91a0-479c-b2db-fd36e7c38e51" containerName="kserve-container" Apr 16 19:06:49.173378 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.173264 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42f4da98-5207-44d5-a92d-837aea5a667d" containerName="storage-initializer" Apr 16 19:06:49.173378 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.173269 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="42f4da98-5207-44d5-a92d-837aea5a667d" containerName="storage-initializer" Apr 16 19:06:49.173378 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.173322 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="42f4da98-5207-44d5-a92d-837aea5a667d" containerName="storage-initializer" Apr 16 19:06:49.173378 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.173333 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d141a9ba-91a0-479c-b2db-fd36e7c38e51" containerName="kserve-container" Apr 16 19:06:49.173874 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.173448 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="42f4da98-5207-44d5-a92d-837aea5a667d" containerName="storage-initializer" Apr 16 
19:06:49.176352 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.176336 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" Apr 16 19:06:49.178949 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.178928 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z66mq\"" Apr 16 19:06:49.184060 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.183721 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g"] Apr 16 19:06:49.282508 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.282475 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g\" (UID: \"6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" Apr 16 19:06:49.383243 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.383208 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g\" (UID: \"6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" Apr 16 19:06:49.383510 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.383493 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g\" (UID: 
\"6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" Apr 16 19:06:49.487448 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.487416 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" Apr 16 19:06:49.607433 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.607338 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g"] Apr 16 19:06:49.610073 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:06:49.610036 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bdd8a66_f69c_48e5_8b11_a0bdfaeb5377.slice/crio-5c96d933ba3a1f72f7df01ca1b035830d8fcad0ffd7ceb209addc9d91835392c WatchSource:0}: Error finding container 5c96d933ba3a1f72f7df01ca1b035830d8fcad0ffd7ceb209addc9d91835392c: Status 404 returned error can't find the container with id 5c96d933ba3a1f72f7df01ca1b035830d8fcad0ffd7ceb209addc9d91835392c Apr 16 19:06:49.611911 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:49.611891 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:06:50.173811 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:50.173774 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" event={"ID":"6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377","Type":"ContainerStarted","Data":"a1dbe59c09693f079a7d496a4d827fc1c2b08d204a31f2512dd1f21cc544b168"} Apr 16 19:06:50.173811 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:50.173815 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" 
event={"ID":"6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377","Type":"ContainerStarted","Data":"5c96d933ba3a1f72f7df01ca1b035830d8fcad0ffd7ceb209addc9d91835392c"} Apr 16 19:06:54.186290 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:54.186258 2578 generic.go:358] "Generic (PLEG): container finished" podID="6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" containerID="a1dbe59c09693f079a7d496a4d827fc1c2b08d204a31f2512dd1f21cc544b168" exitCode=0 Apr 16 19:06:54.186675 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:06:54.186333 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" event={"ID":"6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377","Type":"ContainerDied","Data":"a1dbe59c09693f079a7d496a4d827fc1c2b08d204a31f2512dd1f21cc544b168"} Apr 16 19:07:14.263924 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:07:14.263883 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" event={"ID":"6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377","Type":"ContainerStarted","Data":"f0e5d6bfc0c26b35c485cd280e3aaa8a13476b1bee4bce9ea52ff7ec6bebe6d3"} Apr 16 19:07:14.264440 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:07:14.264245 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" Apr 16 19:07:14.265149 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:07:14.265114 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" podUID="6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 16 19:07:14.280323 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:07:14.280252 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" 
podStartSLOduration=6.13039412 podStartE2EDuration="25.280235674s" podCreationTimestamp="2026-04-16 19:06:49 +0000 UTC" firstStartedPulling="2026-04-16 19:06:54.187422454 +0000 UTC m=+2190.974758687" lastFinishedPulling="2026-04-16 19:07:13.337264007 +0000 UTC m=+2210.124600241" observedRunningTime="2026-04-16 19:07:14.27975294 +0000 UTC m=+2211.067089208" watchObservedRunningTime="2026-04-16 19:07:14.280235674 +0000 UTC m=+2211.067571933" Apr 16 19:07:15.267825 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:07:15.267786 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" podUID="6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 16 19:07:25.268478 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:07:25.268435 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" podUID="6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 16 19:07:35.267896 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:07:35.267854 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" podUID="6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 16 19:07:45.267916 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:07:45.267833 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" podUID="6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 16 19:07:55.268262 
ip-10-0-139-33 kubenswrapper[2578]: I0416 19:07:55.268211 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" podUID="6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 16 19:08:05.268418 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:05.268377 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" podUID="6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 16 19:08:15.268097 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:15.268053 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" podUID="6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 16 19:08:25.269259 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:25.269231 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" Apr 16 19:08:29.456899 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:29.456868 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g"] Apr 16 19:08:29.457347 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:29.457130 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" podUID="6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" containerName="kserve-container" containerID="cri-o://f0e5d6bfc0c26b35c485cd280e3aaa8a13476b1bee4bce9ea52ff7ec6bebe6d3" gracePeriod=30 Apr 16 19:08:34.523353 
ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:34.523312 2578 generic.go:358] "Generic (PLEG): container finished" podID="6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" containerID="f0e5d6bfc0c26b35c485cd280e3aaa8a13476b1bee4bce9ea52ff7ec6bebe6d3" exitCode=0 Apr 16 19:08:34.523731 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:34.523384 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" event={"ID":"6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377","Type":"ContainerDied","Data":"f0e5d6bfc0c26b35c485cd280e3aaa8a13476b1bee4bce9ea52ff7ec6bebe6d3"} Apr 16 19:08:34.792993 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:34.792964 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" Apr 16 19:08:34.964686 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:34.964639 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377-kserve-provision-location\") pod \"6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377\" (UID: \"6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377\") " Apr 16 19:08:34.965063 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:34.965036 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" (UID: "6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:08:35.066238 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:35.066122 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:08:35.528424 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:35.528383 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" event={"ID":"6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377","Type":"ContainerDied","Data":"5c96d933ba3a1f72f7df01ca1b035830d8fcad0ffd7ceb209addc9d91835392c"} Apr 16 19:08:35.528424 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:35.528420 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g" Apr 16 19:08:35.528918 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:35.528435 2578 scope.go:117] "RemoveContainer" containerID="f0e5d6bfc0c26b35c485cd280e3aaa8a13476b1bee4bce9ea52ff7ec6bebe6d3" Apr 16 19:08:35.537303 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:35.537288 2578 scope.go:117] "RemoveContainer" containerID="a1dbe59c09693f079a7d496a4d827fc1c2b08d204a31f2512dd1f21cc544b168" Apr 16 19:08:35.551448 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:35.551419 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g"] Apr 16 19:08:35.556558 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:35.556532 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-9jt7g"] Apr 16 19:08:35.841606 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:08:35.841521 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" path="/var/lib/kubelet/pods/6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377/volumes" Apr 16 19:10:47.548644 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:10:47.548539 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 19:10:47.553157 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:10:47.553138 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 19:13:02.294210 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:02.294159 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz"] Apr 16 19:13:02.294726 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:02.294532 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" containerName="kserve-container" Apr 16 19:13:02.294726 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:02.294545 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" containerName="kserve-container" Apr 16 19:13:02.294726 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:02.294555 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" containerName="storage-initializer" Apr 16 19:13:02.294726 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:02.294562 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" containerName="storage-initializer" Apr 16 19:13:02.294726 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:02.294617 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6bdd8a66-f69c-48e5-8b11-a0bdfaeb5377" containerName="kserve-container" Apr 16 19:13:02.297518 ip-10-0-139-33 
kubenswrapper[2578]: I0416 19:13:02.297484 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" Apr 16 19:13:02.300136 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:02.300112 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z66mq\"" Apr 16 19:13:02.305419 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:02.305398 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz"] Apr 16 19:13:02.314148 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:02.314120 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85724ee6-d268-44c8-8ef2-aa2c41e5e59c-kserve-provision-location\") pod \"isvc-sklearn-predictor-59d84b47f5-p67pz\" (UID: \"85724ee6-d268-44c8-8ef2-aa2c41e5e59c\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" Apr 16 19:13:02.415318 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:02.415282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85724ee6-d268-44c8-8ef2-aa2c41e5e59c-kserve-provision-location\") pod \"isvc-sklearn-predictor-59d84b47f5-p67pz\" (UID: \"85724ee6-d268-44c8-8ef2-aa2c41e5e59c\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" Apr 16 19:13:02.415701 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:02.415679 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85724ee6-d268-44c8-8ef2-aa2c41e5e59c-kserve-provision-location\") pod \"isvc-sklearn-predictor-59d84b47f5-p67pz\" (UID: \"85724ee6-d268-44c8-8ef2-aa2c41e5e59c\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" Apr 16 
19:13:02.609977 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:02.609889 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" Apr 16 19:13:02.727345 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:02.727303 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz"] Apr 16 19:13:02.730005 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:13:02.729963 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85724ee6_d268_44c8_8ef2_aa2c41e5e59c.slice/crio-6ae7a0047daceeefbd39a201e39248428d23130e9d581035eef6187b72d99967 WatchSource:0}: Error finding container 6ae7a0047daceeefbd39a201e39248428d23130e9d581035eef6187b72d99967: Status 404 returned error can't find the container with id 6ae7a0047daceeefbd39a201e39248428d23130e9d581035eef6187b72d99967 Apr 16 19:13:02.732289 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:02.732264 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:13:03.381461 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:03.381417 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" event={"ID":"85724ee6-d268-44c8-8ef2-aa2c41e5e59c","Type":"ContainerStarted","Data":"057a98186cbacea3cc2f8f0601880e4928d19cff1f37d068b55dcc8aac94ae6b"} Apr 16 19:13:03.381461 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:03.381460 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" event={"ID":"85724ee6-d268-44c8-8ef2-aa2c41e5e59c","Type":"ContainerStarted","Data":"6ae7a0047daceeefbd39a201e39248428d23130e9d581035eef6187b72d99967"} Apr 16 19:13:07.396685 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:07.396649 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="85724ee6-d268-44c8-8ef2-aa2c41e5e59c" containerID="057a98186cbacea3cc2f8f0601880e4928d19cff1f37d068b55dcc8aac94ae6b" exitCode=0 Apr 16 19:13:07.397082 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:07.396715 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" event={"ID":"85724ee6-d268-44c8-8ef2-aa2c41e5e59c","Type":"ContainerDied","Data":"057a98186cbacea3cc2f8f0601880e4928d19cff1f37d068b55dcc8aac94ae6b"} Apr 16 19:13:08.402071 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:08.402037 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" event={"ID":"85724ee6-d268-44c8-8ef2-aa2c41e5e59c","Type":"ContainerStarted","Data":"d76a5ec73c527d50da24c72d176304cc79102c2a57316efa1f85d02024f9ffbb"} Apr 16 19:13:08.402491 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:08.402399 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" Apr 16 19:13:08.403764 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:08.403736 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" podUID="85724ee6-d268-44c8-8ef2-aa2c41e5e59c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 19:13:08.418880 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:08.418840 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" podStartSLOduration=6.418826458 podStartE2EDuration="6.418826458s" podCreationTimestamp="2026-04-16 19:13:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:13:08.417589274 +0000 UTC m=+2565.204925531" watchObservedRunningTime="2026-04-16 
19:13:08.418826458 +0000 UTC m=+2565.206162713" Apr 16 19:13:09.405194 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:09.405147 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" podUID="85724ee6-d268-44c8-8ef2-aa2c41e5e59c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 19:13:19.405846 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:19.405808 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" podUID="85724ee6-d268-44c8-8ef2-aa2c41e5e59c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 19:13:29.405862 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:29.405815 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" podUID="85724ee6-d268-44c8-8ef2-aa2c41e5e59c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 19:13:39.406033 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:39.405990 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" podUID="85724ee6-d268-44c8-8ef2-aa2c41e5e59c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 19:13:49.405218 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:49.405105 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" podUID="85724ee6-d268-44c8-8ef2-aa2c41e5e59c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 19:13:59.405820 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:13:59.405778 2578 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" podUID="85724ee6-d268-44c8-8ef2-aa2c41e5e59c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 19:14:09.405283 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:09.405244 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" podUID="85724ee6-d268-44c8-8ef2-aa2c41e5e59c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 16 19:14:17.841263 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:17.841236 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" Apr 16 19:14:22.441400 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:22.441368 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz"] Apr 16 19:14:22.441802 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:22.441647 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" podUID="85724ee6-d268-44c8-8ef2-aa2c41e5e59c" containerName="kserve-container" containerID="cri-o://d76a5ec73c527d50da24c72d176304cc79102c2a57316efa1f85d02024f9ffbb" gracePeriod=30 Apr 16 19:14:22.482669 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:22.482637 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps"] Apr 16 19:14:22.485988 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:22.485972 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" Apr 16 19:14:22.495049 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:22.495026 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps"] Apr 16 19:14:22.577013 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:22.576986 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7112717-d106-46bc-9fb6-9ecf432133d9-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-5fdf655847-wsqps\" (UID: \"a7112717-d106-46bc-9fb6-9ecf432133d9\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" Apr 16 19:14:22.678510 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:22.678473 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7112717-d106-46bc-9fb6-9ecf432133d9-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-5fdf655847-wsqps\" (UID: \"a7112717-d106-46bc-9fb6-9ecf432133d9\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" Apr 16 19:14:22.678879 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:22.678855 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7112717-d106-46bc-9fb6-9ecf432133d9-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-5fdf655847-wsqps\" (UID: \"a7112717-d106-46bc-9fb6-9ecf432133d9\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" Apr 16 19:14:22.796808 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:22.796782 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" Apr 16 19:14:22.917247 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:22.917218 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps"] Apr 16 19:14:22.919977 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:14:22.919944 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7112717_d106_46bc_9fb6_9ecf432133d9.slice/crio-79775552d1484f4699f95d3f3075332c5b66d4e948dbaa155837d176b4ad73b2 WatchSource:0}: Error finding container 79775552d1484f4699f95d3f3075332c5b66d4e948dbaa155837d176b4ad73b2: Status 404 returned error can't find the container with id 79775552d1484f4699f95d3f3075332c5b66d4e948dbaa155837d176b4ad73b2 Apr 16 19:14:23.662194 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:23.662156 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" event={"ID":"a7112717-d106-46bc-9fb6-9ecf432133d9","Type":"ContainerStarted","Data":"43ff7db592da06002cb4c119d02e4e94172c80856d670c1b7c9367efb276c629"} Apr 16 19:14:23.662194 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:23.662200 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" event={"ID":"a7112717-d106-46bc-9fb6-9ecf432133d9","Type":"ContainerStarted","Data":"79775552d1484f4699f95d3f3075332c5b66d4e948dbaa155837d176b4ad73b2"} Apr 16 19:14:26.587627 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:26.587602 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" Apr 16 19:14:26.672245 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:26.672217 2578 generic.go:358] "Generic (PLEG): container finished" podID="a7112717-d106-46bc-9fb6-9ecf432133d9" containerID="43ff7db592da06002cb4c119d02e4e94172c80856d670c1b7c9367efb276c629" exitCode=0 Apr 16 19:14:26.672371 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:26.672285 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" event={"ID":"a7112717-d106-46bc-9fb6-9ecf432133d9","Type":"ContainerDied","Data":"43ff7db592da06002cb4c119d02e4e94172c80856d670c1b7c9367efb276c629"} Apr 16 19:14:26.673766 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:26.673742 2578 generic.go:358] "Generic (PLEG): container finished" podID="85724ee6-d268-44c8-8ef2-aa2c41e5e59c" containerID="d76a5ec73c527d50da24c72d176304cc79102c2a57316efa1f85d02024f9ffbb" exitCode=0 Apr 16 19:14:26.673891 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:26.673799 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" Apr 16 19:14:26.673891 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:26.673800 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" event={"ID":"85724ee6-d268-44c8-8ef2-aa2c41e5e59c","Type":"ContainerDied","Data":"d76a5ec73c527d50da24c72d176304cc79102c2a57316efa1f85d02024f9ffbb"} Apr 16 19:14:26.673979 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:26.673892 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz" event={"ID":"85724ee6-d268-44c8-8ef2-aa2c41e5e59c","Type":"ContainerDied","Data":"6ae7a0047daceeefbd39a201e39248428d23130e9d581035eef6187b72d99967"} Apr 16 19:14:26.673979 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:26.673913 2578 scope.go:117] "RemoveContainer" containerID="d76a5ec73c527d50da24c72d176304cc79102c2a57316efa1f85d02024f9ffbb" Apr 16 19:14:26.709576 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:26.709556 2578 scope.go:117] "RemoveContainer" containerID="057a98186cbacea3cc2f8f0601880e4928d19cff1f37d068b55dcc8aac94ae6b" Apr 16 19:14:26.713830 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:26.713811 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85724ee6-d268-44c8-8ef2-aa2c41e5e59c-kserve-provision-location\") pod \"85724ee6-d268-44c8-8ef2-aa2c41e5e59c\" (UID: \"85724ee6-d268-44c8-8ef2-aa2c41e5e59c\") " Apr 16 19:14:26.714068 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:26.714048 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85724ee6-d268-44c8-8ef2-aa2c41e5e59c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "85724ee6-d268-44c8-8ef2-aa2c41e5e59c" (UID: "85724ee6-d268-44c8-8ef2-aa2c41e5e59c"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:14:26.724261 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:26.724241 2578 scope.go:117] "RemoveContainer" containerID="d76a5ec73c527d50da24c72d176304cc79102c2a57316efa1f85d02024f9ffbb" Apr 16 19:14:26.724646 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:14:26.724615 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d76a5ec73c527d50da24c72d176304cc79102c2a57316efa1f85d02024f9ffbb\": container with ID starting with d76a5ec73c527d50da24c72d176304cc79102c2a57316efa1f85d02024f9ffbb not found: ID does not exist" containerID="d76a5ec73c527d50da24c72d176304cc79102c2a57316efa1f85d02024f9ffbb" Apr 16 19:14:26.724781 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:26.724657 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76a5ec73c527d50da24c72d176304cc79102c2a57316efa1f85d02024f9ffbb"} err="failed to get container status \"d76a5ec73c527d50da24c72d176304cc79102c2a57316efa1f85d02024f9ffbb\": rpc error: code = NotFound desc = could not find container \"d76a5ec73c527d50da24c72d176304cc79102c2a57316efa1f85d02024f9ffbb\": container with ID starting with d76a5ec73c527d50da24c72d176304cc79102c2a57316efa1f85d02024f9ffbb not found: ID does not exist" Apr 16 19:14:26.724781 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:26.724681 2578 scope.go:117] "RemoveContainer" containerID="057a98186cbacea3cc2f8f0601880e4928d19cff1f37d068b55dcc8aac94ae6b" Apr 16 19:14:26.724980 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:14:26.724964 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"057a98186cbacea3cc2f8f0601880e4928d19cff1f37d068b55dcc8aac94ae6b\": container with ID starting with 057a98186cbacea3cc2f8f0601880e4928d19cff1f37d068b55dcc8aac94ae6b not found: ID does not exist" 
containerID="057a98186cbacea3cc2f8f0601880e4928d19cff1f37d068b55dcc8aac94ae6b" Apr 16 19:14:26.725043 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:26.724986 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"057a98186cbacea3cc2f8f0601880e4928d19cff1f37d068b55dcc8aac94ae6b"} err="failed to get container status \"057a98186cbacea3cc2f8f0601880e4928d19cff1f37d068b55dcc8aac94ae6b\": rpc error: code = NotFound desc = could not find container \"057a98186cbacea3cc2f8f0601880e4928d19cff1f37d068b55dcc8aac94ae6b\": container with ID starting with 057a98186cbacea3cc2f8f0601880e4928d19cff1f37d068b55dcc8aac94ae6b not found: ID does not exist" Apr 16 19:14:26.814914 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:26.814889 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/85724ee6-d268-44c8-8ef2-aa2c41e5e59c-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:14:26.995764 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:26.995738 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz"] Apr 16 19:14:27.001218 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:27.001194 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-p67pz"] Apr 16 19:14:27.678663 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:27.678626 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" event={"ID":"a7112717-d106-46bc-9fb6-9ecf432133d9","Type":"ContainerStarted","Data":"2d344a7c4d17fac8046eb8f36d258584d02158a61d13e46dfb2fcea10d816dc9"} Apr 16 19:14:27.679108 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:27.678866 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" Apr 16 19:14:27.697131 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:27.697089 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" podStartSLOduration=5.6970786570000005 podStartE2EDuration="5.697078657s" podCreationTimestamp="2026-04-16 19:14:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:14:27.695230226 +0000 UTC m=+2644.482566483" watchObservedRunningTime="2026-04-16 19:14:27.697078657 +0000 UTC m=+2644.484414911" Apr 16 19:14:27.842226 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:27.842195 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85724ee6-d268-44c8-8ef2-aa2c41e5e59c" path="/var/lib/kubelet/pods/85724ee6-d268-44c8-8ef2-aa2c41e5e59c/volumes" Apr 16 19:14:58.777251 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:14:58.777198 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" podUID="a7112717-d106-46bc-9fb6-9ecf432133d9" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 19:15:08.686236 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:08.686202 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" Apr 16 19:15:12.597041 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:12.596809 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps"] Apr 16 19:15:12.597587 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:12.597353 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" 
podUID="a7112717-d106-46bc-9fb6-9ecf432133d9" containerName="kserve-container" containerID="cri-o://2d344a7c4d17fac8046eb8f36d258584d02158a61d13e46dfb2fcea10d816dc9" gracePeriod=30 Apr 16 19:15:12.663624 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:12.663584 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl"] Apr 16 19:15:12.663948 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:12.663935 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85724ee6-d268-44c8-8ef2-aa2c41e5e59c" containerName="kserve-container" Apr 16 19:15:12.663996 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:12.663950 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="85724ee6-d268-44c8-8ef2-aa2c41e5e59c" containerName="kserve-container" Apr 16 19:15:12.663996 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:12.663963 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85724ee6-d268-44c8-8ef2-aa2c41e5e59c" containerName="storage-initializer" Apr 16 19:15:12.663996 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:12.663969 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="85724ee6-d268-44c8-8ef2-aa2c41e5e59c" containerName="storage-initializer" Apr 16 19:15:12.664119 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:12.664027 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="85724ee6-d268-44c8-8ef2-aa2c41e5e59c" containerName="kserve-container" Apr 16 19:15:12.667359 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:12.667337 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" Apr 16 19:15:12.674650 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:12.674528 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl"] Apr 16 19:15:12.696303 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:12.696271 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b43c19d4-f0ce-43df-9623-753d470b1e51-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-56797944b-rlxdl\" (UID: \"b43c19d4-f0ce-43df-9623-753d470b1e51\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" Apr 16 19:15:12.797794 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:12.797753 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b43c19d4-f0ce-43df-9623-753d470b1e51-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-56797944b-rlxdl\" (UID: \"b43c19d4-f0ce-43df-9623-753d470b1e51\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" Apr 16 19:15:12.798114 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:12.798097 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b43c19d4-f0ce-43df-9623-753d470b1e51-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-56797944b-rlxdl\" (UID: \"b43c19d4-f0ce-43df-9623-753d470b1e51\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" Apr 16 19:15:12.979083 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:12.979056 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" Apr 16 19:15:13.307294 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:13.307269 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl"] Apr 16 19:15:13.309507 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:15:13.309472 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb43c19d4_f0ce_43df_9623_753d470b1e51.slice/crio-9c87e19ec13cbfc9fe1f0519cf696e6ec08dd7c8d8b5d6d06071226b3ac38c0b WatchSource:0}: Error finding container 9c87e19ec13cbfc9fe1f0519cf696e6ec08dd7c8d8b5d6d06071226b3ac38c0b: Status 404 returned error can't find the container with id 9c87e19ec13cbfc9fe1f0519cf696e6ec08dd7c8d8b5d6d06071226b3ac38c0b Apr 16 19:15:13.835858 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:13.835821 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" event={"ID":"b43c19d4-f0ce-43df-9623-753d470b1e51","Type":"ContainerStarted","Data":"93609498551f5e0cf8ba1d27a9c6631d5be2b0ac3345ec8eaa4faf9554ee898d"} Apr 16 19:15:13.835858 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:13.835860 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" event={"ID":"b43c19d4-f0ce-43df-9623-753d470b1e51","Type":"ContainerStarted","Data":"9c87e19ec13cbfc9fe1f0519cf696e6ec08dd7c8d8b5d6d06071226b3ac38c0b"} Apr 16 19:15:18.683639 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:18.683596 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" podUID="a7112717-d106-46bc-9fb6-9ecf432133d9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.51:8080/v2/models/sklearn-v2-mlserver/ready\": dial tcp 10.134.0.51:8080: 
connect: connection refused" Apr 16 19:15:19.834725 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:19.834702 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" Apr 16 19:15:19.857264 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:19.857231 2578 generic.go:358] "Generic (PLEG): container finished" podID="a7112717-d106-46bc-9fb6-9ecf432133d9" containerID="2d344a7c4d17fac8046eb8f36d258584d02158a61d13e46dfb2fcea10d816dc9" exitCode=0 Apr 16 19:15:19.857425 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:19.857293 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" event={"ID":"a7112717-d106-46bc-9fb6-9ecf432133d9","Type":"ContainerDied","Data":"2d344a7c4d17fac8046eb8f36d258584d02158a61d13e46dfb2fcea10d816dc9"} Apr 16 19:15:19.857425 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:19.857314 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" Apr 16 19:15:19.857425 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:19.857333 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps" event={"ID":"a7112717-d106-46bc-9fb6-9ecf432133d9","Type":"ContainerDied","Data":"79775552d1484f4699f95d3f3075332c5b66d4e948dbaa155837d176b4ad73b2"} Apr 16 19:15:19.857425 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:19.857355 2578 scope.go:117] "RemoveContainer" containerID="2d344a7c4d17fac8046eb8f36d258584d02158a61d13e46dfb2fcea10d816dc9" Apr 16 19:15:19.859152 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:19.859133 2578 generic.go:358] "Generic (PLEG): container finished" podID="b43c19d4-f0ce-43df-9623-753d470b1e51" containerID="93609498551f5e0cf8ba1d27a9c6631d5be2b0ac3345ec8eaa4faf9554ee898d" exitCode=0 Apr 16 19:15:19.859277 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:19.859206 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" event={"ID":"b43c19d4-f0ce-43df-9623-753d470b1e51","Type":"ContainerDied","Data":"93609498551f5e0cf8ba1d27a9c6631d5be2b0ac3345ec8eaa4faf9554ee898d"} Apr 16 19:15:19.866275 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:19.866255 2578 scope.go:117] "RemoveContainer" containerID="43ff7db592da06002cb4c119d02e4e94172c80856d670c1b7c9367efb276c629" Apr 16 19:15:19.873725 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:19.873706 2578 scope.go:117] "RemoveContainer" containerID="2d344a7c4d17fac8046eb8f36d258584d02158a61d13e46dfb2fcea10d816dc9" Apr 16 19:15:19.873994 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:15:19.873966 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d344a7c4d17fac8046eb8f36d258584d02158a61d13e46dfb2fcea10d816dc9\": container with ID starting with 
2d344a7c4d17fac8046eb8f36d258584d02158a61d13e46dfb2fcea10d816dc9 not found: ID does not exist" containerID="2d344a7c4d17fac8046eb8f36d258584d02158a61d13e46dfb2fcea10d816dc9" Apr 16 19:15:19.874081 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:19.874005 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d344a7c4d17fac8046eb8f36d258584d02158a61d13e46dfb2fcea10d816dc9"} err="failed to get container status \"2d344a7c4d17fac8046eb8f36d258584d02158a61d13e46dfb2fcea10d816dc9\": rpc error: code = NotFound desc = could not find container \"2d344a7c4d17fac8046eb8f36d258584d02158a61d13e46dfb2fcea10d816dc9\": container with ID starting with 2d344a7c4d17fac8046eb8f36d258584d02158a61d13e46dfb2fcea10d816dc9 not found: ID does not exist" Apr 16 19:15:19.874081 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:19.874030 2578 scope.go:117] "RemoveContainer" containerID="43ff7db592da06002cb4c119d02e4e94172c80856d670c1b7c9367efb276c629" Apr 16 19:15:19.874318 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:15:19.874301 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43ff7db592da06002cb4c119d02e4e94172c80856d670c1b7c9367efb276c629\": container with ID starting with 43ff7db592da06002cb4c119d02e4e94172c80856d670c1b7c9367efb276c629 not found: ID does not exist" containerID="43ff7db592da06002cb4c119d02e4e94172c80856d670c1b7c9367efb276c629" Apr 16 19:15:19.874389 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:19.874325 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ff7db592da06002cb4c119d02e4e94172c80856d670c1b7c9367efb276c629"} err="failed to get container status \"43ff7db592da06002cb4c119d02e4e94172c80856d670c1b7c9367efb276c629\": rpc error: code = NotFound desc = could not find container \"43ff7db592da06002cb4c119d02e4e94172c80856d670c1b7c9367efb276c629\": container with ID starting with 
43ff7db592da06002cb4c119d02e4e94172c80856d670c1b7c9367efb276c629 not found: ID does not exist" Apr 16 19:15:19.955686 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:19.955629 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7112717-d106-46bc-9fb6-9ecf432133d9-kserve-provision-location\") pod \"a7112717-d106-46bc-9fb6-9ecf432133d9\" (UID: \"a7112717-d106-46bc-9fb6-9ecf432133d9\") " Apr 16 19:15:19.955932 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:19.955913 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7112717-d106-46bc-9fb6-9ecf432133d9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a7112717-d106-46bc-9fb6-9ecf432133d9" (UID: "a7112717-d106-46bc-9fb6-9ecf432133d9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:15:20.056705 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:20.056679 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7112717-d106-46bc-9fb6-9ecf432133d9-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:15:20.179394 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:20.179367 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps"] Apr 16 19:15:20.184694 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:20.184668 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-wsqps"] Apr 16 19:15:20.865339 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:20.865309 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" 
event={"ID":"b43c19d4-f0ce-43df-9623-753d470b1e51","Type":"ContainerStarted","Data":"406304620babc33be8168c3d9fad6234e8f42a0a2112ac03fc5f63ba12ce7b2f"} Apr 16 19:15:20.865748 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:20.865577 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" Apr 16 19:15:20.867033 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:20.866998 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" podUID="b43c19d4-f0ce-43df-9623-753d470b1e51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 16 19:15:20.882136 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:20.882098 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" podStartSLOduration=8.882085228 podStartE2EDuration="8.882085228s" podCreationTimestamp="2026-04-16 19:15:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:15:20.879790862 +0000 UTC m=+2697.667127118" watchObservedRunningTime="2026-04-16 19:15:20.882085228 +0000 UTC m=+2697.669421531" Apr 16 19:15:21.842140 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:21.842101 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7112717-d106-46bc-9fb6-9ecf432133d9" path="/var/lib/kubelet/pods/a7112717-d106-46bc-9fb6-9ecf432133d9/volumes" Apr 16 19:15:21.868333 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:21.868301 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" podUID="b43c19d4-f0ce-43df-9623-753d470b1e51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: 
connection refused" Apr 16 19:15:31.869334 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:31.869283 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" podUID="b43c19d4-f0ce-43df-9623-753d470b1e51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 16 19:15:41.869349 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:41.869319 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" Apr 16 19:15:47.573020 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:47.572991 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 19:15:47.578106 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:47.578085 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 19:15:49.759456 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:49.759429 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-56797944b-rlxdl_b43c19d4-f0ce-43df-9623-753d470b1e51/kserve-container/0.log" Apr 16 19:15:49.879386 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:49.879346 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl"] Apr 16 19:15:49.880021 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:49.879986 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" podUID="b43c19d4-f0ce-43df-9623-753d470b1e51" containerName="kserve-container" containerID="cri-o://406304620babc33be8168c3d9fad6234e8f42a0a2112ac03fc5f63ba12ce7b2f" 
gracePeriod=30 Apr 16 19:15:49.944642 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:49.944605 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg"] Apr 16 19:15:49.944952 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:49.944940 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7112717-d106-46bc-9fb6-9ecf432133d9" containerName="storage-initializer" Apr 16 19:15:49.945022 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:49.944953 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7112717-d106-46bc-9fb6-9ecf432133d9" containerName="storage-initializer" Apr 16 19:15:49.945022 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:49.944966 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7112717-d106-46bc-9fb6-9ecf432133d9" containerName="kserve-container" Apr 16 19:15:49.945022 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:49.944972 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7112717-d106-46bc-9fb6-9ecf432133d9" containerName="kserve-container" Apr 16 19:15:49.945115 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:49.945041 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7112717-d106-46bc-9fb6-9ecf432133d9" containerName="kserve-container" Apr 16 19:15:49.947414 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:49.947392 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg" Apr 16 19:15:49.955356 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:49.955331 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg"] Apr 16 19:15:50.113657 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.113567 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53000491-dc51-4f99-9737-43d6a93dc529-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg\" (UID: \"53000491-dc51-4f99-9737-43d6a93dc529\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg" Apr 16 19:15:50.214265 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.214217 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53000491-dc51-4f99-9737-43d6a93dc529-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg\" (UID: \"53000491-dc51-4f99-9737-43d6a93dc529\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg" Apr 16 19:15:50.214607 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.214585 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53000491-dc51-4f99-9737-43d6a93dc529-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg\" (UID: \"53000491-dc51-4f99-9737-43d6a93dc529\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg" Apr 16 19:15:50.258122 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.258090 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg" Apr 16 19:15:50.392822 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.392745 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg"] Apr 16 19:15:50.396333 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:15:50.396263 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53000491_dc51_4f99_9737_43d6a93dc529.slice/crio-10ca8cfa07988b40c95cd30de9d0d34639827c078934682c508c1be97bccfadf WatchSource:0}: Error finding container 10ca8cfa07988b40c95cd30de9d0d34639827c078934682c508c1be97bccfadf: Status 404 returned error can't find the container with id 10ca8cfa07988b40c95cd30de9d0d34639827c078934682c508c1be97bccfadf Apr 16 19:15:50.706905 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.706879 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" Apr 16 19:15:50.819828 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.819796 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b43c19d4-f0ce-43df-9623-753d470b1e51-kserve-provision-location\") pod \"b43c19d4-f0ce-43df-9623-753d470b1e51\" (UID: \"b43c19d4-f0ce-43df-9623-753d470b1e51\") " Apr 16 19:15:50.833802 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.833767 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b43c19d4-f0ce-43df-9623-753d470b1e51-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b43c19d4-f0ce-43df-9623-753d470b1e51" (UID: "b43c19d4-f0ce-43df-9623-753d470b1e51"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:15:50.920774 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.920694 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b43c19d4-f0ce-43df-9623-753d470b1e51-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:15:50.974150 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.974118 2578 generic.go:358] "Generic (PLEG): container finished" podID="b43c19d4-f0ce-43df-9623-753d470b1e51" containerID="406304620babc33be8168c3d9fad6234e8f42a0a2112ac03fc5f63ba12ce7b2f" exitCode=0 Apr 16 19:15:50.974344 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.974198 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" Apr 16 19:15:50.974344 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.974210 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" event={"ID":"b43c19d4-f0ce-43df-9623-753d470b1e51","Type":"ContainerDied","Data":"406304620babc33be8168c3d9fad6234e8f42a0a2112ac03fc5f63ba12ce7b2f"} Apr 16 19:15:50.974344 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.974260 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl" event={"ID":"b43c19d4-f0ce-43df-9623-753d470b1e51","Type":"ContainerDied","Data":"9c87e19ec13cbfc9fe1f0519cf696e6ec08dd7c8d8b5d6d06071226b3ac38c0b"} Apr 16 19:15:50.974344 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.974284 2578 scope.go:117] "RemoveContainer" containerID="406304620babc33be8168c3d9fad6234e8f42a0a2112ac03fc5f63ba12ce7b2f" Apr 16 19:15:50.975803 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.975761 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg" event={"ID":"53000491-dc51-4f99-9737-43d6a93dc529","Type":"ContainerStarted","Data":"26a3de6c83d36160638e81fa4f9a67f71f65e06254e9d11210b55b018e5a61c2"} Apr 16 19:15:50.975803 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.975797 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg" event={"ID":"53000491-dc51-4f99-9737-43d6a93dc529","Type":"ContainerStarted","Data":"10ca8cfa07988b40c95cd30de9d0d34639827c078934682c508c1be97bccfadf"} Apr 16 19:15:50.983221 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.983165 2578 scope.go:117] "RemoveContainer" containerID="93609498551f5e0cf8ba1d27a9c6631d5be2b0ac3345ec8eaa4faf9554ee898d" Apr 16 19:15:50.990680 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.990661 2578 scope.go:117] "RemoveContainer" containerID="406304620babc33be8168c3d9fad6234e8f42a0a2112ac03fc5f63ba12ce7b2f" Apr 16 19:15:50.990984 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:15:50.990965 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"406304620babc33be8168c3d9fad6234e8f42a0a2112ac03fc5f63ba12ce7b2f\": container with ID starting with 406304620babc33be8168c3d9fad6234e8f42a0a2112ac03fc5f63ba12ce7b2f not found: ID does not exist" containerID="406304620babc33be8168c3d9fad6234e8f42a0a2112ac03fc5f63ba12ce7b2f" Apr 16 19:15:50.991039 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.990994 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"406304620babc33be8168c3d9fad6234e8f42a0a2112ac03fc5f63ba12ce7b2f"} err="failed to get container status \"406304620babc33be8168c3d9fad6234e8f42a0a2112ac03fc5f63ba12ce7b2f\": rpc error: code = NotFound desc = could not find container \"406304620babc33be8168c3d9fad6234e8f42a0a2112ac03fc5f63ba12ce7b2f\": container with ID starting with 
406304620babc33be8168c3d9fad6234e8f42a0a2112ac03fc5f63ba12ce7b2f not found: ID does not exist" Apr 16 19:15:50.991039 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.991014 2578 scope.go:117] "RemoveContainer" containerID="93609498551f5e0cf8ba1d27a9c6631d5be2b0ac3345ec8eaa4faf9554ee898d" Apr 16 19:15:50.991280 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:15:50.991259 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93609498551f5e0cf8ba1d27a9c6631d5be2b0ac3345ec8eaa4faf9554ee898d\": container with ID starting with 93609498551f5e0cf8ba1d27a9c6631d5be2b0ac3345ec8eaa4faf9554ee898d not found: ID does not exist" containerID="93609498551f5e0cf8ba1d27a9c6631d5be2b0ac3345ec8eaa4faf9554ee898d" Apr 16 19:15:50.991335 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:50.991287 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93609498551f5e0cf8ba1d27a9c6631d5be2b0ac3345ec8eaa4faf9554ee898d"} err="failed to get container status \"93609498551f5e0cf8ba1d27a9c6631d5be2b0ac3345ec8eaa4faf9554ee898d\": rpc error: code = NotFound desc = could not find container \"93609498551f5e0cf8ba1d27a9c6631d5be2b0ac3345ec8eaa4faf9554ee898d\": container with ID starting with 93609498551f5e0cf8ba1d27a9c6631d5be2b0ac3345ec8eaa4faf9554ee898d not found: ID does not exist" Apr 16 19:15:51.007352 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:51.007324 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl"] Apr 16 19:15:51.010804 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:51.010781 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-rlxdl"] Apr 16 19:15:51.842480 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:51.842450 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b43c19d4-f0ce-43df-9623-753d470b1e51" 
path="/var/lib/kubelet/pods/b43c19d4-f0ce-43df-9623-753d470b1e51/volumes" Apr 16 19:15:54.990407 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:54.990375 2578 generic.go:358] "Generic (PLEG): container finished" podID="53000491-dc51-4f99-9737-43d6a93dc529" containerID="26a3de6c83d36160638e81fa4f9a67f71f65e06254e9d11210b55b018e5a61c2" exitCode=0 Apr 16 19:15:54.990807 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:54.990461 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg" event={"ID":"53000491-dc51-4f99-9737-43d6a93dc529","Type":"ContainerDied","Data":"26a3de6c83d36160638e81fa4f9a67f71f65e06254e9d11210b55b018e5a61c2"} Apr 16 19:15:55.995093 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:55.995060 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg" event={"ID":"53000491-dc51-4f99-9737-43d6a93dc529","Type":"ContainerStarted","Data":"b313cf4f143897f556af4e0f58002f2207c433e09b5790828df1868da59557ab"} Apr 16 19:15:55.995471 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:55.995328 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg" Apr 16 19:15:56.011780 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:15:56.011732 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg" podStartSLOduration=7.0117190130000004 podStartE2EDuration="7.011719013s" podCreationTimestamp="2026-04-16 19:15:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:15:56.010048962 +0000 UTC m=+2732.797385219" watchObservedRunningTime="2026-04-16 19:15:56.011719013 +0000 UTC m=+2732.799055269" Apr 16 19:16:27.077599 ip-10-0-139-33 kubenswrapper[2578]: I0416 
19:16:27.077562 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg" podUID="53000491-dc51-4f99-9737-43d6a93dc529" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 19:16:37.000202 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:37.000155 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg" Apr 16 19:16:40.047660 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:40.047629 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg"] Apr 16 19:16:40.048048 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:40.047849 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg" podUID="53000491-dc51-4f99-9737-43d6a93dc529" containerName="kserve-container" containerID="cri-o://b313cf4f143897f556af4e0f58002f2207c433e09b5790828df1868da59557ab" gracePeriod=30 Apr 16 19:16:40.117692 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:40.117661 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv"] Apr 16 19:16:40.118109 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:40.118092 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b43c19d4-f0ce-43df-9623-753d470b1e51" containerName="storage-initializer" Apr 16 19:16:40.118225 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:40.118113 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43c19d4-f0ce-43df-9623-753d470b1e51" containerName="storage-initializer" Apr 16 19:16:40.118225 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:40.118164 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b43c19d4-f0ce-43df-9623-753d470b1e51" containerName="kserve-container"
Apr 16 19:16:40.118225 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:40.118188 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43c19d4-f0ce-43df-9623-753d470b1e51" containerName="kserve-container"
Apr 16 19:16:40.118390 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:40.118347 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b43c19d4-f0ce-43df-9623-753d470b1e51" containerName="kserve-container"
Apr 16 19:16:40.120543 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:40.120524 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv"
Apr 16 19:16:40.127980 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:40.127962 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv"]
Apr 16 19:16:40.196010 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:40.195976 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2c1c3bb-02ee-4569-93c3-a67943843d79-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv\" (UID: \"c2c1c3bb-02ee-4569-93c3-a67943843d79\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv"
Apr 16 19:16:40.296824 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:40.296785 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2c1c3bb-02ee-4569-93c3-a67943843d79-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv\" (UID: \"c2c1c3bb-02ee-4569-93c3-a67943843d79\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv"
Apr 16 19:16:40.297162 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:40.297143 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2c1c3bb-02ee-4569-93c3-a67943843d79-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv\" (UID: \"c2c1c3bb-02ee-4569-93c3-a67943843d79\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv"
Apr 16 19:16:40.431381 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:40.431260 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv"
Apr 16 19:16:40.551363 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:40.551334 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv"]
Apr 16 19:16:40.553620 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:16:40.553588 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2c1c3bb_02ee_4569_93c3_a67943843d79.slice/crio-ce8c19eb9c6b2200f2096bbad1dc0fe63613d965ff3f892dbfa3493ad3354beb WatchSource:0}: Error finding container ce8c19eb9c6b2200f2096bbad1dc0fe63613d965ff3f892dbfa3493ad3354beb: Status 404 returned error can't find the container with id ce8c19eb9c6b2200f2096bbad1dc0fe63613d965ff3f892dbfa3493ad3354beb
Apr 16 19:16:41.145872 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:41.145827 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv" event={"ID":"c2c1c3bb-02ee-4569-93c3-a67943843d79","Type":"ContainerStarted","Data":"82a114885639bc0dfdb6c4de8d7120e0cfbc15cc262a3673d766a9b5199914b2"}
Apr 16 19:16:41.146266 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:41.145876 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv" event={"ID":"c2c1c3bb-02ee-4569-93c3-a67943843d79","Type":"ContainerStarted","Data":"ce8c19eb9c6b2200f2096bbad1dc0fe63613d965ff3f892dbfa3493ad3354beb"}
Apr 16 19:16:45.160695 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:45.160664 2578 generic.go:358] "Generic (PLEG): container finished" podID="c2c1c3bb-02ee-4569-93c3-a67943843d79" containerID="82a114885639bc0dfdb6c4de8d7120e0cfbc15cc262a3673d766a9b5199914b2" exitCode=0
Apr 16 19:16:45.161219 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:45.160744 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv" event={"ID":"c2c1c3bb-02ee-4569-93c3-a67943843d79","Type":"ContainerDied","Data":"82a114885639bc0dfdb6c4de8d7120e0cfbc15cc262a3673d766a9b5199914b2"}
Apr 16 19:16:46.165382 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:46.165350 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv" event={"ID":"c2c1c3bb-02ee-4569-93c3-a67943843d79","Type":"ContainerStarted","Data":"43e90a4ab187c0e48a25f363b10d52f6f92bdeb81c7bfad904f262cfa4a7f75b"}
Apr 16 19:16:46.165804 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:46.165649 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv"
Apr 16 19:16:46.166922 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:46.166895 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv" podUID="c2c1c3bb-02ee-4569-93c3-a67943843d79" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 16 19:16:46.183122 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:46.183067 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv" podStartSLOduration=6.183050282 podStartE2EDuration="6.183050282s" podCreationTimestamp="2026-04-16 19:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:16:46.180568168 +0000 UTC m=+2782.967904435" watchObservedRunningTime="2026-04-16 19:16:46.183050282 +0000 UTC m=+2782.970386539"
Apr 16 19:16:46.999197 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:46.999139 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg" podUID="53000491-dc51-4f99-9737-43d6a93dc529" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.53:8080/v2/models/isvc-sklearn-v2-runtime/ready\": dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 16 19:16:47.169461 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:47.169408 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv" podUID="c2c1c3bb-02ee-4569-93c3-a67943843d79" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 16 19:16:47.498953 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:47.498931 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg"
Apr 16 19:16:47.559926 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:47.559846 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53000491-dc51-4f99-9737-43d6a93dc529-kserve-provision-location\") pod \"53000491-dc51-4f99-9737-43d6a93dc529\" (UID: \"53000491-dc51-4f99-9737-43d6a93dc529\") "
Apr 16 19:16:47.560229 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:47.560204 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53000491-dc51-4f99-9737-43d6a93dc529-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "53000491-dc51-4f99-9737-43d6a93dc529" (UID: "53000491-dc51-4f99-9737-43d6a93dc529"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:16:47.660482 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:47.660443 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53000491-dc51-4f99-9737-43d6a93dc529-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 19:16:48.174221 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:48.174116 2578 generic.go:358] "Generic (PLEG): container finished" podID="53000491-dc51-4f99-9737-43d6a93dc529" containerID="b313cf4f143897f556af4e0f58002f2207c433e09b5790828df1868da59557ab" exitCode=0
Apr 16 19:16:48.174221 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:48.174206 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg" event={"ID":"53000491-dc51-4f99-9737-43d6a93dc529","Type":"ContainerDied","Data":"b313cf4f143897f556af4e0f58002f2207c433e09b5790828df1868da59557ab"}
Apr 16 19:16:48.174713 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:48.174235 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg"
Apr 16 19:16:48.174713 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:48.174248 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg" event={"ID":"53000491-dc51-4f99-9737-43d6a93dc529","Type":"ContainerDied","Data":"10ca8cfa07988b40c95cd30de9d0d34639827c078934682c508c1be97bccfadf"}
Apr 16 19:16:48.174713 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:48.174264 2578 scope.go:117] "RemoveContainer" containerID="b313cf4f143897f556af4e0f58002f2207c433e09b5790828df1868da59557ab"
Apr 16 19:16:48.182624 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:48.182603 2578 scope.go:117] "RemoveContainer" containerID="26a3de6c83d36160638e81fa4f9a67f71f65e06254e9d11210b55b018e5a61c2"
Apr 16 19:16:48.190330 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:48.190311 2578 scope.go:117] "RemoveContainer" containerID="b313cf4f143897f556af4e0f58002f2207c433e09b5790828df1868da59557ab"
Apr 16 19:16:48.190602 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:16:48.190578 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b313cf4f143897f556af4e0f58002f2207c433e09b5790828df1868da59557ab\": container with ID starting with b313cf4f143897f556af4e0f58002f2207c433e09b5790828df1868da59557ab not found: ID does not exist" containerID="b313cf4f143897f556af4e0f58002f2207c433e09b5790828df1868da59557ab"
Apr 16 19:16:48.190706 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:48.190607 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b313cf4f143897f556af4e0f58002f2207c433e09b5790828df1868da59557ab"} err="failed to get container status \"b313cf4f143897f556af4e0f58002f2207c433e09b5790828df1868da59557ab\": rpc error: code = NotFound desc = could not find container \"b313cf4f143897f556af4e0f58002f2207c433e09b5790828df1868da59557ab\": container with ID starting with b313cf4f143897f556af4e0f58002f2207c433e09b5790828df1868da59557ab not found: ID does not exist"
Apr 16 19:16:48.190706 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:48.190625 2578 scope.go:117] "RemoveContainer" containerID="26a3de6c83d36160638e81fa4f9a67f71f65e06254e9d11210b55b018e5a61c2"
Apr 16 19:16:48.190896 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:16:48.190875 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26a3de6c83d36160638e81fa4f9a67f71f65e06254e9d11210b55b018e5a61c2\": container with ID starting with 26a3de6c83d36160638e81fa4f9a67f71f65e06254e9d11210b55b018e5a61c2 not found: ID does not exist" containerID="26a3de6c83d36160638e81fa4f9a67f71f65e06254e9d11210b55b018e5a61c2"
Apr 16 19:16:48.190936 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:48.190906 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26a3de6c83d36160638e81fa4f9a67f71f65e06254e9d11210b55b018e5a61c2"} err="failed to get container status \"26a3de6c83d36160638e81fa4f9a67f71f65e06254e9d11210b55b018e5a61c2\": rpc error: code = NotFound desc = could not find container \"26a3de6c83d36160638e81fa4f9a67f71f65e06254e9d11210b55b018e5a61c2\": container with ID starting with 26a3de6c83d36160638e81fa4f9a67f71f65e06254e9d11210b55b018e5a61c2 not found: ID does not exist"
Apr 16 19:16:48.191053 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:48.191040 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg"]
Apr 16 19:16:48.193988 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:48.193966 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-d2qqg"]
Apr 16 19:16:49.841775 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:49.841741 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53000491-dc51-4f99-9737-43d6a93dc529" path="/var/lib/kubelet/pods/53000491-dc51-4f99-9737-43d6a93dc529/volumes"
Apr 16 19:16:57.170170 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:16:57.170131 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv" podUID="c2c1c3bb-02ee-4569-93c3-a67943843d79" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 16 19:17:07.170148 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:17:07.170101 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv" podUID="c2c1c3bb-02ee-4569-93c3-a67943843d79" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 16 19:17:17.170040 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:17:17.169998 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv" podUID="c2c1c3bb-02ee-4569-93c3-a67943843d79" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 16 19:17:27.170125 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:17:27.170080 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv" podUID="c2c1c3bb-02ee-4569-93c3-a67943843d79" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 16 19:17:37.170344 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:17:37.170297 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv" podUID="c2c1c3bb-02ee-4569-93c3-a67943843d79" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 16 19:17:47.169990 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:17:47.169941 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv" podUID="c2c1c3bb-02ee-4569-93c3-a67943843d79" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 16 19:17:57.170910 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:17:57.170879 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv"
Apr 16 19:18:00.340111 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:00.340068 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv"]
Apr 16 19:18:00.340621 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:00.340398 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv" podUID="c2c1c3bb-02ee-4569-93c3-a67943843d79" containerName="kserve-container" containerID="cri-o://43e90a4ab187c0e48a25f363b10d52f6f92bdeb81c7bfad904f262cfa4a7f75b" gracePeriod=30
Apr 16 19:18:00.396833 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:00.396796 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl"]
Apr 16 19:18:00.397369 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:00.397347 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53000491-dc51-4f99-9737-43d6a93dc529" containerName="kserve-container"
Apr 16 19:18:00.397446 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:00.397374 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="53000491-dc51-4f99-9737-43d6a93dc529" containerName="kserve-container"
Apr 16 19:18:00.397446 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:00.397388 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53000491-dc51-4f99-9737-43d6a93dc529" containerName="storage-initializer"
Apr 16 19:18:00.397446 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:00.397396 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="53000491-dc51-4f99-9737-43d6a93dc529" containerName="storage-initializer"
Apr 16 19:18:00.397577 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:00.397518 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="53000491-dc51-4f99-9737-43d6a93dc529" containerName="kserve-container"
Apr 16 19:18:00.399738 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:00.399723 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl"
Apr 16 19:18:00.406897 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:00.406871 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl"]
Apr 16 19:18:00.568061 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:00.568020 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d7bd594-57e7-4363-879f-678e0838bd37-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl\" (UID: \"7d7bd594-57e7-4363-879f-678e0838bd37\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl"
Apr 16 19:18:00.669119 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:00.669027 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d7bd594-57e7-4363-879f-678e0838bd37-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl\" (UID: \"7d7bd594-57e7-4363-879f-678e0838bd37\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl"
Apr 16 19:18:00.669475 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:00.669453 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d7bd594-57e7-4363-879f-678e0838bd37-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl\" (UID: \"7d7bd594-57e7-4363-879f-678e0838bd37\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl"
Apr 16 19:18:00.711124 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:00.711077 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl"
Apr 16 19:18:00.828127 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:00.828092 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl"]
Apr 16 19:18:00.831169 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:18:00.831136 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d7bd594_57e7_4363_879f_678e0838bd37.slice/crio-d15e530ff0d9b4ba1117a0034ec5e77ab7d903ef35c1fa421adce6fe125b45d5 WatchSource:0}: Error finding container d15e530ff0d9b4ba1117a0034ec5e77ab7d903ef35c1fa421adce6fe125b45d5: Status 404 returned error can't find the container with id d15e530ff0d9b4ba1117a0034ec5e77ab7d903ef35c1fa421adce6fe125b45d5
Apr 16 19:18:01.419899 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:01.419853 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl" event={"ID":"7d7bd594-57e7-4363-879f-678e0838bd37","Type":"ContainerStarted","Data":"31d8452f5585fc2629ea7f1188ae519a78974f206d343adc5970193bc4c8ffbb"}
Apr 16 19:18:01.419899 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:01.419893 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl" event={"ID":"7d7bd594-57e7-4363-879f-678e0838bd37","Type":"ContainerStarted","Data":"d15e530ff0d9b4ba1117a0034ec5e77ab7d903ef35c1fa421adce6fe125b45d5"}
Apr 16 19:18:04.681118 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:04.681094 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv"
Apr 16 19:18:04.702082 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:04.702047 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2c1c3bb-02ee-4569-93c3-a67943843d79-kserve-provision-location\") pod \"c2c1c3bb-02ee-4569-93c3-a67943843d79\" (UID: \"c2c1c3bb-02ee-4569-93c3-a67943843d79\") "
Apr 16 19:18:04.702402 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:04.702379 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2c1c3bb-02ee-4569-93c3-a67943843d79-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c2c1c3bb-02ee-4569-93c3-a67943843d79" (UID: "c2c1c3bb-02ee-4569-93c3-a67943843d79"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:18:04.802715 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:04.802683 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2c1c3bb-02ee-4569-93c3-a67943843d79-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 19:18:05.433208 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:05.433156 2578 generic.go:358] "Generic (PLEG): container finished" podID="7d7bd594-57e7-4363-879f-678e0838bd37" containerID="31d8452f5585fc2629ea7f1188ae519a78974f206d343adc5970193bc4c8ffbb" exitCode=0
Apr 16 19:18:05.433390 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:05.433232 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl" event={"ID":"7d7bd594-57e7-4363-879f-678e0838bd37","Type":"ContainerDied","Data":"31d8452f5585fc2629ea7f1188ae519a78974f206d343adc5970193bc4c8ffbb"}
Apr 16 19:18:05.434374 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:05.434353 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:18:05.434712 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:05.434690 2578 generic.go:358] "Generic (PLEG): container finished" podID="c2c1c3bb-02ee-4569-93c3-a67943843d79" containerID="43e90a4ab187c0e48a25f363b10d52f6f92bdeb81c7bfad904f262cfa4a7f75b" exitCode=0
Apr 16 19:18:05.434790 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:05.434736 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv" event={"ID":"c2c1c3bb-02ee-4569-93c3-a67943843d79","Type":"ContainerDied","Data":"43e90a4ab187c0e48a25f363b10d52f6f92bdeb81c7bfad904f262cfa4a7f75b"}
Apr 16 19:18:05.434790 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:05.434751 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv"
Apr 16 19:18:05.434790 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:05.434761 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv" event={"ID":"c2c1c3bb-02ee-4569-93c3-a67943843d79","Type":"ContainerDied","Data":"ce8c19eb9c6b2200f2096bbad1dc0fe63613d965ff3f892dbfa3493ad3354beb"}
Apr 16 19:18:05.434790 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:05.434779 2578 scope.go:117] "RemoveContainer" containerID="43e90a4ab187c0e48a25f363b10d52f6f92bdeb81c7bfad904f262cfa4a7f75b"
Apr 16 19:18:05.446214 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:05.446195 2578 scope.go:117] "RemoveContainer" containerID="82a114885639bc0dfdb6c4de8d7120e0cfbc15cc262a3673d766a9b5199914b2"
Apr 16 19:18:05.461046 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:05.460990 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv"]
Apr 16 19:18:05.461111 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:05.461084 2578 scope.go:117] "RemoveContainer" containerID="43e90a4ab187c0e48a25f363b10d52f6f92bdeb81c7bfad904f262cfa4a7f75b"
Apr 16 19:18:05.461408 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:18:05.461383 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43e90a4ab187c0e48a25f363b10d52f6f92bdeb81c7bfad904f262cfa4a7f75b\": container with ID starting with 43e90a4ab187c0e48a25f363b10d52f6f92bdeb81c7bfad904f262cfa4a7f75b not found: ID does not exist" containerID="43e90a4ab187c0e48a25f363b10d52f6f92bdeb81c7bfad904f262cfa4a7f75b"
Apr 16 19:18:05.461553 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:05.461523 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43e90a4ab187c0e48a25f363b10d52f6f92bdeb81c7bfad904f262cfa4a7f75b"} err="failed to get container status \"43e90a4ab187c0e48a25f363b10d52f6f92bdeb81c7bfad904f262cfa4a7f75b\": rpc error: code = NotFound desc = could not find container \"43e90a4ab187c0e48a25f363b10d52f6f92bdeb81c7bfad904f262cfa4a7f75b\": container with ID starting with 43e90a4ab187c0e48a25f363b10d52f6f92bdeb81c7bfad904f262cfa4a7f75b not found: ID does not exist"
Apr 16 19:18:05.461634 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:05.461559 2578 scope.go:117] "RemoveContainer" containerID="82a114885639bc0dfdb6c4de8d7120e0cfbc15cc262a3673d766a9b5199914b2"
Apr 16 19:18:05.461951 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:18:05.461906 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82a114885639bc0dfdb6c4de8d7120e0cfbc15cc262a3673d766a9b5199914b2\": container with ID starting with 82a114885639bc0dfdb6c4de8d7120e0cfbc15cc262a3673d766a9b5199914b2 not found: ID does not exist" containerID="82a114885639bc0dfdb6c4de8d7120e0cfbc15cc262a3673d766a9b5199914b2"
Apr 16 19:18:05.461951 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:05.461942 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82a114885639bc0dfdb6c4de8d7120e0cfbc15cc262a3673d766a9b5199914b2"} err="failed to get container status \"82a114885639bc0dfdb6c4de8d7120e0cfbc15cc262a3673d766a9b5199914b2\": rpc error: code = NotFound desc = could not find container \"82a114885639bc0dfdb6c4de8d7120e0cfbc15cc262a3673d766a9b5199914b2\": container with ID starting with 82a114885639bc0dfdb6c4de8d7120e0cfbc15cc262a3673d766a9b5199914b2 not found: ID does not exist"
Apr 16 19:18:05.463997 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:05.463977 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-h8ccv"]
Apr 16 19:18:05.842394 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:05.842351 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2c1c3bb-02ee-4569-93c3-a67943843d79" path="/var/lib/kubelet/pods/c2c1c3bb-02ee-4569-93c3-a67943843d79/volumes"
Apr 16 19:18:06.440571 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:06.440538 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl" event={"ID":"7d7bd594-57e7-4363-879f-678e0838bd37","Type":"ContainerStarted","Data":"501853ddfc8ddd4676d5b98be30d2e9f9f8cf103acbf91dd1f4180ce37ffeb90"}
Apr 16 19:18:06.440831 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:06.440813 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl"
Apr 16 19:18:06.442172 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:06.442146 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl" podUID="7d7bd594-57e7-4363-879f-678e0838bd37" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 19:18:06.456136 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:06.456083 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl" podStartSLOduration=6.456065794 podStartE2EDuration="6.456065794s" podCreationTimestamp="2026-04-16 19:18:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:18:06.455044583 +0000 UTC m=+2863.242380839" watchObservedRunningTime="2026-04-16 19:18:06.456065794 +0000 UTC m=+2863.243402049"
Apr 16 19:18:07.443971 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:07.443924 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl" podUID="7d7bd594-57e7-4363-879f-678e0838bd37" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 19:18:17.444435 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:17.444338 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl" podUID="7d7bd594-57e7-4363-879f-678e0838bd37" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 19:18:27.444703 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:27.444655 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl" podUID="7d7bd594-57e7-4363-879f-678e0838bd37" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 19:18:37.444067 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:37.444019 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl" podUID="7d7bd594-57e7-4363-879f-678e0838bd37" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 19:18:47.444696 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:47.444657 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl" podUID="7d7bd594-57e7-4363-879f-678e0838bd37" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 19:18:57.444296 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:18:57.444253 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl" podUID="7d7bd594-57e7-4363-879f-678e0838bd37" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 19:19:07.444768 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:07.444727 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl" podUID="7d7bd594-57e7-4363-879f-678e0838bd37" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 19:19:12.837869 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:12.837840 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl"
Apr 16 19:19:20.542604 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:20.542565 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl"]
Apr 16 19:19:20.543221 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:20.543148 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl" podUID="7d7bd594-57e7-4363-879f-678e0838bd37" containerName="kserve-container" containerID="cri-o://501853ddfc8ddd4676d5b98be30d2e9f9f8cf103acbf91dd1f4180ce37ffeb90" gracePeriod=30
Apr 16 19:19:22.837593 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:22.837538 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl" podUID="7d7bd594-57e7-4363-879f-678e0838bd37" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 16 19:19:24.982624 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:24.982599 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl"
Apr 16 19:19:25.056273 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:25.056171 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d7bd594-57e7-4363-879f-678e0838bd37-kserve-provision-location\") pod \"7d7bd594-57e7-4363-879f-678e0838bd37\" (UID: \"7d7bd594-57e7-4363-879f-678e0838bd37\") "
Apr 16 19:19:25.056485 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:25.056463 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d7bd594-57e7-4363-879f-678e0838bd37-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7d7bd594-57e7-4363-879f-678e0838bd37" (UID: "7d7bd594-57e7-4363-879f-678e0838bd37"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:19:25.157384 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:25.157357 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d7bd594-57e7-4363-879f-678e0838bd37-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\""
Apr 16 19:19:25.700794 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:25.700755 2578 generic.go:358] "Generic (PLEG): container finished" podID="7d7bd594-57e7-4363-879f-678e0838bd37" containerID="501853ddfc8ddd4676d5b98be30d2e9f9f8cf103acbf91dd1f4180ce37ffeb90" exitCode=0
Apr 16 19:19:25.700972 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:25.700834 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl"
Apr 16 19:19:25.700972 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:25.700835 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl" event={"ID":"7d7bd594-57e7-4363-879f-678e0838bd37","Type":"ContainerDied","Data":"501853ddfc8ddd4676d5b98be30d2e9f9f8cf103acbf91dd1f4180ce37ffeb90"}
Apr 16 19:19:25.700972 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:25.700874 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl" event={"ID":"7d7bd594-57e7-4363-879f-678e0838bd37","Type":"ContainerDied","Data":"d15e530ff0d9b4ba1117a0034ec5e77ab7d903ef35c1fa421adce6fe125b45d5"}
Apr 16 19:19:25.700972 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:25.700888 2578 scope.go:117] "RemoveContainer" containerID="501853ddfc8ddd4676d5b98be30d2e9f9f8cf103acbf91dd1f4180ce37ffeb90"
Apr 16 19:19:25.709337 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:25.709319 2578 scope.go:117] "RemoveContainer" containerID="31d8452f5585fc2629ea7f1188ae519a78974f206d343adc5970193bc4c8ffbb"
Apr 16 19:19:25.716305 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:25.716289 2578 scope.go:117] "RemoveContainer" containerID="501853ddfc8ddd4676d5b98be30d2e9f9f8cf103acbf91dd1f4180ce37ffeb90"
Apr 16 19:19:25.716551 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:19:25.716533 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"501853ddfc8ddd4676d5b98be30d2e9f9f8cf103acbf91dd1f4180ce37ffeb90\": container with ID starting with 501853ddfc8ddd4676d5b98be30d2e9f9f8cf103acbf91dd1f4180ce37ffeb90 not found: ID does not exist" containerID="501853ddfc8ddd4676d5b98be30d2e9f9f8cf103acbf91dd1f4180ce37ffeb90"
Apr 16 19:19:25.716590 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:25.716561 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"501853ddfc8ddd4676d5b98be30d2e9f9f8cf103acbf91dd1f4180ce37ffeb90"} err="failed to get container status \"501853ddfc8ddd4676d5b98be30d2e9f9f8cf103acbf91dd1f4180ce37ffeb90\": rpc error: code = NotFound desc = could not find container \"501853ddfc8ddd4676d5b98be30d2e9f9f8cf103acbf91dd1f4180ce37ffeb90\": container with ID starting with 501853ddfc8ddd4676d5b98be30d2e9f9f8cf103acbf91dd1f4180ce37ffeb90 not found: ID does not exist"
Apr 16 19:19:25.716590 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:25.716579 2578 scope.go:117] "RemoveContainer" containerID="31d8452f5585fc2629ea7f1188ae519a78974f206d343adc5970193bc4c8ffbb"
Apr 16 19:19:25.716815 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:19:25.716798 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31d8452f5585fc2629ea7f1188ae519a78974f206d343adc5970193bc4c8ffbb\": container with ID starting with 31d8452f5585fc2629ea7f1188ae519a78974f206d343adc5970193bc4c8ffbb not found: ID does not exist" containerID="31d8452f5585fc2629ea7f1188ae519a78974f206d343adc5970193bc4c8ffbb"
Apr 16 19:19:25.716863 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:25.716821 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31d8452f5585fc2629ea7f1188ae519a78974f206d343adc5970193bc4c8ffbb"} err="failed to get container status \"31d8452f5585fc2629ea7f1188ae519a78974f206d343adc5970193bc4c8ffbb\": rpc error: code = NotFound desc = could not find container \"31d8452f5585fc2629ea7f1188ae519a78974f206d343adc5970193bc4c8ffbb\": container with ID starting with 31d8452f5585fc2629ea7f1188ae519a78974f206d343adc5970193bc4c8ffbb not found: ID does not exist"
Apr 16 19:19:25.720950 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:25.720929 2578 kubelet.go:2553] "SyncLoop DELETE" source="api"
pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl"] Apr 16 19:19:25.723843 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:25.723819 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-vgrgl"] Apr 16 19:19:25.842201 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:19:25.842147 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d7bd594-57e7-4363-879f-678e0838bd37" path="/var/lib/kubelet/pods/7d7bd594-57e7-4363-879f-678e0838bd37/volumes" Apr 16 19:20:42.668520 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.668482 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg"] Apr 16 19:20:42.669106 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.668995 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2c1c3bb-02ee-4569-93c3-a67943843d79" containerName="storage-initializer" Apr 16 19:20:42.669106 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.669016 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c1c3bb-02ee-4569-93c3-a67943843d79" containerName="storage-initializer" Apr 16 19:20:42.669106 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.669035 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d7bd594-57e7-4363-879f-678e0838bd37" containerName="kserve-container" Apr 16 19:20:42.669106 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.669043 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7bd594-57e7-4363-879f-678e0838bd37" containerName="kserve-container" Apr 16 19:20:42.669106 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.669061 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2c1c3bb-02ee-4569-93c3-a67943843d79" containerName="kserve-container" Apr 16 19:20:42.669106 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.669070 2578 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c2c1c3bb-02ee-4569-93c3-a67943843d79" containerName="kserve-container" Apr 16 19:20:42.669106 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.669087 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d7bd594-57e7-4363-879f-678e0838bd37" containerName="storage-initializer" Apr 16 19:20:42.669106 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.669095 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7bd594-57e7-4363-879f-678e0838bd37" containerName="storage-initializer" Apr 16 19:20:42.669580 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.669191 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d7bd594-57e7-4363-879f-678e0838bd37" containerName="kserve-container" Apr 16 19:20:42.669580 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.669207 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2c1c3bb-02ee-4569-93c3-a67943843d79" containerName="kserve-container" Apr 16 19:20:42.672373 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.672352 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" Apr 16 19:20:42.674856 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.674839 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z66mq\"" Apr 16 19:20:42.680013 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.679988 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg"] Apr 16 19:20:42.784818 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.784781 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31cc995b-4585-400a-90f8-2d419cddc50a-kserve-provision-location\") pod \"isvc-triton-predictor-5fc768bcf-d29wg\" (UID: \"31cc995b-4585-400a-90f8-2d419cddc50a\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" Apr 16 19:20:42.885883 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.885837 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31cc995b-4585-400a-90f8-2d419cddc50a-kserve-provision-location\") pod \"isvc-triton-predictor-5fc768bcf-d29wg\" (UID: \"31cc995b-4585-400a-90f8-2d419cddc50a\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" Apr 16 19:20:42.886283 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.886260 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31cc995b-4585-400a-90f8-2d419cddc50a-kserve-provision-location\") pod \"isvc-triton-predictor-5fc768bcf-d29wg\" (UID: \"31cc995b-4585-400a-90f8-2d419cddc50a\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" Apr 16 19:20:42.984271 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:42.984242 2578 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" Apr 16 19:20:43.107024 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:43.106998 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg"] Apr 16 19:20:43.109515 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:20:43.109487 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31cc995b_4585_400a_90f8_2d419cddc50a.slice/crio-46de4745c9f9d3e745f1b34e07b5c625c123f1cb788926c1b2cd65b6fd30dec9 WatchSource:0}: Error finding container 46de4745c9f9d3e745f1b34e07b5c625c123f1cb788926c1b2cd65b6fd30dec9: Status 404 returned error can't find the container with id 46de4745c9f9d3e745f1b34e07b5c625c123f1cb788926c1b2cd65b6fd30dec9 Apr 16 19:20:43.951552 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:43.951512 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" event={"ID":"31cc995b-4585-400a-90f8-2d419cddc50a","Type":"ContainerStarted","Data":"c1520d433d4adaa95814d067a3f2dc9a07b6196426712dd5bc2a10f730d27ee8"} Apr 16 19:20:43.951552 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:43.951557 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" event={"ID":"31cc995b-4585-400a-90f8-2d419cddc50a","Type":"ContainerStarted","Data":"46de4745c9f9d3e745f1b34e07b5c625c123f1cb788926c1b2cd65b6fd30dec9"} Apr 16 19:20:46.962448 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:46.962419 2578 generic.go:358] "Generic (PLEG): container finished" podID="31cc995b-4585-400a-90f8-2d419cddc50a" containerID="c1520d433d4adaa95814d067a3f2dc9a07b6196426712dd5bc2a10f730d27ee8" exitCode=0 Apr 16 19:20:46.962756 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:46.962497 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" event={"ID":"31cc995b-4585-400a-90f8-2d419cddc50a","Type":"ContainerDied","Data":"c1520d433d4adaa95814d067a3f2dc9a07b6196426712dd5bc2a10f730d27ee8"} Apr 16 19:20:47.598733 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:47.598708 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 19:20:47.603019 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:20:47.602997 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 19:22:41.425790 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:41.425751 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" event={"ID":"31cc995b-4585-400a-90f8-2d419cddc50a","Type":"ContainerStarted","Data":"a0061c82182c17d0dac21cd362e043d2a5e82b4b7855cb187fca117024d4f747"} Apr 16 19:22:41.426195 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:41.425990 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" Apr 16 19:22:41.427208 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:41.427164 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" podUID="31cc995b-4585-400a-90f8-2d419cddc50a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 16 19:22:41.442592 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:41.442547 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" podStartSLOduration=5.16654423 podStartE2EDuration="1m59.442536369s" podCreationTimestamp="2026-04-16 19:20:42 +0000 UTC" 
firstStartedPulling="2026-04-16 19:20:46.963573168 +0000 UTC m=+3023.750909401" lastFinishedPulling="2026-04-16 19:22:41.239565306 +0000 UTC m=+3138.026901540" observedRunningTime="2026-04-16 19:22:41.440578341 +0000 UTC m=+3138.227914598" watchObservedRunningTime="2026-04-16 19:22:41.442536369 +0000 UTC m=+3138.229872625" Apr 16 19:22:42.429852 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:42.429807 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" podUID="31cc995b-4585-400a-90f8-2d419cddc50a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 16 19:22:52.430933 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:52.430900 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" Apr 16 19:22:54.238688 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:54.238654 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg"] Apr 16 19:22:54.239057 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:54.239021 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" podUID="31cc995b-4585-400a-90f8-2d419cddc50a" containerName="kserve-container" containerID="cri-o://a0061c82182c17d0dac21cd362e043d2a5e82b4b7855cb187fca117024d4f747" gracePeriod=30 Apr 16 19:22:57.028780 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:57.028755 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" Apr 16 19:22:57.105112 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:57.105029 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31cc995b-4585-400a-90f8-2d419cddc50a-kserve-provision-location\") pod \"31cc995b-4585-400a-90f8-2d419cddc50a\" (UID: \"31cc995b-4585-400a-90f8-2d419cddc50a\") " Apr 16 19:22:57.105380 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:57.105357 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31cc995b-4585-400a-90f8-2d419cddc50a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "31cc995b-4585-400a-90f8-2d419cddc50a" (UID: "31cc995b-4585-400a-90f8-2d419cddc50a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:22:57.206469 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:57.206428 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31cc995b-4585-400a-90f8-2d419cddc50a-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:22:57.477599 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:57.477562 2578 generic.go:358] "Generic (PLEG): container finished" podID="31cc995b-4585-400a-90f8-2d419cddc50a" containerID="a0061c82182c17d0dac21cd362e043d2a5e82b4b7855cb187fca117024d4f747" exitCode=0 Apr 16 19:22:57.477765 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:57.477645 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" Apr 16 19:22:57.477765 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:57.477646 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" event={"ID":"31cc995b-4585-400a-90f8-2d419cddc50a","Type":"ContainerDied","Data":"a0061c82182c17d0dac21cd362e043d2a5e82b4b7855cb187fca117024d4f747"} Apr 16 19:22:57.477765 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:57.477685 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg" event={"ID":"31cc995b-4585-400a-90f8-2d419cddc50a","Type":"ContainerDied","Data":"46de4745c9f9d3e745f1b34e07b5c625c123f1cb788926c1b2cd65b6fd30dec9"} Apr 16 19:22:57.477765 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:57.477701 2578 scope.go:117] "RemoveContainer" containerID="a0061c82182c17d0dac21cd362e043d2a5e82b4b7855cb187fca117024d4f747" Apr 16 19:22:57.486688 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:57.486673 2578 scope.go:117] "RemoveContainer" containerID="c1520d433d4adaa95814d067a3f2dc9a07b6196426712dd5bc2a10f730d27ee8" Apr 16 19:22:57.493939 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:57.493922 2578 scope.go:117] "RemoveContainer" containerID="a0061c82182c17d0dac21cd362e043d2a5e82b4b7855cb187fca117024d4f747" Apr 16 19:22:57.494196 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:22:57.494153 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0061c82182c17d0dac21cd362e043d2a5e82b4b7855cb187fca117024d4f747\": container with ID starting with a0061c82182c17d0dac21cd362e043d2a5e82b4b7855cb187fca117024d4f747 not found: ID does not exist" containerID="a0061c82182c17d0dac21cd362e043d2a5e82b4b7855cb187fca117024d4f747" Apr 16 19:22:57.494271 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:57.494204 2578 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"a0061c82182c17d0dac21cd362e043d2a5e82b4b7855cb187fca117024d4f747"} err="failed to get container status \"a0061c82182c17d0dac21cd362e043d2a5e82b4b7855cb187fca117024d4f747\": rpc error: code = NotFound desc = could not find container \"a0061c82182c17d0dac21cd362e043d2a5e82b4b7855cb187fca117024d4f747\": container with ID starting with a0061c82182c17d0dac21cd362e043d2a5e82b4b7855cb187fca117024d4f747 not found: ID does not exist" Apr 16 19:22:57.494271 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:57.494224 2578 scope.go:117] "RemoveContainer" containerID="c1520d433d4adaa95814d067a3f2dc9a07b6196426712dd5bc2a10f730d27ee8" Apr 16 19:22:57.494468 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:22:57.494454 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1520d433d4adaa95814d067a3f2dc9a07b6196426712dd5bc2a10f730d27ee8\": container with ID starting with c1520d433d4adaa95814d067a3f2dc9a07b6196426712dd5bc2a10f730d27ee8 not found: ID does not exist" containerID="c1520d433d4adaa95814d067a3f2dc9a07b6196426712dd5bc2a10f730d27ee8" Apr 16 19:22:57.494518 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:57.494471 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1520d433d4adaa95814d067a3f2dc9a07b6196426712dd5bc2a10f730d27ee8"} err="failed to get container status \"c1520d433d4adaa95814d067a3f2dc9a07b6196426712dd5bc2a10f730d27ee8\": rpc error: code = NotFound desc = could not find container \"c1520d433d4adaa95814d067a3f2dc9a07b6196426712dd5bc2a10f730d27ee8\": container with ID starting with c1520d433d4adaa95814d067a3f2dc9a07b6196426712dd5bc2a10f730d27ee8 not found: ID does not exist" Apr 16 19:22:57.499139 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:57.499117 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg"] Apr 16 19:22:57.500582 ip-10-0-139-33 
kubenswrapper[2578]: I0416 19:22:57.500564 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-d29wg"] Apr 16 19:22:57.841536 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:22:57.841463 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31cc995b-4585-400a-90f8-2d419cddc50a" path="/var/lib/kubelet/pods/31cc995b-4585-400a-90f8-2d419cddc50a/volumes" Apr 16 19:24:24.564106 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:24.564028 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp"] Apr 16 19:24:24.564576 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:24.564391 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31cc995b-4585-400a-90f8-2d419cddc50a" containerName="storage-initializer" Apr 16 19:24:24.564576 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:24.564403 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="31cc995b-4585-400a-90f8-2d419cddc50a" containerName="storage-initializer" Apr 16 19:24:24.564576 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:24.564415 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31cc995b-4585-400a-90f8-2d419cddc50a" containerName="kserve-container" Apr 16 19:24:24.564576 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:24.564420 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="31cc995b-4585-400a-90f8-2d419cddc50a" containerName="kserve-container" Apr 16 19:24:24.564576 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:24.564477 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="31cc995b-4585-400a-90f8-2d419cddc50a" containerName="kserve-container" Apr 16 19:24:24.567339 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:24.567321 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" Apr 16 19:24:24.569644 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:24.569620 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z66mq\"" Apr 16 19:24:24.574407 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:24.574387 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp"] Apr 16 19:24:24.619931 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:24.619897 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6f278e6-4baa-4182-8cec-844fdff2d3aa-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp\" (UID: \"f6f278e6-4baa-4182-8cec-844fdff2d3aa\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" Apr 16 19:24:24.721393 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:24.721362 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6f278e6-4baa-4182-8cec-844fdff2d3aa-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp\" (UID: \"f6f278e6-4baa-4182-8cec-844fdff2d3aa\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" Apr 16 19:24:24.721725 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:24.721705 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6f278e6-4baa-4182-8cec-844fdff2d3aa-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp\" (UID: \"f6f278e6-4baa-4182-8cec-844fdff2d3aa\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" Apr 16 
19:24:24.877416 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:24.877307 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" Apr 16 19:24:25.000117 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:25.000088 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp"] Apr 16 19:24:25.002235 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:24:25.002203 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6f278e6_4baa_4182_8cec_844fdff2d3aa.slice/crio-2eb3036bc24889680959c5e7274f556b004c28cf53ec8c09e81c8c28a76ab94d WatchSource:0}: Error finding container 2eb3036bc24889680959c5e7274f556b004c28cf53ec8c09e81c8c28a76ab94d: Status 404 returned error can't find the container with id 2eb3036bc24889680959c5e7274f556b004c28cf53ec8c09e81c8c28a76ab94d Apr 16 19:24:25.004216 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:25.004200 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:24:25.768342 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:25.768302 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" event={"ID":"f6f278e6-4baa-4182-8cec-844fdff2d3aa","Type":"ContainerStarted","Data":"f7a307ad10d2c34e6a409a14408d8a85f4c3e6fd0c55375106ba99ef3e1ef654"} Apr 16 19:24:25.768342 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:25.768347 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" event={"ID":"f6f278e6-4baa-4182-8cec-844fdff2d3aa","Type":"ContainerStarted","Data":"2eb3036bc24889680959c5e7274f556b004c28cf53ec8c09e81c8c28a76ab94d"} Apr 16 19:24:29.787964 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:29.787927 2578 
generic.go:358] "Generic (PLEG): container finished" podID="f6f278e6-4baa-4182-8cec-844fdff2d3aa" containerID="f7a307ad10d2c34e6a409a14408d8a85f4c3e6fd0c55375106ba99ef3e1ef654" exitCode=0 Apr 16 19:24:29.788344 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:29.788006 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" event={"ID":"f6f278e6-4baa-4182-8cec-844fdff2d3aa","Type":"ContainerDied","Data":"f7a307ad10d2c34e6a409a14408d8a85f4c3e6fd0c55375106ba99ef3e1ef654"} Apr 16 19:24:30.792752 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:30.792675 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" event={"ID":"f6f278e6-4baa-4182-8cec-844fdff2d3aa","Type":"ContainerStarted","Data":"8b53b1667b2cb96249b306901f6e6e731051f496c9608543b8876eb940c3c368"} Apr 16 19:24:30.793101 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:30.792881 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" Apr 16 19:24:30.809709 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:24:30.809669 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" podStartSLOduration=6.809657312 podStartE2EDuration="6.809657312s" podCreationTimestamp="2026-04-16 19:24:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:24:30.807929475 +0000 UTC m=+3247.595265754" watchObservedRunningTime="2026-04-16 19:24:30.809657312 +0000 UTC m=+3247.596993638" Apr 16 19:25:01.876721 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:01.876681 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" 
Apr 16 19:25:04.797205 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:04.797148 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp"] Apr 16 19:25:04.797647 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:04.797469 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" podUID="f6f278e6-4baa-4182-8cec-844fdff2d3aa" containerName="kserve-container" containerID="cri-o://8b53b1667b2cb96249b306901f6e6e731051f496c9608543b8876eb940c3c368" gracePeriod=30 Apr 16 19:25:04.835647 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:04.835610 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm"] Apr 16 19:25:04.839569 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:04.839546 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" Apr 16 19:25:04.848251 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:04.848214 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm"] Apr 16 19:25:04.958988 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:04.958953 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a3e7319-d691-4ef3-a61b-0ef60560c20c-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm\" (UID: \"4a3e7319-d691-4ef3-a61b-0ef60560c20c\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" Apr 16 19:25:05.060140 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:05.060052 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/4a3e7319-d691-4ef3-a61b-0ef60560c20c-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm\" (UID: \"4a3e7319-d691-4ef3-a61b-0ef60560c20c\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" Apr 16 19:25:05.060443 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:05.060425 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a3e7319-d691-4ef3-a61b-0ef60560c20c-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm\" (UID: \"4a3e7319-d691-4ef3-a61b-0ef60560c20c\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" Apr 16 19:25:05.150697 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:05.150647 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" Apr 16 19:25:05.272975 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:05.272940 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm"] Apr 16 19:25:05.275514 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:25:05.275484 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a3e7319_d691_4ef3_a61b_0ef60560c20c.slice/crio-7aa9a2adf03707a96b28e29d2d0535e07f2512331ca896dc0796581cf0931fbe WatchSource:0}: Error finding container 7aa9a2adf03707a96b28e29d2d0535e07f2512331ca896dc0796581cf0931fbe: Status 404 returned error can't find the container with id 7aa9a2adf03707a96b28e29d2d0535e07f2512331ca896dc0796581cf0931fbe Apr 16 19:25:05.907231 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:05.907194 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" 
event={"ID":"4a3e7319-d691-4ef3-a61b-0ef60560c20c","Type":"ContainerStarted","Data":"0bd1414ec42586a5194a94d6bfaec5371c1c6778f000cbd36f73ec0d5dc834af"} Apr 16 19:25:05.907231 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:05.907236 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" event={"ID":"4a3e7319-d691-4ef3-a61b-0ef60560c20c","Type":"ContainerStarted","Data":"7aa9a2adf03707a96b28e29d2d0535e07f2512331ca896dc0796581cf0931fbe"} Apr 16 19:25:09.921478 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:09.921436 2578 generic.go:358] "Generic (PLEG): container finished" podID="4a3e7319-d691-4ef3-a61b-0ef60560c20c" containerID="0bd1414ec42586a5194a94d6bfaec5371c1c6778f000cbd36f73ec0d5dc834af" exitCode=0 Apr 16 19:25:09.922002 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:09.921504 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" event={"ID":"4a3e7319-d691-4ef3-a61b-0ef60560c20c","Type":"ContainerDied","Data":"0bd1414ec42586a5194a94d6bfaec5371c1c6778f000cbd36f73ec0d5dc834af"} Apr 16 19:25:10.925686 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:10.925646 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" event={"ID":"4a3e7319-d691-4ef3-a61b-0ef60560c20c","Type":"ContainerStarted","Data":"5dd3afb4c13e3a42058a22abe272d146e410932c149ed1677689c9e2b5fb2659"} Apr 16 19:25:10.926060 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:10.925859 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" Apr 16 19:25:10.944802 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:10.944743 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" podStartSLOduration=6.944731348 
podStartE2EDuration="6.944731348s" podCreationTimestamp="2026-04-16 19:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:25:10.942310993 +0000 UTC m=+3287.729647251" watchObservedRunningTime="2026-04-16 19:25:10.944731348 +0000 UTC m=+3287.732067678" Apr 16 19:25:11.797091 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:11.797047 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" podUID="f6f278e6-4baa-4182-8cec-844fdff2d3aa" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.57:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.134.0.57:8080: connect: connection refused" Apr 16 19:25:12.933307 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:12.933275 2578 generic.go:358] "Generic (PLEG): container finished" podID="f6f278e6-4baa-4182-8cec-844fdff2d3aa" containerID="8b53b1667b2cb96249b306901f6e6e731051f496c9608543b8876eb940c3c368" exitCode=0 Apr 16 19:25:12.933664 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:12.933351 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" event={"ID":"f6f278e6-4baa-4182-8cec-844fdff2d3aa","Type":"ContainerDied","Data":"8b53b1667b2cb96249b306901f6e6e731051f496c9608543b8876eb940c3c368"} Apr 16 19:25:12.948268 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:12.948246 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" Apr 16 19:25:13.022199 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:13.022150 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6f278e6-4baa-4182-8cec-844fdff2d3aa-kserve-provision-location\") pod \"f6f278e6-4baa-4182-8cec-844fdff2d3aa\" (UID: \"f6f278e6-4baa-4182-8cec-844fdff2d3aa\") " Apr 16 19:25:13.022465 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:13.022442 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6f278e6-4baa-4182-8cec-844fdff2d3aa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f6f278e6-4baa-4182-8cec-844fdff2d3aa" (UID: "f6f278e6-4baa-4182-8cec-844fdff2d3aa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:25:13.122737 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:13.122653 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6f278e6-4baa-4182-8cec-844fdff2d3aa-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:25:13.938091 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:13.938059 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" event={"ID":"f6f278e6-4baa-4182-8cec-844fdff2d3aa","Type":"ContainerDied","Data":"2eb3036bc24889680959c5e7274f556b004c28cf53ec8c09e81c8c28a76ab94d"} Apr 16 19:25:13.938091 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:13.938089 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp" Apr 16 19:25:13.938579 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:13.938104 2578 scope.go:117] "RemoveContainer" containerID="8b53b1667b2cb96249b306901f6e6e731051f496c9608543b8876eb940c3c368" Apr 16 19:25:13.946169 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:13.946153 2578 scope.go:117] "RemoveContainer" containerID="f7a307ad10d2c34e6a409a14408d8a85f4c3e6fd0c55375106ba99ef3e1ef654" Apr 16 19:25:13.954947 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:13.954927 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp"] Apr 16 19:25:13.957850 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:13.957828 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-c75zp"] Apr 16 19:25:15.841138 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:15.841106 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f278e6-4baa-4182-8cec-844fdff2d3aa" path="/var/lib/kubelet/pods/f6f278e6-4baa-4182-8cec-844fdff2d3aa/volumes" Apr 16 19:25:41.977969 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:41.977885 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" Apr 16 19:25:44.901515 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:44.901464 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm"] Apr 16 19:25:44.901923 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:44.901815 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" podUID="4a3e7319-d691-4ef3-a61b-0ef60560c20c" containerName="kserve-container" 
containerID="cri-o://5dd3afb4c13e3a42058a22abe272d146e410932c149ed1677689c9e2b5fb2659" gracePeriod=30 Apr 16 19:25:47.622247 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:47.622215 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 19:25:47.626578 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:47.626559 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 19:25:51.941142 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:51.941118 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" Apr 16 19:25:52.032827 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:52.032751 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a3e7319-d691-4ef3-a61b-0ef60560c20c-kserve-provision-location\") pod \"4a3e7319-d691-4ef3-a61b-0ef60560c20c\" (UID: \"4a3e7319-d691-4ef3-a61b-0ef60560c20c\") " Apr 16 19:25:52.033071 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:52.033049 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a3e7319-d691-4ef3-a61b-0ef60560c20c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4a3e7319-d691-4ef3-a61b-0ef60560c20c" (UID: "4a3e7319-d691-4ef3-a61b-0ef60560c20c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:25:52.059790 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:52.059764 2578 generic.go:358] "Generic (PLEG): container finished" podID="4a3e7319-d691-4ef3-a61b-0ef60560c20c" containerID="5dd3afb4c13e3a42058a22abe272d146e410932c149ed1677689c9e2b5fb2659" exitCode=0 Apr 16 19:25:52.059927 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:52.059832 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" Apr 16 19:25:52.059927 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:52.059844 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" event={"ID":"4a3e7319-d691-4ef3-a61b-0ef60560c20c","Type":"ContainerDied","Data":"5dd3afb4c13e3a42058a22abe272d146e410932c149ed1677689c9e2b5fb2659"} Apr 16 19:25:52.059927 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:52.059885 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" event={"ID":"4a3e7319-d691-4ef3-a61b-0ef60560c20c","Type":"ContainerDied","Data":"7aa9a2adf03707a96b28e29d2d0535e07f2512331ca896dc0796581cf0931fbe"} Apr 16 19:25:52.059927 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:52.059905 2578 scope.go:117] "RemoveContainer" containerID="5dd3afb4c13e3a42058a22abe272d146e410932c149ed1677689c9e2b5fb2659" Apr 16 19:25:52.068674 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:52.068658 2578 scope.go:117] "RemoveContainer" containerID="0bd1414ec42586a5194a94d6bfaec5371c1c6778f000cbd36f73ec0d5dc834af" Apr 16 19:25:52.075720 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:52.075704 2578 scope.go:117] "RemoveContainer" containerID="5dd3afb4c13e3a42058a22abe272d146e410932c149ed1677689c9e2b5fb2659" Apr 16 19:25:52.075965 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:25:52.075948 2578 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"5dd3afb4c13e3a42058a22abe272d146e410932c149ed1677689c9e2b5fb2659\": container with ID starting with 5dd3afb4c13e3a42058a22abe272d146e410932c149ed1677689c9e2b5fb2659 not found: ID does not exist" containerID="5dd3afb4c13e3a42058a22abe272d146e410932c149ed1677689c9e2b5fb2659" Apr 16 19:25:52.076011 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:52.075974 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dd3afb4c13e3a42058a22abe272d146e410932c149ed1677689c9e2b5fb2659"} err="failed to get container status \"5dd3afb4c13e3a42058a22abe272d146e410932c149ed1677689c9e2b5fb2659\": rpc error: code = NotFound desc = could not find container \"5dd3afb4c13e3a42058a22abe272d146e410932c149ed1677689c9e2b5fb2659\": container with ID starting with 5dd3afb4c13e3a42058a22abe272d146e410932c149ed1677689c9e2b5fb2659 not found: ID does not exist" Apr 16 19:25:52.076011 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:52.075989 2578 scope.go:117] "RemoveContainer" containerID="0bd1414ec42586a5194a94d6bfaec5371c1c6778f000cbd36f73ec0d5dc834af" Apr 16 19:25:52.076262 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:25:52.076237 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bd1414ec42586a5194a94d6bfaec5371c1c6778f000cbd36f73ec0d5dc834af\": container with ID starting with 0bd1414ec42586a5194a94d6bfaec5371c1c6778f000cbd36f73ec0d5dc834af not found: ID does not exist" containerID="0bd1414ec42586a5194a94d6bfaec5371c1c6778f000cbd36f73ec0d5dc834af" Apr 16 19:25:52.076311 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:52.076273 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bd1414ec42586a5194a94d6bfaec5371c1c6778f000cbd36f73ec0d5dc834af"} err="failed to get container status 
\"0bd1414ec42586a5194a94d6bfaec5371c1c6778f000cbd36f73ec0d5dc834af\": rpc error: code = NotFound desc = could not find container \"0bd1414ec42586a5194a94d6bfaec5371c1c6778f000cbd36f73ec0d5dc834af\": container with ID starting with 0bd1414ec42586a5194a94d6bfaec5371c1c6778f000cbd36f73ec0d5dc834af not found: ID does not exist" Apr 16 19:25:52.081540 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:52.081519 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm"] Apr 16 19:25:52.086365 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:52.086344 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm"] Apr 16 19:25:52.138829 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:52.138800 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a3e7319-d691-4ef3-a61b-0ef60560c20c-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:25:52.930027 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:52.929980 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-zfjkm" podUID="4a3e7319-d691-4ef3-a61b-0ef60560c20c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.58:8080/v2/models/xgboost-v2-mlserver/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 16 19:25:53.840905 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:25:53.840871 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a3e7319-d691-4ef3-a61b-0ef60560c20c" path="/var/lib/kubelet/pods/4a3e7319-d691-4ef3-a61b-0ef60560c20c/volumes" Apr 16 19:26:55.151232 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.151197 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts"] Apr 16 19:26:55.151675 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.151522 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a3e7319-d691-4ef3-a61b-0ef60560c20c" containerName="kserve-container" Apr 16 19:26:55.151675 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.151534 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a3e7319-d691-4ef3-a61b-0ef60560c20c" containerName="kserve-container" Apr 16 19:26:55.151675 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.151550 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6f278e6-4baa-4182-8cec-844fdff2d3aa" containerName="storage-initializer" Apr 16 19:26:55.151675 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.151556 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f278e6-4baa-4182-8cec-844fdff2d3aa" containerName="storage-initializer" Apr 16 19:26:55.151675 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.151569 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6f278e6-4baa-4182-8cec-844fdff2d3aa" containerName="kserve-container" Apr 16 19:26:55.151675 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.151574 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f278e6-4baa-4182-8cec-844fdff2d3aa" containerName="kserve-container" Apr 16 19:26:55.151675 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.151580 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a3e7319-d691-4ef3-a61b-0ef60560c20c" containerName="storage-initializer" Apr 16 19:26:55.151675 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.151585 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a3e7319-d691-4ef3-a61b-0ef60560c20c" containerName="storage-initializer" Apr 16 19:26:55.151675 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.151633 2578 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="f6f278e6-4baa-4182-8cec-844fdff2d3aa" containerName="kserve-container" Apr 16 19:26:55.151675 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.151647 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a3e7319-d691-4ef3-a61b-0ef60560c20c" containerName="kserve-container" Apr 16 19:26:55.156417 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.156392 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" Apr 16 19:26:55.159926 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.159900 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z66mq\"" Apr 16 19:26:55.164278 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.164252 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts"] Apr 16 19:26:55.258930 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.258897 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78d4c02b-6a51-467a-bac2-a2ab6649eb4e-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-d58974886-vlcts\" (UID: \"78d4c02b-6a51-467a-bac2-a2ab6649eb4e\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" Apr 16 19:26:55.360344 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.360299 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78d4c02b-6a51-467a-bac2-a2ab6649eb4e-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-d58974886-vlcts\" (UID: \"78d4c02b-6a51-467a-bac2-a2ab6649eb4e\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" Apr 16 19:26:55.360670 
ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.360652 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78d4c02b-6a51-467a-bac2-a2ab6649eb4e-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-d58974886-vlcts\" (UID: \"78d4c02b-6a51-467a-bac2-a2ab6649eb4e\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" Apr 16 19:26:55.470140 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.470113 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" Apr 16 19:26:55.589978 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:55.589902 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts"] Apr 16 19:26:55.592332 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:26:55.592307 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78d4c02b_6a51_467a_bac2_a2ab6649eb4e.slice/crio-6fdc70c4238ea974f622b663e19074b097c0749a72545b90625d94139e125dc5 WatchSource:0}: Error finding container 6fdc70c4238ea974f622b663e19074b097c0749a72545b90625d94139e125dc5: Status 404 returned error can't find the container with id 6fdc70c4238ea974f622b663e19074b097c0749a72545b90625d94139e125dc5 Apr 16 19:26:56.264340 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:56.264300 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" event={"ID":"78d4c02b-6a51-467a-bac2-a2ab6649eb4e","Type":"ContainerStarted","Data":"1a36dc65a9a1032beff76529aa147c37326e802fee96a720a1c4f829f6c96ac9"} Apr 16 19:26:56.264340 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:26:56.264339 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" event={"ID":"78d4c02b-6a51-467a-bac2-a2ab6649eb4e","Type":"ContainerStarted","Data":"6fdc70c4238ea974f622b663e19074b097c0749a72545b90625d94139e125dc5"} Apr 16 19:27:00.279471 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:00.279442 2578 generic.go:358] "Generic (PLEG): container finished" podID="78d4c02b-6a51-467a-bac2-a2ab6649eb4e" containerID="1a36dc65a9a1032beff76529aa147c37326e802fee96a720a1c4f829f6c96ac9" exitCode=0 Apr 16 19:27:00.279847 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:00.279514 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" event={"ID":"78d4c02b-6a51-467a-bac2-a2ab6649eb4e","Type":"ContainerDied","Data":"1a36dc65a9a1032beff76529aa147c37326e802fee96a720a1c4f829f6c96ac9"} Apr 16 19:27:01.284410 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:01.284374 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" event={"ID":"78d4c02b-6a51-467a-bac2-a2ab6649eb4e","Type":"ContainerStarted","Data":"003112a7a65397130c2c9eef27346b062b08c2ff4bcc0d0540c3249167ecd4e8"} Apr 16 19:27:01.284817 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:01.284587 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" Apr 16 19:27:01.301395 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:01.301347 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" podStartSLOduration=6.3013320329999996 podStartE2EDuration="6.301332033s" podCreationTimestamp="2026-04-16 19:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:27:01.299311579 +0000 UTC m=+3398.086647835" 
watchObservedRunningTime="2026-04-16 19:27:01.301332033 +0000 UTC m=+3398.088668287" Apr 16 19:27:32.378132 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:32.378087 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" podUID="78d4c02b-6a51-467a-bac2-a2ab6649eb4e" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 19:27:42.292993 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:42.292960 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" Apr 16 19:27:45.258062 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:45.258025 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts"] Apr 16 19:27:45.258572 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:45.258516 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" podUID="78d4c02b-6a51-467a-bac2-a2ab6649eb4e" containerName="kserve-container" containerID="cri-o://003112a7a65397130c2c9eef27346b062b08c2ff4bcc0d0540c3249167ecd4e8" gracePeriod=30 Apr 16 19:27:52.291238 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:52.291191 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" podUID="78d4c02b-6a51-467a-bac2-a2ab6649eb4e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.59:8080/v2/models/isvc-xgboost-v2-runtime/ready\": dial tcp 10.134.0.59:8080: connect: connection refused" Apr 16 19:27:52.601895 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:52.601868 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" Apr 16 19:27:52.725113 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:52.725082 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78d4c02b-6a51-467a-bac2-a2ab6649eb4e-kserve-provision-location\") pod \"78d4c02b-6a51-467a-bac2-a2ab6649eb4e\" (UID: \"78d4c02b-6a51-467a-bac2-a2ab6649eb4e\") " Apr 16 19:27:52.725403 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:52.725379 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78d4c02b-6a51-467a-bac2-a2ab6649eb4e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "78d4c02b-6a51-467a-bac2-a2ab6649eb4e" (UID: "78d4c02b-6a51-467a-bac2-a2ab6649eb4e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:27:52.826678 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:52.826643 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78d4c02b-6a51-467a-bac2-a2ab6649eb4e-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:27:53.453677 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:53.453640 2578 generic.go:358] "Generic (PLEG): container finished" podID="78d4c02b-6a51-467a-bac2-a2ab6649eb4e" containerID="003112a7a65397130c2c9eef27346b062b08c2ff4bcc0d0540c3249167ecd4e8" exitCode=0 Apr 16 19:27:53.454080 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:53.453718 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" Apr 16 19:27:53.454080 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:53.453722 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" event={"ID":"78d4c02b-6a51-467a-bac2-a2ab6649eb4e","Type":"ContainerDied","Data":"003112a7a65397130c2c9eef27346b062b08c2ff4bcc0d0540c3249167ecd4e8"} Apr 16 19:27:53.454080 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:53.453767 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts" event={"ID":"78d4c02b-6a51-467a-bac2-a2ab6649eb4e","Type":"ContainerDied","Data":"6fdc70c4238ea974f622b663e19074b097c0749a72545b90625d94139e125dc5"} Apr 16 19:27:53.454080 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:53.453783 2578 scope.go:117] "RemoveContainer" containerID="003112a7a65397130c2c9eef27346b062b08c2ff4bcc0d0540c3249167ecd4e8" Apr 16 19:27:53.464630 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:53.464614 2578 scope.go:117] "RemoveContainer" containerID="1a36dc65a9a1032beff76529aa147c37326e802fee96a720a1c4f829f6c96ac9" Apr 16 19:27:53.471576 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:53.471555 2578 scope.go:117] "RemoveContainer" containerID="003112a7a65397130c2c9eef27346b062b08c2ff4bcc0d0540c3249167ecd4e8" Apr 16 19:27:53.471798 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:27:53.471778 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"003112a7a65397130c2c9eef27346b062b08c2ff4bcc0d0540c3249167ecd4e8\": container with ID starting with 003112a7a65397130c2c9eef27346b062b08c2ff4bcc0d0540c3249167ecd4e8 not found: ID does not exist" containerID="003112a7a65397130c2c9eef27346b062b08c2ff4bcc0d0540c3249167ecd4e8" Apr 16 19:27:53.471893 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:53.471810 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"003112a7a65397130c2c9eef27346b062b08c2ff4bcc0d0540c3249167ecd4e8"} err="failed to get container status \"003112a7a65397130c2c9eef27346b062b08c2ff4bcc0d0540c3249167ecd4e8\": rpc error: code = NotFound desc = could not find container \"003112a7a65397130c2c9eef27346b062b08c2ff4bcc0d0540c3249167ecd4e8\": container with ID starting with 003112a7a65397130c2c9eef27346b062b08c2ff4bcc0d0540c3249167ecd4e8 not found: ID does not exist" Apr 16 19:27:53.471893 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:53.471853 2578 scope.go:117] "RemoveContainer" containerID="1a36dc65a9a1032beff76529aa147c37326e802fee96a720a1c4f829f6c96ac9" Apr 16 19:27:53.472109 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:27:53.472092 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a36dc65a9a1032beff76529aa147c37326e802fee96a720a1c4f829f6c96ac9\": container with ID starting with 1a36dc65a9a1032beff76529aa147c37326e802fee96a720a1c4f829f6c96ac9 not found: ID does not exist" containerID="1a36dc65a9a1032beff76529aa147c37326e802fee96a720a1c4f829f6c96ac9" Apr 16 19:27:53.472148 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:53.472116 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a36dc65a9a1032beff76529aa147c37326e802fee96a720a1c4f829f6c96ac9"} err="failed to get container status \"1a36dc65a9a1032beff76529aa147c37326e802fee96a720a1c4f829f6c96ac9\": rpc error: code = NotFound desc = could not find container \"1a36dc65a9a1032beff76529aa147c37326e802fee96a720a1c4f829f6c96ac9\": container with ID starting with 1a36dc65a9a1032beff76529aa147c37326e802fee96a720a1c4f829f6c96ac9 not found: ID does not exist" Apr 16 19:27:53.476252 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:53.476229 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts"] Apr 16 19:27:53.479248 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:53.479224 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-vlcts"] Apr 16 19:27:53.841405 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:27:53.841369 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78d4c02b-6a51-467a-bac2-a2ab6649eb4e" path="/var/lib/kubelet/pods/78d4c02b-6a51-467a-bac2-a2ab6649eb4e/volumes" Apr 16 19:28:55.486605 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:55.486525 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt"] Apr 16 19:28:55.487041 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:55.486864 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78d4c02b-6a51-467a-bac2-a2ab6649eb4e" containerName="kserve-container" Apr 16 19:28:55.487041 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:55.486874 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d4c02b-6a51-467a-bac2-a2ab6649eb4e" containerName="kserve-container" Apr 16 19:28:55.487041 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:55.486893 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78d4c02b-6a51-467a-bac2-a2ab6649eb4e" containerName="storage-initializer" Apr 16 19:28:55.487041 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:55.486898 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d4c02b-6a51-467a-bac2-a2ab6649eb4e" containerName="storage-initializer" Apr 16 19:28:55.487041 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:55.486947 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="78d4c02b-6a51-467a-bac2-a2ab6649eb4e" containerName="kserve-container" Apr 16 19:28:55.489964 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:55.489946 2578 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" Apr 16 19:28:55.492376 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:55.492356 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 16 19:28:55.492464 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:55.492356 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z66mq\"" Apr 16 19:28:55.496588 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:55.496565 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt"] Apr 16 19:28:55.633002 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:55.632970 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac75e52e-5dcc-49cf-9451-ce0b39dbe94f-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt\" (UID: \"ac75e52e-5dcc-49cf-9451-ce0b39dbe94f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" Apr 16 19:28:55.734027 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:55.733994 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac75e52e-5dcc-49cf-9451-ce0b39dbe94f-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt\" (UID: \"ac75e52e-5dcc-49cf-9451-ce0b39dbe94f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" Apr 16 19:28:55.734350 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:55.734331 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac75e52e-5dcc-49cf-9451-ce0b39dbe94f-kserve-provision-location\") pod 
\"isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt\" (UID: \"ac75e52e-5dcc-49cf-9451-ce0b39dbe94f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" Apr 16 19:28:55.800423 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:55.800351 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" Apr 16 19:28:55.918976 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:55.918947 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt"] Apr 16 19:28:55.922096 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:28:55.922065 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac75e52e_5dcc_49cf_9451_ce0b39dbe94f.slice/crio-e77e135d566a1378100ec343e400f4aa73733e3e72b66ec1e6884ee7b8c9ff70 WatchSource:0}: Error finding container e77e135d566a1378100ec343e400f4aa73733e3e72b66ec1e6884ee7b8c9ff70: Status 404 returned error can't find the container with id e77e135d566a1378100ec343e400f4aa73733e3e72b66ec1e6884ee7b8c9ff70 Apr 16 19:28:56.659873 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:56.659838 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" event={"ID":"ac75e52e-5dcc-49cf-9451-ce0b39dbe94f","Type":"ContainerStarted","Data":"597c2d4af93eac27c858659a7616e9a21ab607329fbc25758db336e0424301dc"} Apr 16 19:28:56.659873 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:56.659874 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" event={"ID":"ac75e52e-5dcc-49cf-9451-ce0b39dbe94f","Type":"ContainerStarted","Data":"e77e135d566a1378100ec343e400f4aa73733e3e72b66ec1e6884ee7b8c9ff70"} Apr 16 19:28:57.665217 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:57.665167 2578 generic.go:358] "Generic (PLEG): container 
finished" podID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" containerID="597c2d4af93eac27c858659a7616e9a21ab607329fbc25758db336e0424301dc" exitCode=0 Apr 16 19:28:57.665706 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:57.665249 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" event={"ID":"ac75e52e-5dcc-49cf-9451-ce0b39dbe94f","Type":"ContainerDied","Data":"597c2d4af93eac27c858659a7616e9a21ab607329fbc25758db336e0424301dc"} Apr 16 19:28:58.670632 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:58.670592 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" event={"ID":"ac75e52e-5dcc-49cf-9451-ce0b39dbe94f","Type":"ContainerStarted","Data":"389a914aa4b1d5d8e04689f0c53afbf807442d86172a8c5cb534b42fe737a57d"} Apr 16 19:28:58.671116 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:58.670812 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" Apr 16 19:28:58.672214 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:58.672164 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" podUID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 16 19:28:58.686458 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:58.686416 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" podStartSLOduration=3.686400805 podStartE2EDuration="3.686400805s" podCreationTimestamp="2026-04-16 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:28:58.685601905 +0000 UTC m=+3515.472938164" 
watchObservedRunningTime="2026-04-16 19:28:58.686400805 +0000 UTC m=+3515.473737062" Apr 16 19:28:59.674910 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:28:59.674873 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" podUID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 16 19:29:09.675254 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:29:09.675204 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" podUID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 16 19:29:19.675325 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:29:19.675279 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" podUID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 16 19:29:29.675015 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:29:29.674975 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" podUID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 16 19:29:39.675400 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:29:39.675355 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" podUID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 16 19:29:49.675455 ip-10-0-139-33 kubenswrapper[2578]: 
I0416 19:29:49.675401 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" podUID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 16 19:29:59.675494 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:29:59.675441 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" podUID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 16 19:30:06.838297 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:06.838266 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" Apr 16 19:30:15.637671 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:15.637584 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt"] Apr 16 19:30:15.638162 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:15.637913 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" podUID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" containerName="kserve-container" containerID="cri-o://389a914aa4b1d5d8e04689f0c53afbf807442d86172a8c5cb534b42fe737a57d" gracePeriod=30 Apr 16 19:30:15.730221 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:15.730187 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv"] Apr 16 19:30:15.733542 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:15.733520 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" Apr 16 19:30:15.736056 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:15.736039 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 19:30:15.741664 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:15.741638 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv"] Apr 16 19:30:15.792809 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:15.792779 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7794452e-2cc1-4a14-80b6-8abfcfe5d349-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv\" (UID: \"7794452e-2cc1-4a14-80b6-8abfcfe5d349\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" Apr 16 19:30:15.792931 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:15.792829 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7794452e-2cc1-4a14-80b6-8abfcfe5d349-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv\" (UID: \"7794452e-2cc1-4a14-80b6-8abfcfe5d349\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" Apr 16 19:30:15.893726 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:15.893641 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7794452e-2cc1-4a14-80b6-8abfcfe5d349-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv\" (UID: \"7794452e-2cc1-4a14-80b6-8abfcfe5d349\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" Apr 16 19:30:15.893726 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:15.893708 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7794452e-2cc1-4a14-80b6-8abfcfe5d349-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv\" (UID: \"7794452e-2cc1-4a14-80b6-8abfcfe5d349\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" Apr 16 19:30:15.894090 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:15.894065 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7794452e-2cc1-4a14-80b6-8abfcfe5d349-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv\" (UID: \"7794452e-2cc1-4a14-80b6-8abfcfe5d349\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" Apr 16 19:30:15.894316 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:15.894299 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7794452e-2cc1-4a14-80b6-8abfcfe5d349-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv\" (UID: \"7794452e-2cc1-4a14-80b6-8abfcfe5d349\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" Apr 16 19:30:16.044567 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:16.044536 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" Apr 16 19:30:16.169146 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:16.169062 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv"] Apr 16 19:30:16.172065 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:30:16.172034 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7794452e_2cc1_4a14_80b6_8abfcfe5d349.slice/crio-51f4efa21c86c9e4b5876aa353814c2fc50caa1d28d8c81118d0bd3a91043604 WatchSource:0}: Error finding container 51f4efa21c86c9e4b5876aa353814c2fc50caa1d28d8c81118d0bd3a91043604: Status 404 returned error can't find the container with id 51f4efa21c86c9e4b5876aa353814c2fc50caa1d28d8c81118d0bd3a91043604 Apr 16 19:30:16.174076 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:16.174055 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:30:16.837362 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:16.837318 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" podUID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 16 19:30:16.923022 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:16.922987 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" event={"ID":"7794452e-2cc1-4a14-80b6-8abfcfe5d349","Type":"ContainerStarted","Data":"9541216affd5c2e2a61f0976ab88e8544d6d8c79d297324e1bd69bce2d6b8d29"} Apr 16 19:30:16.923022 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:16.923026 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" event={"ID":"7794452e-2cc1-4a14-80b6-8abfcfe5d349","Type":"ContainerStarted","Data":"51f4efa21c86c9e4b5876aa353814c2fc50caa1d28d8c81118d0bd3a91043604"} Apr 16 19:30:17.927558 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:17.927522 2578 generic.go:358] "Generic (PLEG): container finished" podID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" containerID="9541216affd5c2e2a61f0976ab88e8544d6d8c79d297324e1bd69bce2d6b8d29" exitCode=0 Apr 16 19:30:17.927968 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:17.927603 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" event={"ID":"7794452e-2cc1-4a14-80b6-8abfcfe5d349","Type":"ContainerDied","Data":"9541216affd5c2e2a61f0976ab88e8544d6d8c79d297324e1bd69bce2d6b8d29"} Apr 16 19:30:18.932425 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:18.932392 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" event={"ID":"7794452e-2cc1-4a14-80b6-8abfcfe5d349","Type":"ContainerStarted","Data":"f8deb29ad10da8be62247b61e24df5073f3f566adb8bbc01f8b533432a700f9c"} Apr 16 19:30:18.932858 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:18.932618 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" Apr 16 19:30:18.933993 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:18.933968 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" podUID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 16 19:30:18.949296 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:18.949251 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" podStartSLOduration=3.949237935 podStartE2EDuration="3.949237935s" podCreationTimestamp="2026-04-16 19:30:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:30:18.947787915 +0000 UTC m=+3595.735124173" watchObservedRunningTime="2026-04-16 19:30:18.949237935 +0000 UTC m=+3595.736574190" Apr 16 19:30:19.792463 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:19.792441 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" Apr 16 19:30:19.819290 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:19.819266 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac75e52e-5dcc-49cf-9451-ce0b39dbe94f-kserve-provision-location\") pod \"ac75e52e-5dcc-49cf-9451-ce0b39dbe94f\" (UID: \"ac75e52e-5dcc-49cf-9451-ce0b39dbe94f\") " Apr 16 19:30:19.819549 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:19.819525 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac75e52e-5dcc-49cf-9451-ce0b39dbe94f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" (UID: "ac75e52e-5dcc-49cf-9451-ce0b39dbe94f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:30:19.919784 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:19.919753 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac75e52e-5dcc-49cf-9451-ce0b39dbe94f-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:30:19.936793 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:19.936767 2578 generic.go:358] "Generic (PLEG): container finished" podID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" containerID="389a914aa4b1d5d8e04689f0c53afbf807442d86172a8c5cb534b42fe737a57d" exitCode=0 Apr 16 19:30:19.937207 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:19.936835 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" Apr 16 19:30:19.937207 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:19.936861 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" event={"ID":"ac75e52e-5dcc-49cf-9451-ce0b39dbe94f","Type":"ContainerDied","Data":"389a914aa4b1d5d8e04689f0c53afbf807442d86172a8c5cb534b42fe737a57d"} Apr 16 19:30:19.937207 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:19.936906 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt" event={"ID":"ac75e52e-5dcc-49cf-9451-ce0b39dbe94f","Type":"ContainerDied","Data":"e77e135d566a1378100ec343e400f4aa73733e3e72b66ec1e6884ee7b8c9ff70"} Apr 16 19:30:19.937207 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:19.936931 2578 scope.go:117] "RemoveContainer" containerID="389a914aa4b1d5d8e04689f0c53afbf807442d86172a8c5cb534b42fe737a57d" Apr 16 19:30:19.937412 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:19.937286 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" podUID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 16 19:30:19.944731 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:19.944714 2578 scope.go:117] "RemoveContainer" containerID="597c2d4af93eac27c858659a7616e9a21ab607329fbc25758db336e0424301dc" Apr 16 19:30:19.951629 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:19.951610 2578 scope.go:117] "RemoveContainer" containerID="389a914aa4b1d5d8e04689f0c53afbf807442d86172a8c5cb534b42fe737a57d" Apr 16 19:30:19.951883 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:30:19.951866 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"389a914aa4b1d5d8e04689f0c53afbf807442d86172a8c5cb534b42fe737a57d\": container with ID starting with 389a914aa4b1d5d8e04689f0c53afbf807442d86172a8c5cb534b42fe737a57d not found: ID does not exist" containerID="389a914aa4b1d5d8e04689f0c53afbf807442d86172a8c5cb534b42fe737a57d" Apr 16 19:30:19.951931 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:19.951894 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"389a914aa4b1d5d8e04689f0c53afbf807442d86172a8c5cb534b42fe737a57d"} err="failed to get container status \"389a914aa4b1d5d8e04689f0c53afbf807442d86172a8c5cb534b42fe737a57d\": rpc error: code = NotFound desc = could not find container \"389a914aa4b1d5d8e04689f0c53afbf807442d86172a8c5cb534b42fe737a57d\": container with ID starting with 389a914aa4b1d5d8e04689f0c53afbf807442d86172a8c5cb534b42fe737a57d not found: ID does not exist" Apr 16 19:30:19.951931 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:19.951913 2578 scope.go:117] "RemoveContainer" containerID="597c2d4af93eac27c858659a7616e9a21ab607329fbc25758db336e0424301dc" Apr 16 19:30:19.952101 ip-10-0-139-33 kubenswrapper[2578]: E0416 
19:30:19.952083 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"597c2d4af93eac27c858659a7616e9a21ab607329fbc25758db336e0424301dc\": container with ID starting with 597c2d4af93eac27c858659a7616e9a21ab607329fbc25758db336e0424301dc not found: ID does not exist" containerID="597c2d4af93eac27c858659a7616e9a21ab607329fbc25758db336e0424301dc" Apr 16 19:30:19.952139 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:19.952105 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"597c2d4af93eac27c858659a7616e9a21ab607329fbc25758db336e0424301dc"} err="failed to get container status \"597c2d4af93eac27c858659a7616e9a21ab607329fbc25758db336e0424301dc\": rpc error: code = NotFound desc = could not find container \"597c2d4af93eac27c858659a7616e9a21ab607329fbc25758db336e0424301dc\": container with ID starting with 597c2d4af93eac27c858659a7616e9a21ab607329fbc25758db336e0424301dc not found: ID does not exist" Apr 16 19:30:19.957098 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:19.957078 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt"] Apr 16 19:30:19.960647 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:19.960627 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-s5fxt"] Apr 16 19:30:21.841588 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:21.841547 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" path="/var/lib/kubelet/pods/ac75e52e-5dcc-49cf-9451-ce0b39dbe94f/volumes" Apr 16 19:30:29.938154 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:29.938108 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" podUID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 16 19:30:39.938007 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:39.937962 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" podUID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 16 19:30:47.643873 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:47.643829 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 19:30:47.651627 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:47.651604 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log" Apr 16 19:30:49.937621 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:49.937580 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" podUID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 16 19:30:59.937834 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:30:59.937788 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" podUID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 16 19:31:09.937525 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:09.937465 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" 
podUID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 16 19:31:19.937992 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:19.937950 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" podUID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 16 19:31:26.838342 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:26.838307 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" Apr 16 19:31:35.790225 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:35.790195 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv"] Apr 16 19:31:35.790596 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:35.790451 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" podUID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" containerName="kserve-container" containerID="cri-o://f8deb29ad10da8be62247b61e24df5073f3f566adb8bbc01f8b533432a700f9c" gracePeriod=30 Apr 16 19:31:36.838276 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:36.838227 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" podUID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 16 19:31:36.857651 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:36.857619 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx"] Apr 16 19:31:36.858130 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:36.858113 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" containerName="kserve-container" Apr 16 19:31:36.858200 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:36.858134 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" containerName="kserve-container" Apr 16 19:31:36.858200 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:36.858154 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" containerName="storage-initializer" Apr 16 19:31:36.858200 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:36.858163 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" containerName="storage-initializer" Apr 16 19:31:36.858329 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:36.858271 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac75e52e-5dcc-49cf-9451-ce0b39dbe94f" containerName="kserve-container" Apr 16 19:31:36.861513 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:36.861479 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx" Apr 16 19:31:36.867565 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:36.867543 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx"] Apr 16 19:31:36.943709 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:36.943676 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a814b853-738d-492e-ae2e-b2b124ec8caa-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx\" (UID: \"a814b853-738d-492e-ae2e-b2b124ec8caa\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx" Apr 16 19:31:37.044624 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:37.044586 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a814b853-738d-492e-ae2e-b2b124ec8caa-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx\" (UID: \"a814b853-738d-492e-ae2e-b2b124ec8caa\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx" Apr 16 19:31:37.045005 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:37.044979 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a814b853-738d-492e-ae2e-b2b124ec8caa-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx\" (UID: \"a814b853-738d-492e-ae2e-b2b124ec8caa\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx" Apr 16 19:31:37.173162 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:37.173073 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx" Apr 16 19:31:37.295285 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:37.295255 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx"] Apr 16 19:31:37.297740 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:31:37.297709 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda814b853_738d_492e_ae2e_b2b124ec8caa.slice/crio-c575137c66f1b2f9488db8607271d4bb55122d19767b7f5353dd2d8f992d303f WatchSource:0}: Error finding container c575137c66f1b2f9488db8607271d4bb55122d19767b7f5353dd2d8f992d303f: Status 404 returned error can't find the container with id c575137c66f1b2f9488db8607271d4bb55122d19767b7f5353dd2d8f992d303f Apr 16 19:31:38.198156 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:38.198123 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx" event={"ID":"a814b853-738d-492e-ae2e-b2b124ec8caa","Type":"ContainerStarted","Data":"7fb97fb90dd03825e46809920e73172eeaf68d5b85c4d7d986edc05326e3fc38"} Apr 16 19:31:38.198156 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:38.198156 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx" event={"ID":"a814b853-738d-492e-ae2e-b2b124ec8caa","Type":"ContainerStarted","Data":"c575137c66f1b2f9488db8607271d4bb55122d19767b7f5353dd2d8f992d303f"} Apr 16 19:31:40.207875 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:40.207849 2578 generic.go:358] "Generic (PLEG): container finished" podID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" containerID="f8deb29ad10da8be62247b61e24df5073f3f566adb8bbc01f8b533432a700f9c" exitCode=0 Apr 16 19:31:40.208245 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:40.207917 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" event={"ID":"7794452e-2cc1-4a14-80b6-8abfcfe5d349","Type":"ContainerDied","Data":"f8deb29ad10da8be62247b61e24df5073f3f566adb8bbc01f8b533432a700f9c"} Apr 16 19:31:40.234711 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:40.234691 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" Apr 16 19:31:40.268427 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:40.268397 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7794452e-2cc1-4a14-80b6-8abfcfe5d349-cabundle-cert\") pod \"7794452e-2cc1-4a14-80b6-8abfcfe5d349\" (UID: \"7794452e-2cc1-4a14-80b6-8abfcfe5d349\") " Apr 16 19:31:40.268597 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:40.268587 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7794452e-2cc1-4a14-80b6-8abfcfe5d349-kserve-provision-location\") pod \"7794452e-2cc1-4a14-80b6-8abfcfe5d349\" (UID: \"7794452e-2cc1-4a14-80b6-8abfcfe5d349\") " Apr 16 19:31:40.268757 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:40.268734 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7794452e-2cc1-4a14-80b6-8abfcfe5d349-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "7794452e-2cc1-4a14-80b6-8abfcfe5d349" (UID: "7794452e-2cc1-4a14-80b6-8abfcfe5d349"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:31:40.268964 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:40.268942 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7794452e-2cc1-4a14-80b6-8abfcfe5d349-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7794452e-2cc1-4a14-80b6-8abfcfe5d349" (UID: "7794452e-2cc1-4a14-80b6-8abfcfe5d349"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:31:40.369929 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:40.369797 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7794452e-2cc1-4a14-80b6-8abfcfe5d349-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:31:40.369929 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:40.369836 2578 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7794452e-2cc1-4a14-80b6-8abfcfe5d349-cabundle-cert\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:31:41.212972 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:41.212941 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" event={"ID":"7794452e-2cc1-4a14-80b6-8abfcfe5d349","Type":"ContainerDied","Data":"51f4efa21c86c9e4b5876aa353814c2fc50caa1d28d8c81118d0bd3a91043604"} Apr 16 19:31:41.213436 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:41.212988 2578 scope.go:117] "RemoveContainer" containerID="f8deb29ad10da8be62247b61e24df5073f3f566adb8bbc01f8b533432a700f9c" Apr 16 19:31:41.213436 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:41.212993 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv" Apr 16 19:31:41.221356 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:41.221331 2578 scope.go:117] "RemoveContainer" containerID="9541216affd5c2e2a61f0976ab88e8544d6d8c79d297324e1bd69bce2d6b8d29" Apr 16 19:31:41.234286 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:41.234257 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv"] Apr 16 19:31:41.237481 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:41.237459 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-mtqlv"] Apr 16 19:31:41.841265 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:41.841236 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" path="/var/lib/kubelet/pods/7794452e-2cc1-4a14-80b6-8abfcfe5d349/volumes" Apr 16 19:31:42.218160 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:42.218134 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx_a814b853-738d-492e-ae2e-b2b124ec8caa/storage-initializer/0.log" Apr 16 19:31:42.218552 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:42.218172 2578 generic.go:358] "Generic (PLEG): container finished" podID="a814b853-738d-492e-ae2e-b2b124ec8caa" containerID="7fb97fb90dd03825e46809920e73172eeaf68d5b85c4d7d986edc05326e3fc38" exitCode=1 Apr 16 19:31:42.218552 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:42.218203 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx" event={"ID":"a814b853-738d-492e-ae2e-b2b124ec8caa","Type":"ContainerDied","Data":"7fb97fb90dd03825e46809920e73172eeaf68d5b85c4d7d986edc05326e3fc38"} Apr 16 19:31:43.224275 ip-10-0-139-33 
kubenswrapper[2578]: I0416 19:31:43.224246 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx_a814b853-738d-492e-ae2e-b2b124ec8caa/storage-initializer/0.log" Apr 16 19:31:43.224682 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:43.224288 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx" event={"ID":"a814b853-738d-492e-ae2e-b2b124ec8caa","Type":"ContainerStarted","Data":"e2caed2ef0f9a4281761560957d11601e30bd71efdd8be1f089ad4cfe6a9b149"} Apr 16 19:31:45.231403 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:45.231374 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx_a814b853-738d-492e-ae2e-b2b124ec8caa/storage-initializer/1.log" Apr 16 19:31:45.231822 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:45.231777 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx_a814b853-738d-492e-ae2e-b2b124ec8caa/storage-initializer/0.log" Apr 16 19:31:45.231822 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:45.231809 2578 generic.go:358] "Generic (PLEG): container finished" podID="a814b853-738d-492e-ae2e-b2b124ec8caa" containerID="e2caed2ef0f9a4281761560957d11601e30bd71efdd8be1f089ad4cfe6a9b149" exitCode=1 Apr 16 19:31:45.231900 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:45.231882 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx" event={"ID":"a814b853-738d-492e-ae2e-b2b124ec8caa","Type":"ContainerDied","Data":"e2caed2ef0f9a4281761560957d11601e30bd71efdd8be1f089ad4cfe6a9b149"} Apr 16 19:31:45.231935 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:45.231925 2578 scope.go:117] "RemoveContainer" 
containerID="7fb97fb90dd03825e46809920e73172eeaf68d5b85c4d7d986edc05326e3fc38" Apr 16 19:31:45.232301 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:45.232266 2578 scope.go:117] "RemoveContainer" containerID="7fb97fb90dd03825e46809920e73172eeaf68d5b85c4d7d986edc05326e3fc38" Apr 16 19:31:45.242469 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:31:45.242443 2578 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx_kserve-ci-e2e-test_a814b853-738d-492e-ae2e-b2b124ec8caa_0 in pod sandbox c575137c66f1b2f9488db8607271d4bb55122d19767b7f5353dd2d8f992d303f from index: no such id: '7fb97fb90dd03825e46809920e73172eeaf68d5b85c4d7d986edc05326e3fc38'" containerID="7fb97fb90dd03825e46809920e73172eeaf68d5b85c4d7d986edc05326e3fc38" Apr 16 19:31:45.242546 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:31:45.242485 2578 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx_kserve-ci-e2e-test_a814b853-738d-492e-ae2e-b2b124ec8caa_0 in pod sandbox c575137c66f1b2f9488db8607271d4bb55122d19767b7f5353dd2d8f992d303f from index: no such id: '7fb97fb90dd03825e46809920e73172eeaf68d5b85c4d7d986edc05326e3fc38'; Skipping pod \"isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx_kserve-ci-e2e-test(a814b853-738d-492e-ae2e-b2b124ec8caa)\"" logger="UnhandledError" Apr 16 19:31:45.243828 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:31:45.243807 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer 
pod=isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx_kserve-ci-e2e-test(a814b853-738d-492e-ae2e-b2b124ec8caa)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx" podUID="a814b853-738d-492e-ae2e-b2b124ec8caa" Apr 16 19:31:46.236031 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:46.236005 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx_a814b853-738d-492e-ae2e-b2b124ec8caa/storage-initializer/1.log" Apr 16 19:31:46.865952 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:46.865920 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx"] Apr 16 19:31:46.993814 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:46.993792 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx_a814b853-738d-492e-ae2e-b2b124ec8caa/storage-initializer/1.log" Apr 16 19:31:46.993944 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:46.993863 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx" Apr 16 19:31:47.126531 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.126453 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a814b853-738d-492e-ae2e-b2b124ec8caa-kserve-provision-location\") pod \"a814b853-738d-492e-ae2e-b2b124ec8caa\" (UID: \"a814b853-738d-492e-ae2e-b2b124ec8caa\") " Apr 16 19:31:47.126716 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.126693 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a814b853-738d-492e-ae2e-b2b124ec8caa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a814b853-738d-492e-ae2e-b2b124ec8caa" (UID: "a814b853-738d-492e-ae2e-b2b124ec8caa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:31:47.227895 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.227868 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a814b853-738d-492e-ae2e-b2b124ec8caa-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:31:47.240399 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.240377 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx_a814b853-738d-492e-ae2e-b2b124ec8caa/storage-initializer/1.log" Apr 16 19:31:47.240761 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.240479 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx" Apr 16 19:31:47.240761 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.240487 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx" event={"ID":"a814b853-738d-492e-ae2e-b2b124ec8caa","Type":"ContainerDied","Data":"c575137c66f1b2f9488db8607271d4bb55122d19767b7f5353dd2d8f992d303f"} Apr 16 19:31:47.240761 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.240520 2578 scope.go:117] "RemoveContainer" containerID="e2caed2ef0f9a4281761560957d11601e30bd71efdd8be1f089ad4cfe6a9b149" Apr 16 19:31:47.272443 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.272419 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx"] Apr 16 19:31:47.276037 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.276014 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-xk7rx"] Apr 16 19:31:47.841603 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.841563 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a814b853-738d-492e-ae2e-b2b124ec8caa" path="/var/lib/kubelet/pods/a814b853-738d-492e-ae2e-b2b124ec8caa/volumes" Apr 16 19:31:47.918537 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.918507 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8"] Apr 16 19:31:47.918850 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.918835 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a814b853-738d-492e-ae2e-b2b124ec8caa" containerName="storage-initializer" Apr 16 19:31:47.918894 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.918853 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a814b853-738d-492e-ae2e-b2b124ec8caa" containerName="storage-initializer" Apr 16 19:31:47.918894 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.918862 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a814b853-738d-492e-ae2e-b2b124ec8caa" containerName="storage-initializer" Apr 16 19:31:47.918894 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.918867 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a814b853-738d-492e-ae2e-b2b124ec8caa" containerName="storage-initializer" Apr 16 19:31:47.918894 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.918876 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" containerName="storage-initializer" Apr 16 19:31:47.918894 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.918882 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" containerName="storage-initializer" Apr 16 19:31:47.919060 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.918902 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" containerName="kserve-container" Apr 16 19:31:47.919060 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.918907 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" containerName="kserve-container" Apr 16 19:31:47.919060 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.918964 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a814b853-738d-492e-ae2e-b2b124ec8caa" containerName="storage-initializer" Apr 16 19:31:47.919060 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.918975 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7794452e-2cc1-4a14-80b6-8abfcfe5d349" containerName="kserve-container" Apr 16 19:31:47.919185 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.919081 2578 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="a814b853-738d-492e-ae2e-b2b124ec8caa" containerName="storage-initializer" Apr 16 19:31:47.923277 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.923259 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" Apr 16 19:31:47.925882 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.925862 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z66mq\"" Apr 16 19:31:47.925993 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.925861 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 16 19:31:47.926957 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.926941 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 19:31:47.930027 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:47.930006 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8"] Apr 16 19:31:48.034280 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:48.034245 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a67e27c8-2144-4f81-9e0f-c257809f6958-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8\" (UID: \"a67e27c8-2144-4f81-9e0f-c257809f6958\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" Apr 16 19:31:48.034446 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:48.034299 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/a67e27c8-2144-4f81-9e0f-c257809f6958-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8\" (UID: \"a67e27c8-2144-4f81-9e0f-c257809f6958\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" Apr 16 19:31:48.134933 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:48.134836 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a67e27c8-2144-4f81-9e0f-c257809f6958-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8\" (UID: \"a67e27c8-2144-4f81-9e0f-c257809f6958\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" Apr 16 19:31:48.134933 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:48.134916 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a67e27c8-2144-4f81-9e0f-c257809f6958-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8\" (UID: \"a67e27c8-2144-4f81-9e0f-c257809f6958\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" Apr 16 19:31:48.135327 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:48.135303 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a67e27c8-2144-4f81-9e0f-c257809f6958-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8\" (UID: \"a67e27c8-2144-4f81-9e0f-c257809f6958\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" Apr 16 19:31:48.135540 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:48.135522 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a67e27c8-2144-4f81-9e0f-c257809f6958-cabundle-cert\") 
pod \"isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8\" (UID: \"a67e27c8-2144-4f81-9e0f-c257809f6958\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" Apr 16 19:31:48.234232 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:48.234199 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" Apr 16 19:31:48.355057 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:48.355031 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8"] Apr 16 19:31:48.357917 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:31:48.357886 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda67e27c8_2144_4f81_9e0f_c257809f6958.slice/crio-9b7606a4716f3f0632686c76dca79a3cca0553c7ea734bbd14fa11a9d767ba75 WatchSource:0}: Error finding container 9b7606a4716f3f0632686c76dca79a3cca0553c7ea734bbd14fa11a9d767ba75: Status 404 returned error can't find the container with id 9b7606a4716f3f0632686c76dca79a3cca0553c7ea734bbd14fa11a9d767ba75 Apr 16 19:31:49.254991 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:49.254940 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" event={"ID":"a67e27c8-2144-4f81-9e0f-c257809f6958","Type":"ContainerStarted","Data":"dbd5aa4e92dbf4050c5a4d9009b37e8eaec5e6864c664bd31a2519122bd74166"} Apr 16 19:31:49.254991 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:49.254991 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" event={"ID":"a67e27c8-2144-4f81-9e0f-c257809f6958","Type":"ContainerStarted","Data":"9b7606a4716f3f0632686c76dca79a3cca0553c7ea734bbd14fa11a9d767ba75"} Apr 16 19:31:50.259579 ip-10-0-139-33 
kubenswrapper[2578]: I0416 19:31:50.259545 2578 generic.go:358] "Generic (PLEG): container finished" podID="a67e27c8-2144-4f81-9e0f-c257809f6958" containerID="dbd5aa4e92dbf4050c5a4d9009b37e8eaec5e6864c664bd31a2519122bd74166" exitCode=0 Apr 16 19:31:50.259999 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:50.259632 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" event={"ID":"a67e27c8-2144-4f81-9e0f-c257809f6958","Type":"ContainerDied","Data":"dbd5aa4e92dbf4050c5a4d9009b37e8eaec5e6864c664bd31a2519122bd74166"} Apr 16 19:31:51.265127 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:51.265093 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" event={"ID":"a67e27c8-2144-4f81-9e0f-c257809f6958","Type":"ContainerStarted","Data":"83c04c7b2430f8feba1d7405ea5be55cd83d658887d714ef2f6b07e243e22888"} Apr 16 19:31:51.265698 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:51.265242 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" Apr 16 19:31:51.266574 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:51.266551 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" podUID="a67e27c8-2144-4f81-9e0f-c257809f6958" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 16 19:31:51.282585 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:51.282544 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" podStartSLOduration=4.28253239 podStartE2EDuration="4.28253239s" podCreationTimestamp="2026-04-16 19:31:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:31:51.280288522 +0000 UTC m=+3688.067624778" watchObservedRunningTime="2026-04-16 19:31:51.28253239 +0000 UTC m=+3688.069868646" Apr 16 19:31:52.268645 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:31:52.268602 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" podUID="a67e27c8-2144-4f81-9e0f-c257809f6958" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 16 19:32:02.268585 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:32:02.268542 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" podUID="a67e27c8-2144-4f81-9e0f-c257809f6958" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 16 19:32:12.269356 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:32:12.269315 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" podUID="a67e27c8-2144-4f81-9e0f-c257809f6958" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 16 19:32:22.269446 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:32:22.269396 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" podUID="a67e27c8-2144-4f81-9e0f-c257809f6958" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 16 19:32:32.269117 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:32:32.269075 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" 
podUID="a67e27c8-2144-4f81-9e0f-c257809f6958" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 16 19:32:42.268964 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:32:42.268920 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" podUID="a67e27c8-2144-4f81-9e0f-c257809f6958" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 16 19:32:52.268702 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:32:52.268662 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" podUID="a67e27c8-2144-4f81-9e0f-c257809f6958" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 16 19:33:02.270168 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:02.270136 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" Apr 16 19:33:07.988079 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:07.988039 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8"] Apr 16 19:33:07.988466 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:07.988401 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" podUID="a67e27c8-2144-4f81-9e0f-c257809f6958" containerName="kserve-container" containerID="cri-o://83c04c7b2430f8feba1d7405ea5be55cd83d658887d714ef2f6b07e243e22888" gracePeriod=30 Apr 16 19:33:09.084441 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:09.084399 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9"] Apr 16 19:33:09.088043 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:09.088025 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9" Apr 16 19:33:09.094525 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:09.094495 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9"] Apr 16 19:33:09.212070 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:09.212032 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29d52757-0ea6-4733-9053-00e5ff0d9a38-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9\" (UID: \"29d52757-0ea6-4733-9053-00e5ff0d9a38\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9" Apr 16 19:33:09.313463 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:09.313425 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29d52757-0ea6-4733-9053-00e5ff0d9a38-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9\" (UID: \"29d52757-0ea6-4733-9053-00e5ff0d9a38\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9" Apr 16 19:33:09.313924 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:09.313903 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29d52757-0ea6-4733-9053-00e5ff0d9a38-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9\" (UID: \"29d52757-0ea6-4733-9053-00e5ff0d9a38\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9" Apr 16 19:33:09.399317 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:09.399218 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9" Apr 16 19:33:09.518065 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:09.518038 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9"] Apr 16 19:33:09.520657 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:33:09.520615 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29d52757_0ea6_4733_9053_00e5ff0d9a38.slice/crio-fc2941428bc3ffffb6f0d308bdd1add4fdb8b8b48f43418ec48d49e9b5667244 WatchSource:0}: Error finding container fc2941428bc3ffffb6f0d308bdd1add4fdb8b8b48f43418ec48d49e9b5667244: Status 404 returned error can't find the container with id fc2941428bc3ffffb6f0d308bdd1add4fdb8b8b48f43418ec48d49e9b5667244 Apr 16 19:33:10.523402 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:10.523323 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9" event={"ID":"29d52757-0ea6-4733-9053-00e5ff0d9a38","Type":"ContainerStarted","Data":"a93dc3dae8ac058645834634a3b630c8ae7c105c9b556ce6db7acc5e806e6800"} Apr 16 19:33:10.523402 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:10.523361 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9" event={"ID":"29d52757-0ea6-4733-9053-00e5ff0d9a38","Type":"ContainerStarted","Data":"fc2941428bc3ffffb6f0d308bdd1add4fdb8b8b48f43418ec48d49e9b5667244"} Apr 16 19:33:12.333435 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.333413 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" Apr 16 19:33:12.438539 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.438458 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a67e27c8-2144-4f81-9e0f-c257809f6958-cabundle-cert\") pod \"a67e27c8-2144-4f81-9e0f-c257809f6958\" (UID: \"a67e27c8-2144-4f81-9e0f-c257809f6958\") " Apr 16 19:33:12.438539 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.438496 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a67e27c8-2144-4f81-9e0f-c257809f6958-kserve-provision-location\") pod \"a67e27c8-2144-4f81-9e0f-c257809f6958\" (UID: \"a67e27c8-2144-4f81-9e0f-c257809f6958\") " Apr 16 19:33:12.438884 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.438858 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a67e27c8-2144-4f81-9e0f-c257809f6958-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a67e27c8-2144-4f81-9e0f-c257809f6958" (UID: "a67e27c8-2144-4f81-9e0f-c257809f6958"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:33:12.438922 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.438882 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a67e27c8-2144-4f81-9e0f-c257809f6958-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "a67e27c8-2144-4f81-9e0f-c257809f6958" (UID: "a67e27c8-2144-4f81-9e0f-c257809f6958"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:33:12.530625 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.530590 2578 generic.go:358] "Generic (PLEG): container finished" podID="a67e27c8-2144-4f81-9e0f-c257809f6958" containerID="83c04c7b2430f8feba1d7405ea5be55cd83d658887d714ef2f6b07e243e22888" exitCode=0 Apr 16 19:33:12.530801 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.530671 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" Apr 16 19:33:12.530801 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.530674 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" event={"ID":"a67e27c8-2144-4f81-9e0f-c257809f6958","Type":"ContainerDied","Data":"83c04c7b2430f8feba1d7405ea5be55cd83d658887d714ef2f6b07e243e22888"} Apr 16 19:33:12.530801 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.530708 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" event={"ID":"a67e27c8-2144-4f81-9e0f-c257809f6958","Type":"ContainerDied","Data":"9b7606a4716f3f0632686c76dca79a3cca0553c7ea734bbd14fa11a9d767ba75"} Apr 16 19:33:12.530801 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.530725 2578 scope.go:117] "RemoveContainer" containerID="83c04c7b2430f8feba1d7405ea5be55cd83d658887d714ef2f6b07e243e22888" Apr 16 19:33:12.538981 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.538965 2578 scope.go:117] "RemoveContainer" containerID="dbd5aa4e92dbf4050c5a4d9009b37e8eaec5e6864c664bd31a2519122bd74166" Apr 16 19:33:12.539050 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.539010 2578 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a67e27c8-2144-4f81-9e0f-c257809f6958-cabundle-cert\") on node \"ip-10-0-139-33.ec2.internal\" 
DevicePath \"\"" Apr 16 19:33:12.539050 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.539026 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a67e27c8-2144-4f81-9e0f-c257809f6958-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:33:12.546105 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.546088 2578 scope.go:117] "RemoveContainer" containerID="83c04c7b2430f8feba1d7405ea5be55cd83d658887d714ef2f6b07e243e22888" Apr 16 19:33:12.546382 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:33:12.546358 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83c04c7b2430f8feba1d7405ea5be55cd83d658887d714ef2f6b07e243e22888\": container with ID starting with 83c04c7b2430f8feba1d7405ea5be55cd83d658887d714ef2f6b07e243e22888 not found: ID does not exist" containerID="83c04c7b2430f8feba1d7405ea5be55cd83d658887d714ef2f6b07e243e22888" Apr 16 19:33:12.546452 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.546389 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83c04c7b2430f8feba1d7405ea5be55cd83d658887d714ef2f6b07e243e22888"} err="failed to get container status \"83c04c7b2430f8feba1d7405ea5be55cd83d658887d714ef2f6b07e243e22888\": rpc error: code = NotFound desc = could not find container \"83c04c7b2430f8feba1d7405ea5be55cd83d658887d714ef2f6b07e243e22888\": container with ID starting with 83c04c7b2430f8feba1d7405ea5be55cd83d658887d714ef2f6b07e243e22888 not found: ID does not exist" Apr 16 19:33:12.546452 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.546408 2578 scope.go:117] "RemoveContainer" containerID="dbd5aa4e92dbf4050c5a4d9009b37e8eaec5e6864c664bd31a2519122bd74166" Apr 16 19:33:12.546644 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:33:12.546628 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"dbd5aa4e92dbf4050c5a4d9009b37e8eaec5e6864c664bd31a2519122bd74166\": container with ID starting with dbd5aa4e92dbf4050c5a4d9009b37e8eaec5e6864c664bd31a2519122bd74166 not found: ID does not exist" containerID="dbd5aa4e92dbf4050c5a4d9009b37e8eaec5e6864c664bd31a2519122bd74166" Apr 16 19:33:12.546688 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.546648 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbd5aa4e92dbf4050c5a4d9009b37e8eaec5e6864c664bd31a2519122bd74166"} err="failed to get container status \"dbd5aa4e92dbf4050c5a4d9009b37e8eaec5e6864c664bd31a2519122bd74166\": rpc error: code = NotFound desc = could not find container \"dbd5aa4e92dbf4050c5a4d9009b37e8eaec5e6864c664bd31a2519122bd74166\": container with ID starting with dbd5aa4e92dbf4050c5a4d9009b37e8eaec5e6864c664bd31a2519122bd74166 not found: ID does not exist" Apr 16 19:33:12.550210 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.550172 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8"] Apr 16 19:33:12.553294 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:12.553273 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8"] Apr 16 19:33:13.269825 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:13.269787 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-675b8" podUID="a67e27c8-2144-4f81-9e0f-c257809f6958" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: i/o timeout" Apr 16 19:33:13.535688 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:13.535620 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9_29d52757-0ea6-4733-9053-00e5ff0d9a38/storage-initializer/0.log" Apr 16 19:33:13.535688 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:13.535656 2578 generic.go:358] "Generic (PLEG): container finished" podID="29d52757-0ea6-4733-9053-00e5ff0d9a38" containerID="a93dc3dae8ac058645834634a3b630c8ae7c105c9b556ce6db7acc5e806e6800" exitCode=1 Apr 16 19:33:13.536268 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:13.535734 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9" event={"ID":"29d52757-0ea6-4733-9053-00e5ff0d9a38","Type":"ContainerDied","Data":"a93dc3dae8ac058645834634a3b630c8ae7c105c9b556ce6db7acc5e806e6800"} Apr 16 19:33:13.841864 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:13.841775 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a67e27c8-2144-4f81-9e0f-c257809f6958" path="/var/lib/kubelet/pods/a67e27c8-2144-4f81-9e0f-c257809f6958/volumes" Apr 16 19:33:14.542075 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:14.542044 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9_29d52757-0ea6-4733-9053-00e5ff0d9a38/storage-initializer/0.log" Apr 16 19:33:14.542491 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:14.542169 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9" event={"ID":"29d52757-0ea6-4733-9053-00e5ff0d9a38","Type":"ContainerStarted","Data":"c86f00f4d3579cd51d31233622789070b2e833f47d8459d7c5f59342394e2cd1"} Apr 16 19:33:18.556296 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:18.556266 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9_29d52757-0ea6-4733-9053-00e5ff0d9a38/storage-initializer/1.log" Apr 16 19:33:18.556710 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:18.556570 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9_29d52757-0ea6-4733-9053-00e5ff0d9a38/storage-initializer/0.log" Apr 16 19:33:18.556710 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:18.556603 2578 generic.go:358] "Generic (PLEG): container finished" podID="29d52757-0ea6-4733-9053-00e5ff0d9a38" containerID="c86f00f4d3579cd51d31233622789070b2e833f47d8459d7c5f59342394e2cd1" exitCode=1 Apr 16 19:33:18.556710 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:18.556682 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9" event={"ID":"29d52757-0ea6-4733-9053-00e5ff0d9a38","Type":"ContainerDied","Data":"c86f00f4d3579cd51d31233622789070b2e833f47d8459d7c5f59342394e2cd1"} Apr 16 19:33:18.556806 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:18.556728 2578 scope.go:117] "RemoveContainer" containerID="a93dc3dae8ac058645834634a3b630c8ae7c105c9b556ce6db7acc5e806e6800" Apr 16 19:33:18.557051 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:18.557033 2578 scope.go:117] "RemoveContainer" containerID="a93dc3dae8ac058645834634a3b630c8ae7c105c9b556ce6db7acc5e806e6800" Apr 16 19:33:18.568059 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:33:18.568030 2578 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9_kserve-ci-e2e-test_29d52757-0ea6-4733-9053-00e5ff0d9a38_0 in pod sandbox fc2941428bc3ffffb6f0d308bdd1add4fdb8b8b48f43418ec48d49e9b5667244 from index: no such id: 
'a93dc3dae8ac058645834634a3b630c8ae7c105c9b556ce6db7acc5e806e6800'" containerID="a93dc3dae8ac058645834634a3b630c8ae7c105c9b556ce6db7acc5e806e6800" Apr 16 19:33:18.568235 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:33:18.568212 2578 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9_kserve-ci-e2e-test_29d52757-0ea6-4733-9053-00e5ff0d9a38_0 in pod sandbox fc2941428bc3ffffb6f0d308bdd1add4fdb8b8b48f43418ec48d49e9b5667244 from index: no such id: 'a93dc3dae8ac058645834634a3b630c8ae7c105c9b556ce6db7acc5e806e6800'; Skipping pod \"isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9_kserve-ci-e2e-test(29d52757-0ea6-4733-9053-00e5ff0d9a38)\"" logger="UnhandledError" Apr 16 19:33:18.569797 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:33:18.569767 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9_kserve-ci-e2e-test(29d52757-0ea6-4733-9053-00e5ff0d9a38)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9" podUID="29d52757-0ea6-4733-9053-00e5ff0d9a38" Apr 16 19:33:19.075732 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:19.075698 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9"] Apr 16 19:33:19.561146 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:19.561123 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9_29d52757-0ea6-4733-9053-00e5ff0d9a38/storage-initializer/1.log" Apr 16 19:33:19.685674 ip-10-0-139-33 kubenswrapper[2578]: I0416 
19:33:19.685655 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9_29d52757-0ea6-4733-9053-00e5ff0d9a38/storage-initializer/1.log" Apr 16 19:33:19.685792 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:19.685714 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9" Apr 16 19:33:19.797135 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:19.797106 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29d52757-0ea6-4733-9053-00e5ff0d9a38-kserve-provision-location\") pod \"29d52757-0ea6-4733-9053-00e5ff0d9a38\" (UID: \"29d52757-0ea6-4733-9053-00e5ff0d9a38\") " Apr 16 19:33:19.797458 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:19.797429 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29d52757-0ea6-4733-9053-00e5ff0d9a38-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "29d52757-0ea6-4733-9053-00e5ff0d9a38" (UID: "29d52757-0ea6-4733-9053-00e5ff0d9a38"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:33:19.898164 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:19.898137 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29d52757-0ea6-4733-9053-00e5ff0d9a38-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:33:20.132806 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.132728 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh"] Apr 16 19:33:20.133055 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.133043 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29d52757-0ea6-4733-9053-00e5ff0d9a38" containerName="storage-initializer" Apr 16 19:33:20.133098 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.133057 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d52757-0ea6-4733-9053-00e5ff0d9a38" containerName="storage-initializer" Apr 16 19:33:20.133098 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.133075 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a67e27c8-2144-4f81-9e0f-c257809f6958" containerName="storage-initializer" Apr 16 19:33:20.133098 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.133081 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a67e27c8-2144-4f81-9e0f-c257809f6958" containerName="storage-initializer" Apr 16 19:33:20.133098 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.133087 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a67e27c8-2144-4f81-9e0f-c257809f6958" containerName="kserve-container" Apr 16 19:33:20.133098 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.133094 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a67e27c8-2144-4f81-9e0f-c257809f6958" containerName="kserve-container" Apr 16 19:33:20.133279 
ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.133145 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a67e27c8-2144-4f81-9e0f-c257809f6958" containerName="kserve-container" Apr 16 19:33:20.133279 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.133157 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="29d52757-0ea6-4733-9053-00e5ff0d9a38" containerName="storage-initializer" Apr 16 19:33:20.133279 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.133164 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="29d52757-0ea6-4733-9053-00e5ff0d9a38" containerName="storage-initializer" Apr 16 19:33:20.133279 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.133239 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29d52757-0ea6-4733-9053-00e5ff0d9a38" containerName="storage-initializer" Apr 16 19:33:20.133279 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.133245 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d52757-0ea6-4733-9053-00e5ff0d9a38" containerName="storage-initializer" Apr 16 19:33:20.137440 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.137422 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" Apr 16 19:33:20.139716 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.139696 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 19:33:20.143704 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.143684 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh"] Apr 16 19:33:20.302165 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.302138 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c41ae15b-65ee-4c1a-8aaf-2b799e237ce4-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh\" (UID: \"c41ae15b-65ee-4c1a-8aaf-2b799e237ce4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" Apr 16 19:33:20.302322 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.302214 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c41ae15b-65ee-4c1a-8aaf-2b799e237ce4-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh\" (UID: \"c41ae15b-65ee-4c1a-8aaf-2b799e237ce4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" Apr 16 19:33:20.403265 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.403190 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c41ae15b-65ee-4c1a-8aaf-2b799e237ce4-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh\" (UID: \"c41ae15b-65ee-4c1a-8aaf-2b799e237ce4\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" Apr 16 19:33:20.403265 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.403251 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c41ae15b-65ee-4c1a-8aaf-2b799e237ce4-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh\" (UID: \"c41ae15b-65ee-4c1a-8aaf-2b799e237ce4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" Apr 16 19:33:20.403564 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.403549 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c41ae15b-65ee-4c1a-8aaf-2b799e237ce4-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh\" (UID: \"c41ae15b-65ee-4c1a-8aaf-2b799e237ce4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" Apr 16 19:33:20.403737 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.403721 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c41ae15b-65ee-4c1a-8aaf-2b799e237ce4-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh\" (UID: \"c41ae15b-65ee-4c1a-8aaf-2b799e237ce4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" Apr 16 19:33:20.448885 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.448864 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" Apr 16 19:33:20.572300 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.569013 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9_29d52757-0ea6-4733-9053-00e5ff0d9a38/storage-initializer/1.log" Apr 16 19:33:20.572300 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.569145 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9" event={"ID":"29d52757-0ea6-4733-9053-00e5ff0d9a38","Type":"ContainerDied","Data":"fc2941428bc3ffffb6f0d308bdd1add4fdb8b8b48f43418ec48d49e9b5667244"} Apr 16 19:33:20.572300 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.569215 2578 scope.go:117] "RemoveContainer" containerID="c86f00f4d3579cd51d31233622789070b2e833f47d8459d7c5f59342394e2cd1" Apr 16 19:33:20.572300 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.569426 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9" Apr 16 19:33:20.572300 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.570576 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh"] Apr 16 19:33:20.577208 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:33:20.577163 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc41ae15b_65ee_4c1a_8aaf_2b799e237ce4.slice/crio-14248394bc2e1a09e2b0b0491cb2a7049d3eda02795de5989c42c7e8b550897d WatchSource:0}: Error finding container 14248394bc2e1a09e2b0b0491cb2a7049d3eda02795de5989c42c7e8b550897d: Status 404 returned error can't find the container with id 14248394bc2e1a09e2b0b0491cb2a7049d3eda02795de5989c42c7e8b550897d Apr 16 19:33:20.597655 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.597633 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9"] Apr 16 19:33:20.601805 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:20.601781 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-zr6k9"] Apr 16 19:33:21.574109 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:21.574080 2578 generic.go:358] "Generic (PLEG): container finished" podID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerID="2aa2b8e20e40d763726a2ceedf9386c5025bbb2d382534dd91975f25d388a62c" exitCode=0 Apr 16 19:33:21.574455 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:21.574130 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" event={"ID":"c41ae15b-65ee-4c1a-8aaf-2b799e237ce4","Type":"ContainerDied","Data":"2aa2b8e20e40d763726a2ceedf9386c5025bbb2d382534dd91975f25d388a62c"} Apr 16 19:33:21.574455 
ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:21.574153 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" event={"ID":"c41ae15b-65ee-4c1a-8aaf-2b799e237ce4","Type":"ContainerStarted","Data":"14248394bc2e1a09e2b0b0491cb2a7049d3eda02795de5989c42c7e8b550897d"} Apr 16 19:33:21.842631 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:21.842546 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29d52757-0ea6-4733-9053-00e5ff0d9a38" path="/var/lib/kubelet/pods/29d52757-0ea6-4733-9053-00e5ff0d9a38/volumes" Apr 16 19:33:22.579318 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:22.579281 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" event={"ID":"c41ae15b-65ee-4c1a-8aaf-2b799e237ce4","Type":"ContainerStarted","Data":"4ccf54191978010a668314a14ff84451c32348b0804bcdf80ed87dfecbec6b87"} Apr 16 19:33:22.579737 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:22.579590 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" Apr 16 19:33:22.580876 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:22.580849 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" podUID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 16 19:33:22.596719 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:22.596676 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" podStartSLOduration=2.596664278 podStartE2EDuration="2.596664278s" podCreationTimestamp="2026-04-16 19:33:20 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:33:22.594610458 +0000 UTC m=+3779.381946725" watchObservedRunningTime="2026-04-16 19:33:22.596664278 +0000 UTC m=+3779.384000533" Apr 16 19:33:23.583160 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:23.583123 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" podUID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 16 19:33:33.583399 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:33.583355 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" podUID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 16 19:33:43.583436 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:43.583386 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" podUID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 16 19:33:53.584060 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:33:53.584017 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" podUID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 16 19:34:03.584127 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:03.584073 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" podUID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 16 19:34:13.583848 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:13.583807 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" podUID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 16 19:34:23.583922 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:23.583879 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" podUID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 16 19:34:23.839644 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:23.839542 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" podUID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 16 19:34:33.840567 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:33.840538 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" Apr 16 19:34:40.192036 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:40.191999 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh"] Apr 16 19:34:40.192545 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:40.192368 2578 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" podUID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerName="kserve-container" containerID="cri-o://4ccf54191978010a668314a14ff84451c32348b0804bcdf80ed87dfecbec6b87" gracePeriod=30 Apr 16 19:34:41.244213 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:41.244165 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg"] Apr 16 19:34:41.247568 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:41.247547 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg" Apr 16 19:34:41.255570 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:41.255548 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg"] Apr 16 19:34:41.334817 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:41.334779 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f47b1a7d-6a88-4365-92a7-924d6bd9145a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg\" (UID: \"f47b1a7d-6a88-4365-92a7-924d6bd9145a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg" Apr 16 19:34:41.435955 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:41.435917 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f47b1a7d-6a88-4365-92a7-924d6bd9145a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg\" (UID: \"f47b1a7d-6a88-4365-92a7-924d6bd9145a\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg" Apr 16 19:34:41.436364 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:41.436342 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f47b1a7d-6a88-4365-92a7-924d6bd9145a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg\" (UID: \"f47b1a7d-6a88-4365-92a7-924d6bd9145a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg" Apr 16 19:34:41.558090 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:41.558008 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg" Apr 16 19:34:41.676898 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:41.676866 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg"] Apr 16 19:34:41.678608 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:34:41.678572 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf47b1a7d_6a88_4365_92a7_924d6bd9145a.slice/crio-bb5298637a94fc8b85c5e758002484eb2cf55ce2d700a02da48c18fd20d6ae8e WatchSource:0}: Error finding container bb5298637a94fc8b85c5e758002484eb2cf55ce2d700a02da48c18fd20d6ae8e: Status 404 returned error can't find the container with id bb5298637a94fc8b85c5e758002484eb2cf55ce2d700a02da48c18fd20d6ae8e Apr 16 19:34:41.844545 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:41.844456 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg" event={"ID":"f47b1a7d-6a88-4365-92a7-924d6bd9145a","Type":"ContainerStarted","Data":"92fda828b1ce2fc3495fb9ad7f1fa99d45e40529e0d1b772578b1e4d7de917d5"} Apr 16 19:34:41.844545 
ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:41.844503 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg" event={"ID":"f47b1a7d-6a88-4365-92a7-924d6bd9145a","Type":"ContainerStarted","Data":"bb5298637a94fc8b85c5e758002484eb2cf55ce2d700a02da48c18fd20d6ae8e"} Apr 16 19:34:43.839712 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:43.839671 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" podUID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 16 19:34:44.338048 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.338020 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" Apr 16 19:34:44.356517 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.356478 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c41ae15b-65ee-4c1a-8aaf-2b799e237ce4-cabundle-cert\") pod \"c41ae15b-65ee-4c1a-8aaf-2b799e237ce4\" (UID: \"c41ae15b-65ee-4c1a-8aaf-2b799e237ce4\") " Apr 16 19:34:44.356675 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.356580 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c41ae15b-65ee-4c1a-8aaf-2b799e237ce4-kserve-provision-location\") pod \"c41ae15b-65ee-4c1a-8aaf-2b799e237ce4\" (UID: \"c41ae15b-65ee-4c1a-8aaf-2b799e237ce4\") " Apr 16 19:34:44.356877 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.356854 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c41ae15b-65ee-4c1a-8aaf-2b799e237ce4-cabundle-cert" 
(OuterVolumeSpecName: "cabundle-cert") pod "c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" (UID: "c41ae15b-65ee-4c1a-8aaf-2b799e237ce4"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:34:44.356963 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.356883 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41ae15b-65ee-4c1a-8aaf-2b799e237ce4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" (UID: "c41ae15b-65ee-4c1a-8aaf-2b799e237ce4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:34:44.457941 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.457835 2578 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c41ae15b-65ee-4c1a-8aaf-2b799e237ce4-cabundle-cert\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:34:44.457941 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.457883 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c41ae15b-65ee-4c1a-8aaf-2b799e237ce4-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:34:44.851460 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.851427 2578 generic.go:358] "Generic (PLEG): container finished" podID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerID="4ccf54191978010a668314a14ff84451c32348b0804bcdf80ed87dfecbec6b87" exitCode=0 Apr 16 19:34:44.851948 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.851495 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" Apr 16 19:34:44.851948 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.851510 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" event={"ID":"c41ae15b-65ee-4c1a-8aaf-2b799e237ce4","Type":"ContainerDied","Data":"4ccf54191978010a668314a14ff84451c32348b0804bcdf80ed87dfecbec6b87"} Apr 16 19:34:44.851948 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.851547 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh" event={"ID":"c41ae15b-65ee-4c1a-8aaf-2b799e237ce4","Type":"ContainerDied","Data":"14248394bc2e1a09e2b0b0491cb2a7049d3eda02795de5989c42c7e8b550897d"} Apr 16 19:34:44.851948 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.851562 2578 scope.go:117] "RemoveContainer" containerID="4ccf54191978010a668314a14ff84451c32348b0804bcdf80ed87dfecbec6b87" Apr 16 19:34:44.859730 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.859709 2578 scope.go:117] "RemoveContainer" containerID="2aa2b8e20e40d763726a2ceedf9386c5025bbb2d382534dd91975f25d388a62c" Apr 16 19:34:44.866687 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.866670 2578 scope.go:117] "RemoveContainer" containerID="4ccf54191978010a668314a14ff84451c32348b0804bcdf80ed87dfecbec6b87" Apr 16 19:34:44.866905 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:34:44.866885 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ccf54191978010a668314a14ff84451c32348b0804bcdf80ed87dfecbec6b87\": container with ID starting with 4ccf54191978010a668314a14ff84451c32348b0804bcdf80ed87dfecbec6b87 not found: ID does not exist" containerID="4ccf54191978010a668314a14ff84451c32348b0804bcdf80ed87dfecbec6b87" Apr 16 19:34:44.866968 ip-10-0-139-33 kubenswrapper[2578]: I0416 
19:34:44.866913 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ccf54191978010a668314a14ff84451c32348b0804bcdf80ed87dfecbec6b87"} err="failed to get container status \"4ccf54191978010a668314a14ff84451c32348b0804bcdf80ed87dfecbec6b87\": rpc error: code = NotFound desc = could not find container \"4ccf54191978010a668314a14ff84451c32348b0804bcdf80ed87dfecbec6b87\": container with ID starting with 4ccf54191978010a668314a14ff84451c32348b0804bcdf80ed87dfecbec6b87 not found: ID does not exist" Apr 16 19:34:44.866968 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.866930 2578 scope.go:117] "RemoveContainer" containerID="2aa2b8e20e40d763726a2ceedf9386c5025bbb2d382534dd91975f25d388a62c" Apr 16 19:34:44.867146 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:34:44.867131 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aa2b8e20e40d763726a2ceedf9386c5025bbb2d382534dd91975f25d388a62c\": container with ID starting with 2aa2b8e20e40d763726a2ceedf9386c5025bbb2d382534dd91975f25d388a62c not found: ID does not exist" containerID="2aa2b8e20e40d763726a2ceedf9386c5025bbb2d382534dd91975f25d388a62c" Apr 16 19:34:44.867202 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.867151 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa2b8e20e40d763726a2ceedf9386c5025bbb2d382534dd91975f25d388a62c"} err="failed to get container status \"2aa2b8e20e40d763726a2ceedf9386c5025bbb2d382534dd91975f25d388a62c\": rpc error: code = NotFound desc = could not find container \"2aa2b8e20e40d763726a2ceedf9386c5025bbb2d382534dd91975f25d388a62c\": container with ID starting with 2aa2b8e20e40d763726a2ceedf9386c5025bbb2d382534dd91975f25d388a62c not found: ID does not exist" Apr 16 19:34:44.872927 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.872898 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh"] Apr 16 19:34:44.875736 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:44.875716 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-2wqkh"] Apr 16 19:34:45.842168 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:45.842135 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" path="/var/lib/kubelet/pods/c41ae15b-65ee-4c1a-8aaf-2b799e237ce4/volumes" Apr 16 19:34:45.855370 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:45.855344 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg_f47b1a7d-6a88-4365-92a7-924d6bd9145a/storage-initializer/0.log" Apr 16 19:34:45.855771 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:45.855383 2578 generic.go:358] "Generic (PLEG): container finished" podID="f47b1a7d-6a88-4365-92a7-924d6bd9145a" containerID="92fda828b1ce2fc3495fb9ad7f1fa99d45e40529e0d1b772578b1e4d7de917d5" exitCode=1 Apr 16 19:34:45.855771 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:45.855456 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg" event={"ID":"f47b1a7d-6a88-4365-92a7-924d6bd9145a","Type":"ContainerDied","Data":"92fda828b1ce2fc3495fb9ad7f1fa99d45e40529e0d1b772578b1e4d7de917d5"} Apr 16 19:34:46.866254 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:46.866224 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg_f47b1a7d-6a88-4365-92a7-924d6bd9145a/storage-initializer/0.log" Apr 16 19:34:46.866626 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:46.866287 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg" event={"ID":"f47b1a7d-6a88-4365-92a7-924d6bd9145a","Type":"ContainerStarted","Data":"4bf44c097c79c6af9e9d8239d68f757bddee0019d280072ca7f024d5a1ba429a"} Apr 16 19:34:47.869995 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:47.869967 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg_f47b1a7d-6a88-4365-92a7-924d6bd9145a/storage-initializer/1.log" Apr 16 19:34:47.870411 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:47.870308 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg_f47b1a7d-6a88-4365-92a7-924d6bd9145a/storage-initializer/0.log" Apr 16 19:34:47.870411 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:47.870337 2578 generic.go:358] "Generic (PLEG): container finished" podID="f47b1a7d-6a88-4365-92a7-924d6bd9145a" containerID="4bf44c097c79c6af9e9d8239d68f757bddee0019d280072ca7f024d5a1ba429a" exitCode=1 Apr 16 19:34:47.870411 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:47.870407 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg" event={"ID":"f47b1a7d-6a88-4365-92a7-924d6bd9145a","Type":"ContainerDied","Data":"4bf44c097c79c6af9e9d8239d68f757bddee0019d280072ca7f024d5a1ba429a"} Apr 16 19:34:47.870520 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:47.870443 2578 scope.go:117] "RemoveContainer" containerID="92fda828b1ce2fc3495fb9ad7f1fa99d45e40529e0d1b772578b1e4d7de917d5" Apr 16 19:34:47.870785 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:47.870763 2578 scope.go:117] "RemoveContainer" containerID="92fda828b1ce2fc3495fb9ad7f1fa99d45e40529e0d1b772578b1e4d7de917d5" Apr 16 19:34:47.880984 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:34:47.880941 2578 log.go:32] "RemoveContainer from runtime service failed" 
err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg_kserve-ci-e2e-test_f47b1a7d-6a88-4365-92a7-924d6bd9145a_0 in pod sandbox bb5298637a94fc8b85c5e758002484eb2cf55ce2d700a02da48c18fd20d6ae8e from index: no such id: '92fda828b1ce2fc3495fb9ad7f1fa99d45e40529e0d1b772578b1e4d7de917d5'" containerID="92fda828b1ce2fc3495fb9ad7f1fa99d45e40529e0d1b772578b1e4d7de917d5" Apr 16 19:34:47.881056 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:34:47.881002 2578 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg_kserve-ci-e2e-test_f47b1a7d-6a88-4365-92a7-924d6bd9145a_0 in pod sandbox bb5298637a94fc8b85c5e758002484eb2cf55ce2d700a02da48c18fd20d6ae8e from index: no such id: '92fda828b1ce2fc3495fb9ad7f1fa99d45e40529e0d1b772578b1e4d7de917d5'; Skipping pod \"isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg_kserve-ci-e2e-test(f47b1a7d-6a88-4365-92a7-924d6bd9145a)\"" logger="UnhandledError" Apr 16 19:34:47.882385 ip-10-0-139-33 kubenswrapper[2578]: E0416 19:34:47.882360 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg_kserve-ci-e2e-test(f47b1a7d-6a88-4365-92a7-924d6bd9145a)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg" podUID="f47b1a7d-6a88-4365-92a7-924d6bd9145a" Apr 16 19:34:48.874413 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:48.874386 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg_f47b1a7d-6a88-4365-92a7-924d6bd9145a/storage-initializer/1.log" Apr 16 19:34:51.261554 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:51.261520 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg"] Apr 16 19:34:51.396095 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:51.396071 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg_f47b1a7d-6a88-4365-92a7-924d6bd9145a/storage-initializer/1.log" Apr 16 19:34:51.396230 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:51.396131 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg" Apr 16 19:34:51.417992 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:51.417960 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f47b1a7d-6a88-4365-92a7-924d6bd9145a-kserve-provision-location\") pod \"f47b1a7d-6a88-4365-92a7-924d6bd9145a\" (UID: \"f47b1a7d-6a88-4365-92a7-924d6bd9145a\") " Apr 16 19:34:51.418353 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:51.418329 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f47b1a7d-6a88-4365-92a7-924d6bd9145a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f47b1a7d-6a88-4365-92a7-924d6bd9145a" (UID: "f47b1a7d-6a88-4365-92a7-924d6bd9145a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:34:51.519530 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:51.519455 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f47b1a7d-6a88-4365-92a7-924d6bd9145a-kserve-provision-location\") on node \"ip-10-0-139-33.ec2.internal\" DevicePath \"\"" Apr 16 19:34:51.884933 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:51.884907 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg_f47b1a7d-6a88-4365-92a7-924d6bd9145a/storage-initializer/1.log" Apr 16 19:34:51.885095 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:51.885019 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg" event={"ID":"f47b1a7d-6a88-4365-92a7-924d6bd9145a","Type":"ContainerDied","Data":"bb5298637a94fc8b85c5e758002484eb2cf55ce2d700a02da48c18fd20d6ae8e"} Apr 16 19:34:51.885095 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:51.885057 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg" Apr 16 19:34:51.885095 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:51.885074 2578 scope.go:117] "RemoveContainer" containerID="4bf44c097c79c6af9e9d8239d68f757bddee0019d280072ca7f024d5a1ba429a" Apr 16 19:34:51.910970 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:51.910945 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg"] Apr 16 19:34:51.915069 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:51.915042 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-r7dbg"] Apr 16 19:34:53.841801 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:34:53.841765 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f47b1a7d-6a88-4365-92a7-924d6bd9145a" path="/var/lib/kubelet/pods/f47b1a7d-6a88-4365-92a7-924d6bd9145a/volumes" Apr 16 19:35:22.247829 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:22.247802 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dgp56_38d2b4bc-50cc-4d31-b125-9af46f137f46/global-pull-secret-syncer/0.log" Apr 16 19:35:22.401361 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:22.401330 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-gcdwk_f4d6c00d-a887-4de5-87f8-5b4449359aa4/konnectivity-agent/0.log" Apr 16 19:35:22.489915 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:22.489878 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-33.ec2.internal_27e141a8b2c2991dacebb3be05ed01ab/haproxy/0.log" Apr 16 19:35:26.252188 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:26.252133 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-dfzr4_bdf8d744-507e-4943-8866-e60d7c582151/cluster-monitoring-operator/0.log" Apr 16 19:35:26.274811 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:26.274768 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-mcgzx_41a2f87e-530d-4207-8cc8-e7ee979357e0/kube-state-metrics/0.log" Apr 16 19:35:26.297487 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:26.297468 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-mcgzx_41a2f87e-530d-4207-8cc8-e7ee979357e0/kube-rbac-proxy-main/0.log" Apr 16 19:35:26.320220 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:26.320197 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-mcgzx_41a2f87e-530d-4207-8cc8-e7ee979357e0/kube-rbac-proxy-self/0.log" Apr 16 19:35:26.351730 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:26.351707 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-67b789b557-rnptf_bd1da6bd-aefd-4703-afb9-ec982489f130/metrics-server/0.log" Apr 16 19:35:26.379164 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:26.379143 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-4zlht_414da762-c744-460c-8983-4a538c9e63e7/monitoring-plugin/0.log" Apr 16 19:35:26.494229 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:26.494194 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b6czb_758ed149-3d81-4eec-bdac-e7eeca35aecc/node-exporter/0.log" Apr 16 19:35:26.515020 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:26.514950 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b6czb_758ed149-3d81-4eec-bdac-e7eeca35aecc/kube-rbac-proxy/0.log" Apr 16 19:35:26.537522 
ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:26.537498 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b6czb_758ed149-3d81-4eec-bdac-e7eeca35aecc/init-textfile/0.log" Apr 16 19:35:26.917721 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:26.917647 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-7sgp9_c1e239f4-fc22-47cc-bda3-343f85242b88/prometheus-operator/0.log" Apr 16 19:35:26.934577 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:26.934559 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-7sgp9_c1e239f4-fc22-47cc-bda3-343f85242b88/kube-rbac-proxy/0.log" Apr 16 19:35:26.963824 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:26.963798 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-w6jqv_c4040798-ae88-4f3b-abb7-2af899225127/prometheus-operator-admission-webhook/0.log" Apr 16 19:35:27.025387 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:27.025364 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d557589b4-w5v7x_6472b575-f118-402f-b723-1a8ec3fdae51/telemeter-client/0.log" Apr 16 19:35:27.059269 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:27.059245 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d557589b4-w5v7x_6472b575-f118-402f-b723-1a8ec3fdae51/reload/0.log" Apr 16 19:35:27.089185 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:27.089162 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d557589b4-w5v7x_6472b575-f118-402f-b723-1a8ec3fdae51/kube-rbac-proxy/0.log" Apr 16 19:35:29.144563 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.144532 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-7999ffd77-fmmjm_02bdcfc3-3fd9-47ff-ab10-d2223dfc8e74/console/0.log" Apr 16 19:35:29.489744 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.489713 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"] Apr 16 19:35:29.490053 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.490040 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerName="storage-initializer" Apr 16 19:35:29.490105 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.490054 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerName="storage-initializer" Apr 16 19:35:29.490105 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.490074 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f47b1a7d-6a88-4365-92a7-924d6bd9145a" containerName="storage-initializer" Apr 16 19:35:29.490105 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.490079 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f47b1a7d-6a88-4365-92a7-924d6bd9145a" containerName="storage-initializer" Apr 16 19:35:29.490105 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.490090 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerName="kserve-container" Apr 16 19:35:29.490105 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.490095 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerName="kserve-container" Apr 16 19:35:29.490278 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.490146 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f47b1a7d-6a88-4365-92a7-924d6bd9145a" containerName="storage-initializer" Apr 16 19:35:29.490278 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.490153 2578 
memory_manager.go:356] "RemoveStaleState removing state" podUID="f47b1a7d-6a88-4365-92a7-924d6bd9145a" containerName="storage-initializer"
Apr 16 19:35:29.490278 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.490161 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c41ae15b-65ee-4c1a-8aaf-2b799e237ce4" containerName="kserve-container"
Apr 16 19:35:29.494300 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.494281 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:29.497162 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.497133 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dncfj\"/\"openshift-service-ca.crt\""
Apr 16 19:35:29.497162 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.497161 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-dncfj\"/\"default-dockercfg-4wqkq\""
Apr 16 19:35:29.498349 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.498330 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dncfj\"/\"kube-root-ca.crt\""
Apr 16 19:35:29.503319 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.503297 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"]
Apr 16 19:35:29.625732 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.625705 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-gdpt2_305a7e75-9ebd-4072-9b0a-9eff1f2ca870/volume-data-source-validator/0.log"
Apr 16 19:35:29.634539 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.634508 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d8f4f747-36ee-49be-beb7-9fb48ee1dbf8-proc\") pod \"perf-node-gather-daemonset-4hzvg\" (UID: \"d8f4f747-36ee-49be-beb7-9fb48ee1dbf8\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:29.634708 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.634547 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8f4f747-36ee-49be-beb7-9fb48ee1dbf8-sys\") pod \"perf-node-gather-daemonset-4hzvg\" (UID: \"d8f4f747-36ee-49be-beb7-9fb48ee1dbf8\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:29.634708 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.634568 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d8f4f747-36ee-49be-beb7-9fb48ee1dbf8-podres\") pod \"perf-node-gather-daemonset-4hzvg\" (UID: \"d8f4f747-36ee-49be-beb7-9fb48ee1dbf8\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:29.634708 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.634664 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d8f4f747-36ee-49be-beb7-9fb48ee1dbf8-lib-modules\") pod \"perf-node-gather-daemonset-4hzvg\" (UID: \"d8f4f747-36ee-49be-beb7-9fb48ee1dbf8\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:29.634836 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.634714 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xldb7\" (UniqueName: \"kubernetes.io/projected/d8f4f747-36ee-49be-beb7-9fb48ee1dbf8-kube-api-access-xldb7\") pod \"perf-node-gather-daemonset-4hzvg\" (UID: \"d8f4f747-36ee-49be-beb7-9fb48ee1dbf8\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:29.735728 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.735690 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xldb7\" (UniqueName: \"kubernetes.io/projected/d8f4f747-36ee-49be-beb7-9fb48ee1dbf8-kube-api-access-xldb7\") pod \"perf-node-gather-daemonset-4hzvg\" (UID: \"d8f4f747-36ee-49be-beb7-9fb48ee1dbf8\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:29.735920 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.735750 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d8f4f747-36ee-49be-beb7-9fb48ee1dbf8-proc\") pod \"perf-node-gather-daemonset-4hzvg\" (UID: \"d8f4f747-36ee-49be-beb7-9fb48ee1dbf8\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:29.735920 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.735776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8f4f747-36ee-49be-beb7-9fb48ee1dbf8-sys\") pod \"perf-node-gather-daemonset-4hzvg\" (UID: \"d8f4f747-36ee-49be-beb7-9fb48ee1dbf8\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:29.735920 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.735795 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d8f4f747-36ee-49be-beb7-9fb48ee1dbf8-podres\") pod \"perf-node-gather-daemonset-4hzvg\" (UID: \"d8f4f747-36ee-49be-beb7-9fb48ee1dbf8\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:29.735920 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.735845 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d8f4f747-36ee-49be-beb7-9fb48ee1dbf8-lib-modules\") pod \"perf-node-gather-daemonset-4hzvg\" (UID: \"d8f4f747-36ee-49be-beb7-9fb48ee1dbf8\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:29.735920 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.735882 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d8f4f747-36ee-49be-beb7-9fb48ee1dbf8-proc\") pod \"perf-node-gather-daemonset-4hzvg\" (UID: \"d8f4f747-36ee-49be-beb7-9fb48ee1dbf8\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:29.735920 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.735890 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8f4f747-36ee-49be-beb7-9fb48ee1dbf8-sys\") pod \"perf-node-gather-daemonset-4hzvg\" (UID: \"d8f4f747-36ee-49be-beb7-9fb48ee1dbf8\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:29.736137 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.735947 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d8f4f747-36ee-49be-beb7-9fb48ee1dbf8-podres\") pod \"perf-node-gather-daemonset-4hzvg\" (UID: \"d8f4f747-36ee-49be-beb7-9fb48ee1dbf8\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:29.736137 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.735965 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d8f4f747-36ee-49be-beb7-9fb48ee1dbf8-lib-modules\") pod \"perf-node-gather-daemonset-4hzvg\" (UID: \"d8f4f747-36ee-49be-beb7-9fb48ee1dbf8\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:29.744031 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.743986 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xldb7\" (UniqueName: \"kubernetes.io/projected/d8f4f747-36ee-49be-beb7-9fb48ee1dbf8-kube-api-access-xldb7\") pod \"perf-node-gather-daemonset-4hzvg\" (UID: \"d8f4f747-36ee-49be-beb7-9fb48ee1dbf8\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:29.804774 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.804740 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:29.926066 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.926041 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"]
Apr 16 19:35:29.928484 ip-10-0-139-33 kubenswrapper[2578]: W0416 19:35:29.928449 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd8f4f747_36ee_49be_beb7_9fb48ee1dbf8.slice/crio-1138391921b81e06682a03e74b00c685edf9558a0a7ad59d3d9f7511d165193b WatchSource:0}: Error finding container 1138391921b81e06682a03e74b00c685edf9558a0a7ad59d3d9f7511d165193b: Status 404 returned error can't find the container with id 1138391921b81e06682a03e74b00c685edf9558a0a7ad59d3d9f7511d165193b
Apr 16 19:35:29.930324 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:29.930307 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:35:30.004560 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:30.004483 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg" event={"ID":"d8f4f747-36ee-49be-beb7-9fb48ee1dbf8","Type":"ContainerStarted","Data":"144be1373c32057724934a89cfc669ff8658d035827ef0d6b2d71e783eb88cf0"}
Apr 16 19:35:30.004560 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:30.004519 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg" event={"ID":"d8f4f747-36ee-49be-beb7-9fb48ee1dbf8","Type":"ContainerStarted","Data":"1138391921b81e06682a03e74b00c685edf9558a0a7ad59d3d9f7511d165193b"}
Apr 16 19:35:30.004726 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:30.004640 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:30.025324 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:30.025281 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg" podStartSLOduration=1.025267541 podStartE2EDuration="1.025267541s" podCreationTimestamp="2026-04-16 19:35:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:35:30.024950668 +0000 UTC m=+3906.812286922" watchObservedRunningTime="2026-04-16 19:35:30.025267541 +0000 UTC m=+3906.812603796"
Apr 16 19:35:30.316352 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:30.316269 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dlnfz_ccb22e48-3cd7-442e-ac5a-7ec7666b48e9/dns/0.log"
Apr 16 19:35:30.345022 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:30.344984 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dlnfz_ccb22e48-3cd7-442e-ac5a-7ec7666b48e9/kube-rbac-proxy/0.log"
Apr 16 19:35:30.563036 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:30.563012 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hkqvd_0003db0c-dda0-4476-bd64-528082f53f33/dns-node-resolver/0.log"
Apr 16 19:35:31.044705 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:31.044673 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fj55k_1168c01f-c07f-44f9-b56f-cc88b2028e0b/node-ca/0.log"
Apr 16 19:35:32.188097 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:32.188067 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-pb8c8_714c4beb-40ac-4478-80ff-d058fb5fd1a3/serve-healthcheck-canary/0.log"
Apr 16 19:35:32.588571 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:32.588543 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-99n6h_9eddac0a-a3b8-4340-8157-5cbbd08512d7/kube-rbac-proxy/0.log"
Apr 16 19:35:32.610772 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:32.610746 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-99n6h_9eddac0a-a3b8-4340-8157-5cbbd08512d7/exporter/0.log"
Apr 16 19:35:32.636064 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:32.636036 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-99n6h_9eddac0a-a3b8-4340-8157-5cbbd08512d7/extractor/0.log"
Apr 16 19:35:34.822338 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:34.822303 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7c68cb4fc8-ww5qj_1164058e-92a2-41a8-8700-0fc714f73eac/manager/0.log"
Apr 16 19:35:34.847122 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:34.847100 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-xbnbv_e204c87e-ef63-4ef3-9fb1-2fd0e2775752/manager/0.log"
Apr 16 19:35:36.018809 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:36.018783 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4hzvg"
Apr 16 19:35:39.286665 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:39.286580 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-7hlch_a8092b73-120e-4d31-8d9c-2567ffdcad38/migrator/0.log"
Apr 16 19:35:39.317052 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:39.317026 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-7hlch_a8092b73-120e-4d31-8d9c-2567ffdcad38/graceful-termination/0.log"
Apr 16 19:35:40.649364 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:40.649325 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-562j6_4a775d94-d89f-4059-894a-f78b252c1c3c/kube-multus/0.log"
Apr 16 19:35:41.350279 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:41.350254 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nv72w_3bf46358-96b5-41b1-9b21-a398d5f87d6e/kube-multus-additional-cni-plugins/0.log"
Apr 16 19:35:41.389355 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:41.389262 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nv72w_3bf46358-96b5-41b1-9b21-a398d5f87d6e/egress-router-binary-copy/0.log"
Apr 16 19:35:41.429718 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:41.429683 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nv72w_3bf46358-96b5-41b1-9b21-a398d5f87d6e/cni-plugins/0.log"
Apr 16 19:35:41.471608 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:41.471578 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nv72w_3bf46358-96b5-41b1-9b21-a398d5f87d6e/bond-cni-plugin/0.log"
Apr 16 19:35:41.524141 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:41.524115 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nv72w_3bf46358-96b5-41b1-9b21-a398d5f87d6e/routeoverride-cni/0.log"
Apr 16 19:35:41.573568 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:41.573540 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nv72w_3bf46358-96b5-41b1-9b21-a398d5f87d6e/whereabouts-cni-bincopy/0.log"
Apr 16 19:35:41.617615 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:41.617586 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nv72w_3bf46358-96b5-41b1-9b21-a398d5f87d6e/whereabouts-cni/0.log"
Apr 16 19:35:41.715557 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:41.715474 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f4smb_b62474b5-9999-4dd6-83ae-96e3bc355df3/network-metrics-daemon/0.log"
Apr 16 19:35:41.763811 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:41.763782 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f4smb_b62474b5-9999-4dd6-83ae-96e3bc355df3/kube-rbac-proxy/0.log"
Apr 16 19:35:42.704924 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:42.704891 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-controller/0.log"
Apr 16 19:35:42.722626 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:42.722597 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/0.log"
Apr 16 19:35:42.756739 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:42.756703 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovn-acl-logging/1.log"
Apr 16 19:35:42.786397 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:42.786372 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/kube-rbac-proxy-node/0.log"
Apr 16 19:35:42.811689 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:42.811667 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 19:35:42.830008 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:42.829982 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/northd/0.log"
Apr 16 19:35:42.852511 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:42.852491 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/nbdb/0.log"
Apr 16 19:35:42.876429 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:42.876408 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/sbdb/0.log"
Apr 16 19:35:43.062477 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:43.062434 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdzzg_02533a21-4e1f-4bc0-a493-7ac7d35295b8/ovnkube-controller/0.log"
Apr 16 19:35:44.667158 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:44.667134 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-rp72n_dcf43e1d-4165-4661-a113-011616920ebe/network-check-target-container/0.log"
Apr 16 19:35:45.702641 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:45.702544 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-npwmv_68a69615-3320-4f7f-b763-6991f367c93d/iptables-alerter/0.log"
Apr 16 19:35:46.405515 ip-10-0-139-33 kubenswrapper[2578]: I0416 19:35:46.405482 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-fctsh_7be083de-137d-4eb1-b371-dc0a37d2d527/tuned/0.log"