Apr 22 14:15:29.544093 ip-10-0-142-195 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 14:15:30.009820 ip-10-0-142-195 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:30.009820 ip-10-0-142-195 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 14:15:30.009820 ip-10-0-142-195 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:30.009820 ip-10-0-142-195 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 14:15:30.009820 ip-10-0-142-195 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
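The deprecation notices above say these flags should move into the file passed via --config. As an illustrative sketch only (the field names are from the upstream KubeletConfiguration v1beta1 API; the values here are hypothetical, not taken from this node), the flagged options map onto config-file fields roughly like this:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (sizes are placeholders)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction thresholds
evictionHard:
  memory.available: 100Mi
```

On an OpenShift node like this one, the kubelet config is machine-managed, so such changes would normally go through the Machine Config Operator rather than hand-editing /etc/kubernetes/kubelet.conf.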
Apr 22 14:15:30.011394 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.011318 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 14:15:30.014346 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014332 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:30.014346 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014346 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014351 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014354 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014356 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014359 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014362 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014365 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014367 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014370 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014372 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014375 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014377 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014380 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014383 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014385 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014388 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014390 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014393 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014396 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014398 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:30.014409 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014401 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014403 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014406 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014408 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014411 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014414 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014416 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014419 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014422 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014425 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014427 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014429 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014432 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014435 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014437 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014440 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014443 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014445 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014448 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014450 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:30.015044 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014453 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014455 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014458 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014460 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014462 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014465 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014467 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014469 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014472 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014475 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014477 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014480 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014482 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014485 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014488 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014491 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014495 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014499 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014502 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014505 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:30.015800 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014507 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014510 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014513 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014516 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014519 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014521 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014523 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014560 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014594 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014598 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014602 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014605 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014608 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014660 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014664 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014669 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014675 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014680 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014684 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:30.016358 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014693 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014697 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014701 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014707 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014711 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.014716 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015824 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015835 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015839 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015842 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015845 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015848 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015851 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015854 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015857 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015860 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015865 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015868 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015877 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015881 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:30.016828 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015884 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015887 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015890 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015893 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015895 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015898 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015900 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015903 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015905 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015909 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015911 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015914 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015916 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015919 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015921 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015925 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015928 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015930 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015933 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:30.017286 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015936 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015939 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015942 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015944 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015947 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015950 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015952 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015956 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015960 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015963 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015966 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015968 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015971 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015973 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015976 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015978 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015981 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015983 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015985 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:15:30.017740 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015988 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015990 2576 feature_gate.go:328] unrecognized 
feature gate: ClusterVersionOperatorConfiguration Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015992 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015995 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.015997 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016000 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016002 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016004 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016007 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016009 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016012 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016014 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016017 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016020 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016022 2576 
feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016025 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016028 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016032 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016034 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016037 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:15:30.018187 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016039 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016042 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016044 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016047 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016049 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016052 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016054 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:15:30.018665 
ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016057 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016059 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016061 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016064 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016067 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016069 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016072 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016142 2576 flags.go:64] FLAG: --address="0.0.0.0" Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016148 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016155 2576 flags.go:64] FLAG: --anonymous-auth="true" Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016159 2576 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016163 2576 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016166 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016171 
2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 22 14:15:30.018665 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016176 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016179 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016182 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016185 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016189 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016192 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016195 2576 flags.go:64] FLAG: --cgroup-root="" Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016198 2576 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016200 2576 flags.go:64] FLAG: --client-ca-file="" Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016203 2576 flags.go:64] FLAG: --cloud-config="" Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016206 2576 flags.go:64] FLAG: --cloud-provider="external" Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016208 2576 flags.go:64] FLAG: --cluster-dns="[]" Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016213 2576 flags.go:64] FLAG: --cluster-domain="" Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016215 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" 
Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016218 2576 flags.go:64] FLAG: --config-dir=""
Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016221 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016224 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016228 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016231 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016234 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016237 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016239 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016242 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016245 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016248 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 14:15:30.019173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016251 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016255 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016257 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016260 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016263 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016266 2576 flags.go:64] FLAG: --enable-server="true"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016268 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016273 2576 flags.go:64] FLAG: --event-burst="100"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016276 2576 flags.go:64] FLAG: --event-qps="50"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016278 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016282 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016285 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016289 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016292 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016295 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016298 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016301 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016303 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016306 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016309 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016312 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016315 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016317 2576 flags.go:64] FLAG: --feature-gates=""
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016321 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016324 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 14:15:30.019753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016327 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016330 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016333 2576 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016336 2576 flags.go:64] FLAG: --help="false"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016339 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016342 2576 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016345 2576 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016348 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016351 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016354 2576 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016357 2576 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016360 2576 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016362 2576 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016365 2576 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016368 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016372 2576 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016375 2576 flags.go:64] FLAG: --kube-reserved=""
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016378 2576 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016381 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016384 2576 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016387 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016390 2576 flags.go:64] FLAG: --lock-file=""
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016393 2576 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016396 2576 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 14:15:30.020381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016399 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016404 2576 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016406 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016409 2576 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016412 2576 flags.go:64] FLAG: --logging-format="text"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016415 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016418 2576 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016420 2576 flags.go:64] FLAG: --manifest-url=""
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016423 2576 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016427 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016430 2576 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016435 2576 flags.go:64] FLAG: --max-pods="110"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016438 2576 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016441 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016444 2576 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016447 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016450 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016453 2576 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016455 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016463 2576 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016466 2576 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016469 2576 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016472 2576 flags.go:64] FLAG: --pod-cidr=""
Apr 22 14:15:30.020993 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016475 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016480 2576 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016483 2576 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016486 2576 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016489 2576 flags.go:64] FLAG: --port="10250"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016492 2576 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016495 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c8ed41abe0ced4e9"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016498 2576 flags.go:64] FLAG: --qos-reserved=""
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016501 2576 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016504 2576 flags.go:64] FLAG: --register-node="true"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016507 2576 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016510 2576 flags.go:64] FLAG: --register-with-taints=""
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016513 2576 flags.go:64] FLAG: --registry-burst="10"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016516 2576 flags.go:64] FLAG: --registry-qps="5"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016519 2576 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016522 2576 flags.go:64] FLAG: --reserved-memory=""
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016526 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016529 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016532 2576 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016534 2576 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016537 2576 flags.go:64] FLAG: --runonce="false"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016540 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016542 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016545 2576 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016548 2576 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016551 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 14:15:30.021503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016554 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016557 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016559 2576 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016562 2576 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016565 2576 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016568 2576 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016570 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016573 2576 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016576 2576 flags.go:64] FLAG: --system-cgroups=""
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016583 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016588 2576 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016591 2576 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016594 2576 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016598 2576 flags.go:64] FLAG: --tls-min-version=""
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016600 2576 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016603 2576 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016606 2576 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016609 2576 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016612 2576 flags.go:64] FLAG: --v="2"
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016616 2576 flags.go:64] FLAG: --version="false"
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016620 2576 flags.go:64] FLAG: --vmodule=""
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016624 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.016627 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016710 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:30.022134 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016715 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016718 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016721 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016723 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016726 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016729 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016731 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016734 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016736 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016739 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016742 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016744 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016747 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016749 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016752 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016754 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016757 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016762 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016765 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016767 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:30.022716 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016770 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016772 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016775 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016777 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016780 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016782 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016785 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016788 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016790 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016793 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016795 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016798 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016800 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016803 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016818 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016821 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016823 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016826 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016828 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:30.023210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016831 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016835 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016839 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016842 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016845 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016847 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016850 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016853 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016855 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016858 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016861 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016864 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016867 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016869 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016872 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016874 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016877 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016881 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016884 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:30.023698 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016886 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016889 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016891 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016894 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016896 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016898 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016901 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016904 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016906 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016914 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016917 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016919 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016922 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016924 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016926 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016929 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016931 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016934 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016937 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016939 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016942 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:30.024170 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016945 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:30.024667 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016947 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:30.024667 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016951 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:30.024667 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016954 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:30.024667 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016957 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:30.024667 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.016959 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:30.024667 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.017631 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 14:15:30.024667 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.023626 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 14:15:30.024667 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.023643 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 14:15:30.024667 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023686 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:30.024667 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023691 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:30.024667 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023695 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:30.024667 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023698 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:30.024667 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023701 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:30.024667 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023704 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:30.024667 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023707 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:30.024667 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023710 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023713 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023716 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023719 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023721 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023724 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023726 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023729 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023732 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023734 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023738 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023742 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023744 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023747 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023751 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023754 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023757 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023760 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023763 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:30.025112 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023765 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023768 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023770 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023773 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023776 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023780 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023783 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023785 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023788 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023791 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023793 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023796 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023798 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023801 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023818 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023822 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023825 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023827 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023830 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:30.025554 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023832 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023835 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023837 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023840 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023843 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023845 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023848 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023850 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023853 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023855 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023858 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023860 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023862 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023865 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023867 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023877 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023881 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023884 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023887 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023891 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:30.026018 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023894 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023896 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023899 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023902 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023904 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023907 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023909 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023912 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023914 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023917 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023920 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023922 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023924 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023927 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023930 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023933 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023935 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023937 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023940 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023943 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:30.026485 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.023945 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:30.026964 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.023950 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 14:15:30.026964 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024042 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:30.026964 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024046 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:30.026964 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024049 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:30.026964 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024052 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:30.026964 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024056 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:30.026964 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024059 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:30.026964 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024062 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:30.026964 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024065 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:30.026964 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024068 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:30.026964 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024072 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:30.026964 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024075 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:30.026964 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024078 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:30.026964 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024081 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:30.026964 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024084 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024087 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024090 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024092 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024095 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024098 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024100 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024103 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024105 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024108 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024110 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024113 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024116 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024118 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024121 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024123 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024126 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024128 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024130 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024133 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:30.027318 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024135 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024137 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024140 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024142 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024145 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024148 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024150 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024152 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024155 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024157 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024160 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024163 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024165 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024167 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024170 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024172 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024175 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024177 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024180 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024183 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:30.027794 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024186 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024188 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024190 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024193 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024195 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024198 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024200 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024203 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024205 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024208 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024210 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024212 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024215 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024217 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024220 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024222 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024225 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024227 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024229 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:30.028281 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024232 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:30.028717 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024234 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:30.028717 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024237 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:30.028717 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024239 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:30.028717 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024242 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:30.028717 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024244 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:30.028717 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024247 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:30.028717 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024249 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:30.028717 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024252 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:30.028717 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024254 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:30.028717 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024256 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:30.028717 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024259 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:30.028717 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024261 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:30.028717 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:30.024264 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:30.028717 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.024268 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 14:15:30.028717 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.026034 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 14:15:30.029262 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.029248 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 14:15:30.030094 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.030083 2576 server.go:1019] "Starting client certificate rotation"
Apr 22 14:15:30.030192 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.030177 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 14:15:30.030235 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.030211 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 14:15:30.060381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.060363 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 14:15:30.066167 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.066152 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 14:15:30.087835 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.087819 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 22 14:15:30.094718 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.094702 2576 log.go:25] "Validated CRI v1 image API"
Apr 22 14:15:30.096019 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.096004 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 14:15:30.096620 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.096604 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 14:15:30.098097 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.098079 2576 fs.go:135] Filesystem UUIDs: map[118b76b3-0658-4942-90eb-85f3802987fe:/dev/nvme0n1p4 53030d7e-3da6-4342-8bba-c865492848c2:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 22 14:15:30.098150 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.098097 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 14:15:30.103799 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.103696 2576 manager.go:217] Machine: {Timestamp:2026-04-22 14:15:30.101626543 +0000 UTC m=+0.427360950 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3188350 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29346db83ca070d691aa00be332f72 SystemUUID:ec29346d-b83c-a070-d691-aa00be332f72 BootID:f9491fe3-551c-4f97-8af9-f427e4ce5c75 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:70:43:61:13:a5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:70:43:61:13:a5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:4a:3b:54:42:7c:33 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 14:15:30.103799 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.103795 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 14:15:30.103902 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.103889 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 14:15:30.105191 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.105173 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 14:15:30.105315 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.105193 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-195.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 14:15:30.105362 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.105325 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 14:15:30.105362 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.105333 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 14:15:30.105362 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.105346 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 14:15:30.106659 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.106639 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 14:15:30.108301 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.108284 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 14:15:30.108561 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.108551 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 14:15:30.110766 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.110756 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 14:15:30.110824 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.110769 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 14:15:30.110824 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.110780 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 14:15:30.110824 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.110788 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 22 14:15:30.110824 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.110798 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 14:15:30.112019 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.112006 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 14:15:30.112345 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.112032 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 14:15:30.115171 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.115154 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 14:15:30.116564 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.116552 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 14:15:30.117091 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.117071 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x2wsc"
Apr 22 14:15:30.118334 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.118322 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 14:15:30.118334 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.118339 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 14:15:30.118429 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.118345 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 14:15:30.118429 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.118361 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 14:15:30.118429 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.118369 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 14:15:30.118429 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.118375 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 14:15:30.118429 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.118389 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 14:15:30.118429 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.118395 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 14:15:30.118429 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.118402 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 14:15:30.118429 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.118407 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 14:15:30.118429 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.118422 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 14:15:30.118429 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.118431 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 14:15:30.120646 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.120633 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 14:15:30.120646 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.120643 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 14:15:30.121793 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.121764 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 14:15:30.121793 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.121772 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-195.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 14:15:30.123549 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.123534 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-195.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 14:15:30.124148 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.124137 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 14:15:30.124192 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.124172 2576 server.go:1295] "Started kubelet"
Apr 22 14:15:30.124245 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.124220 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 14:15:30.125755 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.125702 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 14:15:30.125843 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.125784 2576 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 14:15:30.125910 ip-10-0-142-195 systemd[1]: Started Kubernetes Kubelet.
Apr 22 14:15:30.126264 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.125328 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x2wsc"
Apr 22 14:15:30.129311 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.129294 2576 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 14:15:30.129783 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.129767 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 14:15:30.135317 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.135298 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 14:15:30.136537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.136523 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 14:15:30.136600 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.136562 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 14:15:30.137249 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.137202 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 14:15:30.137350 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.137243 2576 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 14:15:30.137350 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.137319 2576 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 14:15:30.137448 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.137415 2576 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 14:15:30.137448 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.137426 2576 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 14:15:30.137448 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.137437 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-195.ec2.internal\" not found"
Apr 22 14:15:30.137587 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.137568 2576 factory.go:55] Registering systemd factory
Apr 22 14:15:30.137638 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.137596 2576 factory.go:223] Registration of the systemd container factory successfully
Apr 22 14:15:30.137823 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.137778 2576 factory.go:153] Registering CRI-O factory
Apr 22 14:15:30.137823 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.137798 2576 factory.go:223] Registration of the crio container factory successfully
Apr 22 14:15:30.137955 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.137859 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 14:15:30.137955 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.137876 2576 factory.go:103] Registering Raw factory
Apr 22 14:15:30.137955 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.137890 2576 manager.go:1196] Started watching for new ooms in manager
Apr 22 14:15:30.138220 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.138209 2576 manager.go:319] Starting recovery of all containers
Apr 22 14:15:30.143846 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.143678 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:30.147136 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.147122 2576 manager.go:324] Recovery completed
Apr 22 14:15:30.148927 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.148909 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-142-195.ec2.internal\" not found" node="ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.150966 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.150951 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:30.153685 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.153665 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:30.153751 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.153691 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:30.153751 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.153702 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:30.154198 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.154182 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 14:15:30.154198 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.154198 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 14:15:30.154311 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.154213 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 14:15:30.156705 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.156693 2576 policy_none.go:49] "None policy: Start"
Apr 22 14:15:30.156753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.156708 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 14:15:30.157164 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.157155 2576 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 14:15:30.196347 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.196329 2576 manager.go:341] "Starting Device Plugin manager"
Apr 22 14:15:30.214517 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.196364 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 14:15:30.214517 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.196376 2576 server.go:85] "Starting device plugin registration server"
Apr 22 14:15:30.214517 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.196572 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 14:15:30.214517 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.196581 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 14:15:30.214517 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.196675 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 14:15:30.214517 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.196754 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 14:15:30.214517 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.196766 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 14:15:30.214517 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.197301 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 14:15:30.214517 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.197333 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-195.ec2.internal\" not found"
Apr 22 14:15:30.286436 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.286368 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 14:15:30.287463 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.287442 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 14:15:30.287554 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.287468 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 14:15:30.287554 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.287485 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 14:15:30.287554 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.287492 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 14:15:30.287554 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.287522 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 14:15:30.291563 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.291549 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:30.297596 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.297583 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:30.298463 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.298450 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:30.298532 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.298483 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:30.298532 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.298493 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:30.298532 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.298512 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.304687 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.304671 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.304768 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.304696 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-195.ec2.internal\": node \"ip-10-0-142-195.ec2.internal\" not found"
Apr 22 14:15:30.315232 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.315212 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-195.ec2.internal\" not found"
Apr 22 14:15:30.387959 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.387928 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-195.ec2.internal"]
Apr 22 14:15:30.388058 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.388006 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:30.388958 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.388943 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:30.389034 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.388971 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:30.389034 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.388980 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:30.390513 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.390501 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:30.390658 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.390644 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.390692 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.390674 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:30.391184 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.391171 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:30.391269 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.391186 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:30.391269 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.391197 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:30.391269 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.391208 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:30.391269 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.391224 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:30.391269 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.391210 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:30.392484 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.392469 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.392536 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.392497 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:30.393136 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.393110 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:30.393199 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.393140 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:30.393199 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.393151 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:30.415278 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.415259 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-195.ec2.internal\" not found"
Apr 22 14:15:30.415356 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.415342 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-195.ec2.internal\" not found" node="ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.419548 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.419528 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-195.ec2.internal\" not found" node="ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.439274 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.439261 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b865d1ca2ef1f242e56d7c94255bbcb7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal\" (UID: \"b865d1ca2ef1f242e56d7c94255bbcb7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.439339 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.439283 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b865d1ca2ef1f242e56d7c94255bbcb7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal\" (UID: \"b865d1ca2ef1f242e56d7c94255bbcb7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.439339 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.439301 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/24b00f41063a53ab6b85ee845aa19b10-config\") pod \"kube-apiserver-proxy-ip-10-0-142-195.ec2.internal\" (UID: \"24b00f41063a53ab6b85ee845aa19b10\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.516347 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.516327 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-195.ec2.internal\" not found"
Apr 22 14:15:30.539777 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.539718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b865d1ca2ef1f242e56d7c94255bbcb7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal\" (UID: \"b865d1ca2ef1f242e56d7c94255bbcb7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.539777 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.539672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b865d1ca2ef1f242e56d7c94255bbcb7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal\" (UID: \"b865d1ca2ef1f242e56d7c94255bbcb7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.539917 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.539793 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b865d1ca2ef1f242e56d7c94255bbcb7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal\" (UID: \"b865d1ca2ef1f242e56d7c94255bbcb7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.539917 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.539773 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b865d1ca2ef1f242e56d7c94255bbcb7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal\" (UID: \"b865d1ca2ef1f242e56d7c94255bbcb7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.539917 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.539848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/24b00f41063a53ab6b85ee845aa19b10-config\") pod \"kube-apiserver-proxy-ip-10-0-142-195.ec2.internal\" (UID: \"24b00f41063a53ab6b85ee845aa19b10\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.539917 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.539880 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/24b00f41063a53ab6b85ee845aa19b10-config\") pod \"kube-apiserver-proxy-ip-10-0-142-195.ec2.internal\" (UID: \"24b00f41063a53ab6b85ee845aa19b10\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.617101 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.617074 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-195.ec2.internal\" not found"
Apr 22 14:15:30.717582 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.717547 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-195.ec2.internal\" not found"
Apr 22 14:15:30.717582 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.717581 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.722206 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.722188 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-195.ec2.internal"
Apr 22 14:15:30.818376 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.818301 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-195.ec2.internal\" not found"
Apr 22 14:15:30.918768 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:30.918743 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-195.ec2.internal\" not found"
Apr 22 14:15:30.956896 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:30.956875 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:31.019104 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:31.019081 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-195.ec2.internal\" not found"
Apr 22 14:15:31.030249 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.030231 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 14:15:31.030367 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.030345 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 14:15:31.030426 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.030375 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 14:15:31.030426 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.030377 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 14:15:31.120088 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:31.120065 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-195.ec2.internal\" not found"
Apr 22 14:15:31.132755 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.132724 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 14:10:30 +0000 UTC" deadline="2027-12-15 06:57:42.323890037 +0000 UTC"
Apr 22 14:15:31.132755 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.132752 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14440h42m11.191142027s"
Apr 22 14:15:31.136771 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.136754 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 14:15:31.151621 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.151600 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 14:15:31.175950 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.175931 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-h44qt"
Apr 22 14:15:31.183947 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.183930 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-h44qt"
Apr 22 14:15:31.220570 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:31.220546 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-195.ec2.internal\" not found"
Apr 22 14:15:31.222551 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:31.222519 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24b00f41063a53ab6b85ee845aa19b10.slice/crio-e4d4a38bf2670c82611ac5747cb5543eb963c26974dede9cd966e3e5da1ad0f5 WatchSource:0}: Error finding container e4d4a38bf2670c82611ac5747cb5543eb963c26974dede9cd966e3e5da1ad0f5: Status 404 returned error can't find the container with id e4d4a38bf2670c82611ac5747cb5543eb963c26974dede9cd966e3e5da1ad0f5
Apr 22 14:15:31.222685 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:31.222671 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb865d1ca2ef1f242e56d7c94255bbcb7.slice/crio-7f357acb165e43a6d2d61c61f00771054a510f809f260ce8207d67b489b0f9c3 WatchSource:0}: Error finding container 7f357acb165e43a6d2d61c61f00771054a510f809f260ce8207d67b489b0f9c3: Status 404 returned error can't find the container with id 7f357acb165e43a6d2d61c61f00771054a510f809f260ce8207d67b489b0f9c3
Apr 22 14:15:31.226056 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.226036 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:15:31.289699 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.289660 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-195.ec2.internal" event={"ID":"24b00f41063a53ab6b85ee845aa19b10","Type":"ContainerStarted","Data":"e4d4a38bf2670c82611ac5747cb5543eb963c26974dede9cd966e3e5da1ad0f5"}
Apr 22 14:15:31.290584 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.290567 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal" event={"ID":"b865d1ca2ef1f242e56d7c94255bbcb7","Type":"ContainerStarted","Data":"7f357acb165e43a6d2d61c61f00771054a510f809f260ce8207d67b489b0f9c3"}
Apr 22 14:15:31.320648 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:31.320630 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-195.ec2.internal\" not found"
Apr 22 14:15:31.421179 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:31.421129 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-195.ec2.internal\" not found"
Apr 22 14:15:31.454173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.454154 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:31.537148 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.537130 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-195.ec2.internal"
Apr 22 14:15:31.556462 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.556440 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in
surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 14:15:31.557409 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.557395 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal" Apr 22 14:15:31.574119 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.574102 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 14:15:31.930419 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:31.930317 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:32.112362 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.112338 2576 apiserver.go:52] "Watching apiserver" Apr 22 14:15:32.123889 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.123856 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 14:15:32.124737 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.124710 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-rbpm2","kube-system/konnectivity-agent-mj62z","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal","openshift-multus/multus-7mzfp","openshift-multus/multus-additional-cni-plugins-8pqgl","openshift-network-operator/iptables-alerter-h5kfd","openshift-ovn-kubernetes/ovnkube-node-x7g6p","kube-system/kube-apiserver-proxy-ip-10-0-142-195.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr","openshift-cluster-node-tuning-operator/tuned-89qbq","openshift-dns/node-resolver-8h8cm","openshift-image-registry/node-ca-bzcbn","openshift-multus/network-metrics-daemon-r7psp"] Apr 22 14:15:32.127168 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.127151 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.128238 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.128218 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mj62z" Apr 22 14:15:32.129645 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.129617 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7mzfp" Apr 22 14:15:32.129906 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.129883 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 14:15:32.130022 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.129969 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 14:15:32.130022 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.129994 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 14:15:32.130137 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.130093 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 14:15:32.130347 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.130329 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 14:15:32.131311 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.131293 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8pqgl" Apr 22 14:15:32.131711 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.131549 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hrf9n\"" Apr 22 14:15:32.131711 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.131605 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 14:15:32.131711 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.131646 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 14:15:32.131924 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.131910 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-xkp9b\"" Apr 22 14:15:32.131963 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.131940 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 14:15:32.133062 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.132831 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 14:15:32.133062 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.132911 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 14:15:32.133062 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.132934 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 14:15:32.133062 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.133023 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-s27n5\"" Apr 22 
14:15:32.133280 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.133133 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 14:15:32.133786 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.133734 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 14:15:32.133786 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.133770 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-86fvw\"" Apr 22 14:15:32.133786 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.133781 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 14:15:32.134169 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.134154 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-h5kfd" Apr 22 14:15:32.134252 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.134225 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:32.134324 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:32.134298 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbpm2" podUID="e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa" Apr 22 14:15:32.135646 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.135629 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" Apr 22 14:15:32.136509 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.136491 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:15:32.136668 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.136647 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 14:15:32.136750 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.136713 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2nd57\"" Apr 22 14:15:32.136950 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.136888 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-89qbq" Apr 22 14:15:32.136950 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.136915 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 14:15:32.138071 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.137991 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 14:15:32.138071 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.138052 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 14:15:32.138886 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.138219 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 14:15:32.138886 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.138590 2576 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8h8cm" Apr 22 14:15:32.138886 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.138594 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fc8c7\"" Apr 22 14:15:32.140068 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.140045 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 14:15:32.140138 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.140082 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:15:32.140138 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.140096 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-76fxw\"" Apr 22 14:15:32.141594 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.141577 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 14:15:32.141829 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.141792 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-rxc75\"" Apr 22 14:15:32.141909 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.141863 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 14:15:32.143400 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.143200 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bzcbn" Apr 22 14:15:32.143543 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.143527 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:32.143695 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:32.143653 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r7psp" podUID="dbee0e27-41ea-4d42-84c7-681872bfcda1" Apr 22 14:15:32.145528 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.145511 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 14:15:32.145628 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.145585 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 14:15:32.145695 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.145637 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 14:15:32.145752 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.145720 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vwwdk\"" Apr 22 14:15:32.149289 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149263 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-slash\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.149377 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149302 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-run-openvswitch\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.149377 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149327 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzbm2\" (UniqueName: \"kubernetes.io/projected/5880a8e9-777a-4921-b5f6-c6325c768bf2-kube-api-access-tzbm2\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.149509 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149375 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-run-k8s-cni-cncf-io\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp" Apr 22 14:15:32.149509 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149418 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-host\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq" Apr 22 14:15:32.149509 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-etc-selinux\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" Apr 22 14:15:32.149656 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149506 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-run-ovn\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.149656 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149532 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-var-lib-cni-bin\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp" Apr 22 14:15:32.149656 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhk54\" (UniqueName: \"kubernetes.io/projected/110f68fd-5d58-411e-a7fc-980d5d6050e4-kube-api-access-jhk54\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl" Apr 22 14:15:32.149656 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149582 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-var-lib-kubelet\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq" Apr 22 14:15:32.149656 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149606 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/35bb6914-dec8-4b09-9315-761449933a8a-etc-tuned\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq" Apr 22 14:15:32.149656 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-socket-dir\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" Apr 22 14:15:32.149656 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-device-dir\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" Apr 22 14:15:32.149994 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149719 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-node-log\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.149994 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149779 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-multus-socket-dir-parent\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp" Apr 22 14:15:32.149994 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149800 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-systemd-units\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.149994 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149837 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-os-release\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp" Apr 22 14:15:32.149994 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-var-lib-kubelet\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp" Apr 22 14:15:32.149994 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149923 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-sysctl-conf\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq" Apr 22 14:15:32.149994 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149947 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-sys-fs\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" Apr 22 14:15:32.149994 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149971 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5880a8e9-777a-4921-b5f6-c6325c768bf2-ovn-node-metrics-cert\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.150420 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.149997 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqwrm\" (UniqueName: \"kubernetes.io/projected/88aefe02-de5c-45b6-a697-2d18d8ae2754-kube-api-access-pqwrm\") pod \"iptables-alerter-h5kfd\" (UID: \"88aefe02-de5c-45b6-a697-2d18d8ae2754\") " pod="openshift-network-operator/iptables-alerter-h5kfd" Apr 22 14:15:32.150420 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150119 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-run-systemd\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.150420 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150155 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.150420 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150180 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-system-cni-dir\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp" Apr 22 14:15:32.150420 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150202 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-var-lib-cni-multus\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp" Apr 22 14:15:32.150420 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150251 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/110f68fd-5d58-411e-a7fc-980d5d6050e4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl" Apr 22 14:15:32.150420 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150283 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-lib-modules\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq" Apr 22 14:15:32.150420 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150313 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-var-lib-openvswitch\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.150420 ip-10-0-142-195 
kubenswrapper[2576]: I0422 14:15:32.150337 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-sysctl-d\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.150420 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88aefe02-de5c-45b6-a697-2d18d8ae2754-host-slash\") pod \"iptables-alerter-h5kfd\" (UID: \"88aefe02-de5c-45b6-a697-2d18d8ae2754\") " pod="openshift-network-operator/iptables-alerter-h5kfd"
Apr 22 14:15:32.150420 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150380 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-run-netns\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.150420 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150404 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-log-socket\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.150921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-cni-netd\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.150921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-etc-kubernetes\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.150921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/110f68fd-5d58-411e-a7fc-980d5d6050e4-cnibin\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.150921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150490 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-modprobe-d\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.150921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150511 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/35bb6914-dec8-4b09-9315-761449933a8a-tmp\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.150921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150533 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2b976276-fdc8-4595-a9a4-76cc1b34317e-konnectivity-ca\") pod \"konnectivity-agent-mj62z\" (UID: \"2b976276-fdc8-4595-a9a4-76cc1b34317e\") " pod="kube-system/konnectivity-agent-mj62z"
Apr 22 14:15:32.150921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-multus-conf-dir\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.150921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150603 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/110f68fd-5d58-411e-a7fc-980d5d6050e4-cni-binary-copy\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.150921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150631 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk6vz\" (UniqueName: \"kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz\") pod \"network-check-target-rbpm2\" (UID: \"e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa\") " pod="openshift-network-diagnostics/network-check-target-rbpm2"
Apr 22 14:15:32.150921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150654 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-sysconfig\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.150921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150677 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-systemd\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.150921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150697 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-run\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.150921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150732 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fbf11e8c-40e6-4fe9-b51b-b3a87ff88577-hosts-file\") pod \"node-resolver-8h8cm\" (UID: \"fbf11e8c-40e6-4fe9-b51b-b3a87ff88577\") " pod="openshift-dns/node-resolver-8h8cm"
Apr 22 14:15:32.150921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150753 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-cnibin\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.150921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150773 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/110f68fd-5d58-411e-a7fc-980d5d6050e4-system-cni-dir\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.150921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150821 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5880a8e9-777a-4921-b5f6-c6325c768bf2-ovnkube-script-lib\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.150921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-run-netns\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.151557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150872 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-hostroot\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.151557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150893 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twfth\" (UniqueName: \"kubernetes.io/projected/3895a4bb-68d7-4a37-8937-3ce81c84a431-kube-api-access-twfth\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.151557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150915 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/110f68fd-5d58-411e-a7fc-980d5d6050e4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.151557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150944 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbf11e8c-40e6-4fe9-b51b-b3a87ff88577-tmp-dir\") pod \"node-resolver-8h8cm\" (UID: \"fbf11e8c-40e6-4fe9-b51b-b3a87ff88577\") " pod="openshift-dns/node-resolver-8h8cm"
Apr 22 14:15:32.151557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150971 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-kubelet\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.151557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.150996 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-etc-openvswitch\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.151557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151016 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-kubernetes\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.151557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151035 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2b976276-fdc8-4595-a9a4-76cc1b34317e-agent-certs\") pod \"konnectivity-agent-mj62z\" (UID: \"2b976276-fdc8-4595-a9a4-76cc1b34317e\") " pod="kube-system/konnectivity-agent-mj62z"
Apr 22 14:15:32.151557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-cni-bin\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.151557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151064 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5880a8e9-777a-4921-b5f6-c6325c768bf2-ovnkube-config\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.151557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151080 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5880a8e9-777a-4921-b5f6-c6325c768bf2-env-overrides\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.151557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151100 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-multus-cni-dir\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.151557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151115 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3895a4bb-68d7-4a37-8937-3ce81c84a431-cni-binary-copy\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.151557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151148 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/110f68fd-5d58-411e-a7fc-980d5d6050e4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.151557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151172 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/88aefe02-de5c-45b6-a697-2d18d8ae2754-iptables-alerter-script\") pod \"iptables-alerter-h5kfd\" (UID: \"88aefe02-de5c-45b6-a697-2d18d8ae2754\") " pod="openshift-network-operator/iptables-alerter-h5kfd"
Apr 22 14:15:32.151557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151197 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-run-ovn-kubernetes\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.152034 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151216 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/110f68fd-5d58-411e-a7fc-980d5d6050e4-os-release\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.152034 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151230 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46zg5\" (UniqueName: \"kubernetes.io/projected/0d3f90ab-bb01-4735-b78f-250f7955b56f-kube-api-access-46zg5\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr"
Apr 22 14:15:32.152034 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151243 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs8qg\" (UniqueName: \"kubernetes.io/projected/fbf11e8c-40e6-4fe9-b51b-b3a87ff88577-kube-api-access-fs8qg\") pod \"node-resolver-8h8cm\" (UID: \"fbf11e8c-40e6-4fe9-b51b-b3a87ff88577\") " pod="openshift-dns/node-resolver-8h8cm"
Apr 22 14:15:32.152034 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151257 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3895a4bb-68d7-4a37-8937-3ce81c84a431-multus-daemon-config\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.152034 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151277 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-run-multus-certs\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.152034 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-sys\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.152034 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151321 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvfl4\" (UniqueName: \"kubernetes.io/projected/35bb6914-dec8-4b09-9315-761449933a8a-kube-api-access-jvfl4\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.152034 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151344 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr"
Apr 22 14:15:32.152034 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.151371 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-registration-dir\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr"
Apr 22 14:15:32.185457 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.185429 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:31 +0000 UTC" deadline="2027-10-16 03:24:36.648066061 +0000 UTC"
Apr 22 14:15:32.185548 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.185457 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12997h9m4.46261248s"
Apr 22 14:15:32.238072 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.238042 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 14:15:32.251966 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.251944 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-cnibin\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.252073 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.251974 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/110f68fd-5d58-411e-a7fc-980d5d6050e4-system-cni-dir\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.252073 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.251990 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5880a8e9-777a-4921-b5f6-c6325c768bf2-ovnkube-script-lib\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.252073 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-run-netns\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.252073 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-hostroot\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.252073 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252064 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-cnibin\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.252312 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252071 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/110f68fd-5d58-411e-a7fc-980d5d6050e4-system-cni-dir\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.252312 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252093 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-hostroot\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.252312 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252112 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-run-netns\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.252312 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252180 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twfth\" (UniqueName: \"kubernetes.io/projected/3895a4bb-68d7-4a37-8937-3ce81c84a431-kube-api-access-twfth\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.252312 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/110f68fd-5d58-411e-a7fc-980d5d6050e4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.252312 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252257 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbf11e8c-40e6-4fe9-b51b-b3a87ff88577-tmp-dir\") pod \"node-resolver-8h8cm\" (UID: \"fbf11e8c-40e6-4fe9-b51b-b3a87ff88577\") " pod="openshift-dns/node-resolver-8h8cm"
Apr 22 14:15:32.252312 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252296 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b185de97-4d30-47da-bb08-402ac8989235-host\") pod \"node-ca-bzcbn\" (UID: \"b185de97-4d30-47da-bb08-402ac8989235\") " pod="openshift-image-registry/node-ca-bzcbn"
Apr 22 14:15:32.252591 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252319 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-kubelet\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.252591 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252367 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-kubelet\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.252591 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252383 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-etc-openvswitch\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.252591 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252415 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-kubernetes\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.252591 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252420 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-etc-openvswitch\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.252591 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2b976276-fdc8-4595-a9a4-76cc1b34317e-agent-certs\") pod \"konnectivity-agent-mj62z\" (UID: \"2b976276-fdc8-4595-a9a4-76cc1b34317e\") " pod="kube-system/konnectivity-agent-mj62z"
Apr 22 14:15:32.252591 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252468 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-cni-bin\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.252591 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-kubernetes\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.252591 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252493 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5880a8e9-777a-4921-b5f6-c6325c768bf2-ovnkube-config\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.252591 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-cni-bin\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.252591 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5880a8e9-777a-4921-b5f6-c6325c768bf2-env-overrides\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.252591 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252545 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-multus-cni-dir\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.252591 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252570 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3895a4bb-68d7-4a37-8937-3ce81c84a431-cni-binary-copy\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.252591 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252595 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbf11e8c-40e6-4fe9-b51b-b3a87ff88577-tmp-dir\") pod \"node-resolver-8h8cm\" (UID: \"fbf11e8c-40e6-4fe9-b51b-b3a87ff88577\") " pod="openshift-dns/node-resolver-8h8cm"
Apr 22 14:15:32.253231 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/110f68fd-5d58-411e-a7fc-980d5d6050e4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.253231 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/88aefe02-de5c-45b6-a697-2d18d8ae2754-iptables-alerter-script\") pod \"iptables-alerter-h5kfd\" (UID: \"88aefe02-de5c-45b6-a697-2d18d8ae2754\") " pod="openshift-network-operator/iptables-alerter-h5kfd"
Apr 22 14:15:32.253231 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252655 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-run-ovn-kubernetes\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.253231 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252680 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/110f68fd-5d58-411e-a7fc-980d5d6050e4-os-release\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.253231 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.252704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46zg5\" (UniqueName: \"kubernetes.io/projected/0d3f90ab-bb01-4735-b78f-250f7955b56f-kube-api-access-46zg5\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr"
Apr 22 14:15:32.253231 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.253076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/110f68fd-5d58-411e-a7fc-980d5d6050e4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.253231 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.253070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fs8qg\" (UniqueName: \"kubernetes.io/projected/fbf11e8c-40e6-4fe9-b51b-b3a87ff88577-kube-api-access-fs8qg\") pod \"node-resolver-8h8cm\" (UID: \"fbf11e8c-40e6-4fe9-b51b-b3a87ff88577\") " pod="openshift-dns/node-resolver-8h8cm"
Apr 22 14:15:32.253231 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.253221 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b185de97-4d30-47da-bb08-402ac8989235-serviceca\") pod \"node-ca-bzcbn\" (UID: \"b185de97-4d30-47da-bb08-402ac8989235\") " pod="openshift-image-registry/node-ca-bzcbn"
Apr 22 14:15:32.253592 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.253254 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5880a8e9-777a-4921-b5f6-c6325c768bf2-ovnkube-script-lib\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.253592 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.253268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3895a4bb-68d7-4a37-8937-3ce81c84a431-multus-daemon-config\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.253592 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.253320 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5880a8e9-777a-4921-b5f6-c6325c768bf2-ovnkube-config\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.253592 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.253367 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-multus-cni-dir\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.253592 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.253342 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 14:15:32.253592 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.253398 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-run-multus-certs\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.253592 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.253398 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-run-ovn-kubernetes\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.253592 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.253504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/110f68fd-5d58-411e-a7fc-980d5d6050e4-os-release\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.253592 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.253521 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3895a4bb-68d7-4a37-8937-3ce81c84a431-cni-binary-copy\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.254013 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.253828 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5880a8e9-777a-4921-b5f6-c6325c768bf2-env-overrides\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.254066 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.254020 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/110f68fd-5d58-411e-a7fc-980d5d6050e4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.254066 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.254036 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3895a4bb-68d7-4a37-8937-3ce81c84a431-multus-daemon-config\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.254409 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.254379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-run-multus-certs\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.254509 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.254438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-sys\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.254563 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.254472 2576 reconciler_common.go:224] "operationExecutor.MountVolume
started for volume \"kube-api-access-jvfl4\" (UniqueName: \"kubernetes.io/projected/35bb6914-dec8-4b09-9315-761449933a8a-kube-api-access-jvfl4\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq" Apr 22 14:15:32.254609 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.254587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" Apr 22 14:15:32.254653 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.254618 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-registration-dir\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" Apr 22 14:15:32.254694 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.254652 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhz6r\" (UniqueName: \"kubernetes.io/projected/b185de97-4d30-47da-bb08-402ac8989235-kube-api-access-rhz6r\") pod \"node-ca-bzcbn\" (UID: \"b185de97-4d30-47da-bb08-402ac8989235\") " pod="openshift-image-registry/node-ca-bzcbn" Apr 22 14:15:32.254694 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.254685 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-slash\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.254776 ip-10-0-142-195 
kubenswrapper[2576]: I0422 14:15:32.254716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-run-openvswitch\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.258286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.254860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzbm2\" (UniqueName: \"kubernetes.io/projected/5880a8e9-777a-4921-b5f6-c6325c768bf2-kube-api-access-tzbm2\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.258286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.254901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-run-k8s-cni-cncf-io\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp" Apr 22 14:15:32.258286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.254911 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-registration-dir\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" Apr 22 14:15:32.258286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.254932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-host\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq" Apr 
22 14:15:32.258286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.254958 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-etc-selinux\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" Apr 22 14:15:32.258286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.254970 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" Apr 22 14:15:32.258286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.254993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-run-ovn\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.258286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255038 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-run-openvswitch\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.258286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255058 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-sys\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq" Apr 
22 14:15:32.258286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-slash\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.258286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255151 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-run-ovn\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.258286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/88aefe02-de5c-45b6-a697-2d18d8ae2754-iptables-alerter-script\") pod \"iptables-alerter-h5kfd\" (UID: \"88aefe02-de5c-45b6-a697-2d18d8ae2754\") " pod="openshift-network-operator/iptables-alerter-h5kfd" Apr 22 14:15:32.258286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255209 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-run-k8s-cni-cncf-io\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp" Apr 22 14:15:32.258286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-etc-selinux\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" 
Apr 22 14:15:32.258286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255244 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-var-lib-cni-bin\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.258286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhk54\" (UniqueName: \"kubernetes.io/projected/110f68fd-5d58-411e-a7fc-980d5d6050e4-kube-api-access-jhk54\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.258286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-var-lib-kubelet\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/35bb6914-dec8-4b09-9315-761449933a8a-etc-tuned\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255335 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-var-lib-cni-bin\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-socket-dir\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-device-dir\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255392 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-var-lib-kubelet\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255417 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-node-log\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-multus-socket-dir-parent\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255453 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-host\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255474 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-systemd-units\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255514 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-systemd-units\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255521 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-os-release\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-var-lib-kubelet\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255575 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-os-release\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255585 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-sysctl-conf\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255624 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-node-log\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255658 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-sys-fs\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255689 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5880a8e9-777a-4921-b5f6-c6325c768bf2-ovn-node-metrics-cert\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.259101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255702 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-sysctl-conf\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.259792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255719 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqwrm\" (UniqueName: \"kubernetes.io/projected/88aefe02-de5c-45b6-a697-2d18d8ae2754-kube-api-access-pqwrm\") pod \"iptables-alerter-h5kfd\" (UID: \"88aefe02-de5c-45b6-a697-2d18d8ae2754\") " pod="openshift-network-operator/iptables-alerter-h5kfd"
Apr 22 14:15:32.259792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-socket-dir\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr"
Apr 22 14:15:32.259792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255765 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-multus-socket-dir-parent\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.259792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255769 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs\") pod \"network-metrics-daemon-r7psp\" (UID: \"dbee0e27-41ea-4d42-84c7-681872bfcda1\") " pod="openshift-multus/network-metrics-daemon-r7psp"
Apr 22 14:15:32.259792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-var-lib-kubelet\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.259792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255843 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-device-dir\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr"
Apr 22 14:15:32.259792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255858 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-run-systemd\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.259792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.259792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-system-cni-dir\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.259792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-var-lib-cni-multus\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.259792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.255984 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/110f68fd-5d58-411e-a7fc-980d5d6050e4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.259792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-lib-modules\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.259792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-var-lib-openvswitch\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.259792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256075 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-sysctl-d\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.259792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256103 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88aefe02-de5c-45b6-a697-2d18d8ae2754-host-slash\") pod \"iptables-alerter-h5kfd\" (UID: \"88aefe02-de5c-45b6-a697-2d18d8ae2754\") " pod="openshift-network-operator/iptables-alerter-h5kfd"
Apr 22 14:15:32.259792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-run-netns\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.259792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-log-socket\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.260537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256188 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-cni-netd\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.260537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256218 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-etc-kubernetes\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.260537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256245 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/110f68fd-5d58-411e-a7fc-980d5d6050e4-cnibin\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.260537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256269 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-modprobe-d\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.260537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256297 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/35bb6914-dec8-4b09-9315-761449933a8a-tmp\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.260537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256329 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2b976276-fdc8-4595-a9a4-76cc1b34317e-konnectivity-ca\") pod \"konnectivity-agent-mj62z\" (UID: \"2b976276-fdc8-4595-a9a4-76cc1b34317e\") " pod="kube-system/konnectivity-agent-mj62z"
Apr 22 14:15:32.260537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-multus-conf-dir\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.260537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/110f68fd-5d58-411e-a7fc-980d5d6050e4-cni-binary-copy\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.260537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk6vz\" (UniqueName: \"kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz\") pod \"network-check-target-rbpm2\" (UID: \"e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa\") " pod="openshift-network-diagnostics/network-check-target-rbpm2"
Apr 22 14:15:32.260537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-sysconfig\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.260537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256469 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-systemd\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.260537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-run\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.260537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fbf11e8c-40e6-4fe9-b51b-b3a87ff88577-hosts-file\") pod \"node-resolver-8h8cm\" (UID: \"fbf11e8c-40e6-4fe9-b51b-b3a87ff88577\") " pod="openshift-dns/node-resolver-8h8cm"
Apr 22 14:15:32.260537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256569 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbx7v\" (UniqueName: \"kubernetes.io/projected/dbee0e27-41ea-4d42-84c7-681872bfcda1-kube-api-access-dbx7v\") pod \"network-metrics-daemon-r7psp\" (UID: \"dbee0e27-41ea-4d42-84c7-681872bfcda1\") " pod="openshift-multus/network-metrics-daemon-r7psp"
Apr 22 14:15:32.260537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.260537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256708 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-etc-kubernetes\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.260537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0d3f90ab-bb01-4735-b78f-250f7955b56f-sys-fs\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr"
Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-var-lib-openvswitch\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.256932 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-system-cni-dir\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp"
Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.257106 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/110f68fd-5d58-411e-a7fc-980d5d6050e4-cni-binary-copy\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.257179 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-sysconfig\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.257192 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/110f68fd-5d58-411e-a7fc-980d5d6050e4-cnibin\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl"
Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.257489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-cni-netd\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p"
Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.257577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-systemd\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.257605 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2b976276-fdc8-4595-a9a4-76cc1b34317e-konnectivity-ca\") pod \"konnectivity-agent-mj62z\" (UID: \"2b976276-fdc8-4595-a9a4-76cc1b34317e\") " pod="kube-system/konnectivity-agent-mj62z"
Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.257643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-run\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq"
Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422
14:15:32.257663 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-multus-conf-dir\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp" Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.257686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-run-systemd\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.257727 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-host-run-netns\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.257764 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fbf11e8c-40e6-4fe9-b51b-b3a87ff88577-hosts-file\") pod \"node-resolver-8h8cm\" (UID: \"fbf11e8c-40e6-4fe9-b51b-b3a87ff88577\") " pod="openshift-dns/node-resolver-8h8cm" Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.257824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3895a4bb-68d7-4a37-8937-3ce81c84a431-host-var-lib-cni-multus\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp" Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.257837 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-modprobe-d\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq" Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.257877 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88aefe02-de5c-45b6-a697-2d18d8ae2754-host-slash\") pod \"iptables-alerter-h5kfd\" (UID: \"88aefe02-de5c-45b6-a697-2d18d8ae2754\") " pod="openshift-network-operator/iptables-alerter-h5kfd" Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.257895 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5880a8e9-777a-4921-b5f6-c6325c768bf2-log-socket\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.261253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.258001 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-lib-modules\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq" Apr 22 14:15:32.262064 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.258003 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/110f68fd-5d58-411e-a7fc-980d5d6050e4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl" Apr 22 14:15:32.262064 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.258329 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/35bb6914-dec8-4b09-9315-761449933a8a-etc-sysctl-d\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq" Apr 22 14:15:32.262064 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.258613 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5880a8e9-777a-4921-b5f6-c6325c768bf2-ovn-node-metrics-cert\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.262064 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.258820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2b976276-fdc8-4595-a9a4-76cc1b34317e-agent-certs\") pod \"konnectivity-agent-mj62z\" (UID: \"2b976276-fdc8-4595-a9a4-76cc1b34317e\") " pod="kube-system/konnectivity-agent-mj62z" Apr 22 14:15:32.262064 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.259033 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/35bb6914-dec8-4b09-9315-761449933a8a-etc-tuned\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq" Apr 22 14:15:32.262064 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.259564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/35bb6914-dec8-4b09-9315-761449933a8a-tmp\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq" Apr 22 14:15:32.262332 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.262145 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46zg5\" (UniqueName: 
\"kubernetes.io/projected/0d3f90ab-bb01-4735-b78f-250f7955b56f-kube-api-access-46zg5\") pod \"aws-ebs-csi-driver-node-rxlwr\" (UID: \"0d3f90ab-bb01-4735-b78f-250f7955b56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" Apr 22 14:15:32.263115 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.263097 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvfl4\" (UniqueName: \"kubernetes.io/projected/35bb6914-dec8-4b09-9315-761449933a8a-kube-api-access-jvfl4\") pod \"tuned-89qbq\" (UID: \"35bb6914-dec8-4b09-9315-761449933a8a\") " pod="openshift-cluster-node-tuning-operator/tuned-89qbq" Apr 22 14:15:32.263716 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.263695 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twfth\" (UniqueName: \"kubernetes.io/projected/3895a4bb-68d7-4a37-8937-3ce81c84a431-kube-api-access-twfth\") pod \"multus-7mzfp\" (UID: \"3895a4bb-68d7-4a37-8937-3ce81c84a431\") " pod="openshift-multus/multus-7mzfp" Apr 22 14:15:32.263954 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.263730 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzbm2\" (UniqueName: \"kubernetes.io/projected/5880a8e9-777a-4921-b5f6-c6325c768bf2-kube-api-access-tzbm2\") pod \"ovnkube-node-x7g6p\" (UID: \"5880a8e9-777a-4921-b5f6-c6325c768bf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.263954 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:32.263823 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:32.263954 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:32.263842 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:32.263954 
ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:32.263854 2576 projected.go:194] Error preparing data for projected volume kube-api-access-tk6vz for pod openshift-network-diagnostics/network-check-target-rbpm2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:32.263954 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:32.263932 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz podName:e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa nodeName:}" failed. No retries permitted until 2026-04-22 14:15:32.763902532 +0000 UTC m=+3.089636958 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-tk6vz" (UniqueName: "kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz") pod "network-check-target-rbpm2" (UID: "e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:32.264555 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.264537 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqwrm\" (UniqueName: \"kubernetes.io/projected/88aefe02-de5c-45b6-a697-2d18d8ae2754-kube-api-access-pqwrm\") pod \"iptables-alerter-h5kfd\" (UID: \"88aefe02-de5c-45b6-a697-2d18d8ae2754\") " pod="openshift-network-operator/iptables-alerter-h5kfd" Apr 22 14:15:32.264785 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.264766 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs8qg\" (UniqueName: \"kubernetes.io/projected/fbf11e8c-40e6-4fe9-b51b-b3a87ff88577-kube-api-access-fs8qg\") pod \"node-resolver-8h8cm\" (UID: \"fbf11e8c-40e6-4fe9-b51b-b3a87ff88577\") " pod="openshift-dns/node-resolver-8h8cm" 
Apr 22 14:15:32.264932 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.264913 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhk54\" (UniqueName: \"kubernetes.io/projected/110f68fd-5d58-411e-a7fc-980d5d6050e4-kube-api-access-jhk54\") pod \"multus-additional-cni-plugins-8pqgl\" (UID: \"110f68fd-5d58-411e-a7fc-980d5d6050e4\") " pod="openshift-multus/multus-additional-cni-plugins-8pqgl" Apr 22 14:15:32.357702 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.357673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b185de97-4d30-47da-bb08-402ac8989235-serviceca\") pod \"node-ca-bzcbn\" (UID: \"b185de97-4d30-47da-bb08-402ac8989235\") " pod="openshift-image-registry/node-ca-bzcbn" Apr 22 14:15:32.357859 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.357711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhz6r\" (UniqueName: \"kubernetes.io/projected/b185de97-4d30-47da-bb08-402ac8989235-kube-api-access-rhz6r\") pod \"node-ca-bzcbn\" (UID: \"b185de97-4d30-47da-bb08-402ac8989235\") " pod="openshift-image-registry/node-ca-bzcbn" Apr 22 14:15:32.357859 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.357741 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs\") pod \"network-metrics-daemon-r7psp\" (UID: \"dbee0e27-41ea-4d42-84c7-681872bfcda1\") " pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:32.357859 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.357796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbx7v\" (UniqueName: \"kubernetes.io/projected/dbee0e27-41ea-4d42-84c7-681872bfcda1-kube-api-access-dbx7v\") pod \"network-metrics-daemon-r7psp\" (UID: 
\"dbee0e27-41ea-4d42-84c7-681872bfcda1\") " pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:32.357978 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:32.357908 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:32.357978 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.357932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b185de97-4d30-47da-bb08-402ac8989235-host\") pod \"node-ca-bzcbn\" (UID: \"b185de97-4d30-47da-bb08-402ac8989235\") " pod="openshift-image-registry/node-ca-bzcbn" Apr 22 14:15:32.358048 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.358010 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b185de97-4d30-47da-bb08-402ac8989235-host\") pod \"node-ca-bzcbn\" (UID: \"b185de97-4d30-47da-bb08-402ac8989235\") " pod="openshift-image-registry/node-ca-bzcbn" Apr 22 14:15:32.358048 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:32.358010 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs podName:dbee0e27-41ea-4d42-84c7-681872bfcda1 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:32.857967472 +0000 UTC m=+3.183701872 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs") pod "network-metrics-daemon-r7psp" (UID: "dbee0e27-41ea-4d42-84c7-681872bfcda1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:32.358164 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.358144 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b185de97-4d30-47da-bb08-402ac8989235-serviceca\") pod \"node-ca-bzcbn\" (UID: \"b185de97-4d30-47da-bb08-402ac8989235\") " pod="openshift-image-registry/node-ca-bzcbn" Apr 22 14:15:32.366596 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.366574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhz6r\" (UniqueName: \"kubernetes.io/projected/b185de97-4d30-47da-bb08-402ac8989235-kube-api-access-rhz6r\") pod \"node-ca-bzcbn\" (UID: \"b185de97-4d30-47da-bb08-402ac8989235\") " pod="openshift-image-registry/node-ca-bzcbn" Apr 22 14:15:32.367005 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.366987 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbx7v\" (UniqueName: \"kubernetes.io/projected/dbee0e27-41ea-4d42-84c7-681872bfcda1-kube-api-access-dbx7v\") pod \"network-metrics-daemon-r7psp\" (UID: \"dbee0e27-41ea-4d42-84c7-681872bfcda1\") " pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:32.406242 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.406214 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:32.438461 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.438411 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:32.447062 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.447046 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mj62z" Apr 22 14:15:32.454671 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.454656 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7mzfp" Apr 22 14:15:32.459258 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.459239 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8pqgl" Apr 22 14:15:32.466727 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.466707 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-h5kfd" Apr 22 14:15:32.473280 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.473264 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" Apr 22 14:15:32.479792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.479778 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-89qbq" Apr 22 14:15:32.486316 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.486300 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8h8cm" Apr 22 14:15:32.491756 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.491741 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bzcbn" Apr 22 14:15:32.821370 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:32.821343 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88aefe02_de5c_45b6_a697_2d18d8ae2754.slice/crio-2a7a5a536bc6cf7810b7eb6e3d34d9b5022f228e49495888f959a3d2da990559 WatchSource:0}: Error finding container 2a7a5a536bc6cf7810b7eb6e3d34d9b5022f228e49495888f959a3d2da990559: Status 404 returned error can't find the container with id 2a7a5a536bc6cf7810b7eb6e3d34d9b5022f228e49495888f959a3d2da990559 Apr 22 14:15:32.822076 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:32.822034 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5880a8e9_777a_4921_b5f6_c6325c768bf2.slice/crio-454a46b08fee849c74c9ce284caf706355135a8d42df9d3fa99a0ac306e21891 WatchSource:0}: Error finding container 454a46b08fee849c74c9ce284caf706355135a8d42df9d3fa99a0ac306e21891: Status 404 returned error can't find the container with id 454a46b08fee849c74c9ce284caf706355135a8d42df9d3fa99a0ac306e21891 Apr 22 14:15:32.823891 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:32.823859 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35bb6914_dec8_4b09_9315_761449933a8a.slice/crio-4f49069d5b6c6544ae7fe0652e3addcb5611e32d1c905975db73c38e471df253 WatchSource:0}: Error finding container 4f49069d5b6c6544ae7fe0652e3addcb5611e32d1c905975db73c38e471df253: Status 404 returned error can't find the container with id 4f49069d5b6c6544ae7fe0652e3addcb5611e32d1c905975db73c38e471df253 Apr 22 14:15:32.826442 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:32.826419 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110f68fd_5d58_411e_a7fc_980d5d6050e4.slice/crio-c8d96f3942b501001acf569e3bc3fa6c4175382e997405c10f1516632a7107cc WatchSource:0}: Error finding container c8d96f3942b501001acf569e3bc3fa6c4175382e997405c10f1516632a7107cc: Status 404 returned error can't find the container with id c8d96f3942b501001acf569e3bc3fa6c4175382e997405c10f1516632a7107cc Apr 22 14:15:32.827428 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:32.827359 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b976276_fdc8_4595_a9a4_76cc1b34317e.slice/crio-786d45cc175c4e0dbabfdce16514cf986873841a2fd115f34464485dcd779920 WatchSource:0}: Error finding container 786d45cc175c4e0dbabfdce16514cf986873841a2fd115f34464485dcd779920: Status 404 returned error can't find the container with id 786d45cc175c4e0dbabfdce16514cf986873841a2fd115f34464485dcd779920 Apr 22 14:15:32.828224 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:32.828126 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d3f90ab_bb01_4735_b78f_250f7955b56f.slice/crio-f50f2ca47c1929eb4e85e991863a6679eac865cead02cbf3831b379bda9ea891 WatchSource:0}: Error finding container f50f2ca47c1929eb4e85e991863a6679eac865cead02cbf3831b379bda9ea891: Status 404 returned error can't find the container with id f50f2ca47c1929eb4e85e991863a6679eac865cead02cbf3831b379bda9ea891 Apr 22 14:15:32.828790 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:32.828763 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3895a4bb_68d7_4a37_8937_3ce81c84a431.slice/crio-167d43096f1263aa784bb51934df1d77dc7b16aaa6c2c88d824901376306a1c8 WatchSource:0}: Error finding container 167d43096f1263aa784bb51934df1d77dc7b16aaa6c2c88d824901376306a1c8: Status 404 returned error can't find 
the container with id 167d43096f1263aa784bb51934df1d77dc7b16aaa6c2c88d824901376306a1c8 Apr 22 14:15:32.830210 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:32.830186 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb185de97_4d30_47da_bb08_402ac8989235.slice/crio-315903ead95e905760d11ff3bddfb09564be408a27ce2672d919647b3bbc3590 WatchSource:0}: Error finding container 315903ead95e905760d11ff3bddfb09564be408a27ce2672d919647b3bbc3590: Status 404 returned error can't find the container with id 315903ead95e905760d11ff3bddfb09564be408a27ce2672d919647b3bbc3590 Apr 22 14:15:32.831335 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:15:32.831210 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbf11e8c_40e6_4fe9_b51b_b3a87ff88577.slice/crio-cbdd4f52b211e2f23916f2a21f5fc84d0cb9fc29af7aa65bb2d287dbc1e8037d WatchSource:0}: Error finding container cbdd4f52b211e2f23916f2a21f5fc84d0cb9fc29af7aa65bb2d287dbc1e8037d: Status 404 returned error can't find the container with id cbdd4f52b211e2f23916f2a21f5fc84d0cb9fc29af7aa65bb2d287dbc1e8037d Apr 22 14:15:32.861658 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.861529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk6vz\" (UniqueName: \"kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz\") pod \"network-check-target-rbpm2\" (UID: \"e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa\") " pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:32.861745 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:32.861668 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:32.861745 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:32.861689 2576 projected.go:289] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:32.861745 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:32.861701 2576 projected.go:194] Error preparing data for projected volume kube-api-access-tk6vz for pod openshift-network-diagnostics/network-check-target-rbpm2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:32.861745 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:32.861709 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs\") pod \"network-metrics-daemon-r7psp\" (UID: \"dbee0e27-41ea-4d42-84c7-681872bfcda1\") " pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:32.861973 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:32.861750 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz podName:e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa nodeName:}" failed. No retries permitted until 2026-04-22 14:15:33.861731886 +0000 UTC m=+4.187466284 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tk6vz" (UniqueName: "kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz") pod "network-check-target-rbpm2" (UID: "e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:32.861973 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:32.861837 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:32.861973 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:32.861898 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs podName:dbee0e27-41ea-4d42-84c7-681872bfcda1 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:33.861878381 +0000 UTC m=+4.187612789 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs") pod "network-metrics-daemon-r7psp" (UID: "dbee0e27-41ea-4d42-84c7-681872bfcda1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:33.105165 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:33.105085 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:33.185630 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:33.185586 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:31 +0000 UTC" deadline="2028-01-02 08:55:55.282033406 +0000 UTC" Apr 22 14:15:33.185630 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:33.185618 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14874h40m22.096418519s" Apr 22 14:15:33.302885 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:33.302454 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7mzfp" event={"ID":"3895a4bb-68d7-4a37-8937-3ce81c84a431","Type":"ContainerStarted","Data":"167d43096f1263aa784bb51934df1d77dc7b16aaa6c2c88d824901376306a1c8"} Apr 22 14:15:33.305582 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:33.305518 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" event={"ID":"0d3f90ab-bb01-4735-b78f-250f7955b56f","Type":"ContainerStarted","Data":"f50f2ca47c1929eb4e85e991863a6679eac865cead02cbf3831b379bda9ea891"} Apr 22 14:15:33.317427 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:33.317363 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mj62z" 
event={"ID":"2b976276-fdc8-4595-a9a4-76cc1b34317e","Type":"ContainerStarted","Data":"786d45cc175c4e0dbabfdce16514cf986873841a2fd115f34464485dcd779920"} Apr 22 14:15:33.325348 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:33.325238 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h5kfd" event={"ID":"88aefe02-de5c-45b6-a697-2d18d8ae2754","Type":"ContainerStarted","Data":"2a7a5a536bc6cf7810b7eb6e3d34d9b5022f228e49495888f959a3d2da990559"} Apr 22 14:15:33.328832 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:33.328520 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-195.ec2.internal" event={"ID":"24b00f41063a53ab6b85ee845aa19b10","Type":"ContainerStarted","Data":"21200ef470406e5447530745a71cc323d78aaadc6fe18456d35cfc19f0c5d064"} Apr 22 14:15:33.334663 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:33.334638 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bzcbn" event={"ID":"b185de97-4d30-47da-bb08-402ac8989235","Type":"ContainerStarted","Data":"315903ead95e905760d11ff3bddfb09564be408a27ce2672d919647b3bbc3590"} Apr 22 14:15:33.340300 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:33.340228 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8pqgl" event={"ID":"110f68fd-5d58-411e-a7fc-980d5d6050e4","Type":"ContainerStarted","Data":"c8d96f3942b501001acf569e3bc3fa6c4175382e997405c10f1516632a7107cc"} Apr 22 14:15:33.342269 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:33.342219 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-89qbq" event={"ID":"35bb6914-dec8-4b09-9315-761449933a8a","Type":"ContainerStarted","Data":"4f49069d5b6c6544ae7fe0652e3addcb5611e32d1c905975db73c38e471df253"} Apr 22 14:15:33.345516 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:33.345478 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" event={"ID":"5880a8e9-777a-4921-b5f6-c6325c768bf2","Type":"ContainerStarted","Data":"454a46b08fee849c74c9ce284caf706355135a8d42df9d3fa99a0ac306e21891"} Apr 22 14:15:33.359111 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:33.359051 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8h8cm" event={"ID":"fbf11e8c-40e6-4fe9-b51b-b3a87ff88577","Type":"ContainerStarted","Data":"cbdd4f52b211e2f23916f2a21f5fc84d0cb9fc29af7aa65bb2d287dbc1e8037d"} Apr 22 14:15:33.870626 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:33.870597 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs\") pod \"network-metrics-daemon-r7psp\" (UID: \"dbee0e27-41ea-4d42-84c7-681872bfcda1\") " pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:33.870744 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:33.870651 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk6vz\" (UniqueName: \"kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz\") pod \"network-check-target-rbpm2\" (UID: \"e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa\") " pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:33.870853 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:33.870835 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:33.870913 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:33.870860 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:33.870913 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:33.870873 2576 
projected.go:194] Error preparing data for projected volume kube-api-access-tk6vz for pod openshift-network-diagnostics/network-check-target-rbpm2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:33.871010 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:33.870930 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz podName:e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa nodeName:}" failed. No retries permitted until 2026-04-22 14:15:35.87091073 +0000 UTC m=+6.196645143 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-tk6vz" (UniqueName: "kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz") pod "network-check-target-rbpm2" (UID: "e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:33.871365 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:33.871328 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:33.871456 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:33.871380 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs podName:dbee0e27-41ea-4d42-84c7-681872bfcda1 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:35.871366228 +0000 UTC m=+6.197100640 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs") pod "network-metrics-daemon-r7psp" (UID: "dbee0e27-41ea-4d42-84c7-681872bfcda1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:34.290941 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:34.290245 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:34.290941 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:34.290361 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbpm2" podUID="e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa" Apr 22 14:15:34.290941 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:34.290764 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:34.290941 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:34.290880 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r7psp" podUID="dbee0e27-41ea-4d42-84c7-681872bfcda1" Apr 22 14:15:34.367565 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:34.367536 2576 generic.go:358] "Generic (PLEG): container finished" podID="b865d1ca2ef1f242e56d7c94255bbcb7" containerID="d6d874ac7d51904b565cf9c2620a7da40df94e86a420d181bf6520c105ca2753" exitCode=0 Apr 22 14:15:34.367726 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:34.367568 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal" event={"ID":"b865d1ca2ef1f242e56d7c94255bbcb7","Type":"ContainerDied","Data":"d6d874ac7d51904b565cf9c2620a7da40df94e86a420d181bf6520c105ca2753"} Apr 22 14:15:34.380732 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:34.380679 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-195.ec2.internal" podStartSLOduration=3.380662828 podStartE2EDuration="3.380662828s" podCreationTimestamp="2026-04-22 14:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:33.345040075 +0000 UTC m=+3.670774491" watchObservedRunningTime="2026-04-22 14:15:34.380662828 +0000 UTC m=+4.706397244" Apr 22 14:15:35.379159 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:35.379127 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal" event={"ID":"b865d1ca2ef1f242e56d7c94255bbcb7","Type":"ContainerStarted","Data":"b089a9da8759d6718d60761d4120e86292e697c27b9bff2643469ee75f6a0092"} Apr 22 14:15:35.393862 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:35.393818 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-195.ec2.internal" podStartSLOduration=4.393788524 
podStartE2EDuration="4.393788524s" podCreationTimestamp="2026-04-22 14:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:35.393619468 +0000 UTC m=+5.719353885" watchObservedRunningTime="2026-04-22 14:15:35.393788524 +0000 UTC m=+5.719522940" Apr 22 14:15:35.884417 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:35.884375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs\") pod \"network-metrics-daemon-r7psp\" (UID: \"dbee0e27-41ea-4d42-84c7-681872bfcda1\") " pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:35.884576 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:35.884437 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk6vz\" (UniqueName: \"kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz\") pod \"network-check-target-rbpm2\" (UID: \"e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa\") " pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:35.884650 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:35.884596 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:35.884650 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:35.884615 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:35.884650 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:35.884630 2576 projected.go:194] Error preparing data for projected volume kube-api-access-tk6vz for pod openshift-network-diagnostics/network-check-target-rbpm2: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:35.884825 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:35.884683 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz podName:e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa nodeName:}" failed. No retries permitted until 2026-04-22 14:15:39.884664716 +0000 UTC m=+10.210399114 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-tk6vz" (UniqueName: "kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz") pod "network-check-target-rbpm2" (UID: "e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:35.885098 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:35.884999 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:35.885098 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:35.885065 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs podName:dbee0e27-41ea-4d42-84c7-681872bfcda1 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:39.885048146 +0000 UTC m=+10.210782543 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs") pod "network-metrics-daemon-r7psp" (UID: "dbee0e27-41ea-4d42-84c7-681872bfcda1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:36.288842 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:36.288317 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:36.288842 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:36.288419 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbpm2" podUID="e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa" Apr 22 14:15:36.288842 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:36.288564 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:36.288842 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:36.288654 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r7psp" podUID="dbee0e27-41ea-4d42-84c7-681872bfcda1" Apr 22 14:15:38.289053 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:38.288990 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:38.289053 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:38.289052 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:38.289521 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:38.289149 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbpm2" podUID="e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa" Apr 22 14:15:38.289662 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:38.289632 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r7psp" podUID="dbee0e27-41ea-4d42-84c7-681872bfcda1" Apr 22 14:15:39.918975 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:39.918449 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk6vz\" (UniqueName: \"kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz\") pod \"network-check-target-rbpm2\" (UID: \"e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa\") " pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:39.918975 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:39.918535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs\") pod \"network-metrics-daemon-r7psp\" (UID: \"dbee0e27-41ea-4d42-84c7-681872bfcda1\") " pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:39.918975 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:39.918649 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:39.918975 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:39.918720 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs podName:dbee0e27-41ea-4d42-84c7-681872bfcda1 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:47.918687614 +0000 UTC m=+18.244422029 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs") pod "network-metrics-daemon-r7psp" (UID: "dbee0e27-41ea-4d42-84c7-681872bfcda1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:39.918975 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:39.918799 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:39.918975 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:39.918828 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:39.918975 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:39.918841 2576 projected.go:194] Error preparing data for projected volume kube-api-access-tk6vz for pod openshift-network-diagnostics/network-check-target-rbpm2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:39.918975 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:39.918880 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz podName:e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa nodeName:}" failed. No retries permitted until 2026-04-22 14:15:47.918868216 +0000 UTC m=+18.244602629 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tk6vz" (UniqueName: "kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz") pod "network-check-target-rbpm2" (UID: "e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:40.289365 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:40.288873 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:40.289365 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:40.288978 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbpm2" podUID="e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa" Apr 22 14:15:40.289365 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:40.289041 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:40.289365 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:40.289132 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r7psp" podUID="dbee0e27-41ea-4d42-84c7-681872bfcda1" Apr 22 14:15:42.290546 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:42.290517 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:42.290976 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:42.290518 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:42.290976 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:42.290640 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r7psp" podUID="dbee0e27-41ea-4d42-84c7-681872bfcda1" Apr 22 14:15:42.290976 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:42.290705 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbpm2" podUID="e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa" Apr 22 14:15:44.290419 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:44.290388 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:44.290419 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:44.290406 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:44.290927 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:44.290486 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbpm2" podUID="e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa" Apr 22 14:15:44.290927 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:44.290610 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r7psp" podUID="dbee0e27-41ea-4d42-84c7-681872bfcda1" Apr 22 14:15:46.288665 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:46.288629 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:46.289132 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:46.288680 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:46.289132 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:46.288781 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r7psp" podUID="dbee0e27-41ea-4d42-84c7-681872bfcda1" Apr 22 14:15:46.289132 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:46.288891 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbpm2" podUID="e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa" Apr 22 14:15:47.983148 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:47.983114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs\") pod \"network-metrics-daemon-r7psp\" (UID: \"dbee0e27-41ea-4d42-84c7-681872bfcda1\") " pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:47.983622 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:47.983159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk6vz\" (UniqueName: \"kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz\") pod \"network-check-target-rbpm2\" (UID: \"e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa\") " pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:47.983622 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:47.983274 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:47.983622 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:47.983289 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:47.983622 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:47.983305 2576 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:47.983622 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:47.983317 2576 projected.go:194] Error preparing data for projected volume kube-api-access-tk6vz for pod openshift-network-diagnostics/network-check-target-rbpm2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:47.983622 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:47.983332 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs podName:dbee0e27-41ea-4d42-84c7-681872bfcda1 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:03.983317704 +0000 UTC m=+34.309052102 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs") pod "network-metrics-daemon-r7psp" (UID: "dbee0e27-41ea-4d42-84c7-681872bfcda1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:47.983622 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:47.983361 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz podName:e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa nodeName:}" failed. No retries permitted until 2026-04-22 14:16:03.983346871 +0000 UTC m=+34.309081280 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tk6vz" (UniqueName: "kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz") pod "network-check-target-rbpm2" (UID: "e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:48.290260 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:48.290192 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:48.290412 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:48.290195 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:48.290412 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:48.290342 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbpm2" podUID="e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa" Apr 22 14:15:48.290516 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:48.290471 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r7psp" podUID="dbee0e27-41ea-4d42-84c7-681872bfcda1" Apr 22 14:15:49.401640 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:49.401409 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bzcbn" event={"ID":"b185de97-4d30-47da-bb08-402ac8989235","Type":"ContainerStarted","Data":"7c4407b7365eccf200ed0ac0cc5f4a2e095449b1732087311e31c4694afa2fb6"} Apr 22 14:15:49.402597 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:49.402572 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8pqgl" event={"ID":"110f68fd-5d58-411e-a7fc-980d5d6050e4","Type":"ContainerStarted","Data":"abaa1644cc3a1e000c1b4f4f75b209ffe21d69c6e5c25c873b9d755d347f30cd"} Apr 22 14:15:49.403716 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:49.403686 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-89qbq" event={"ID":"35bb6914-dec8-4b09-9315-761449933a8a","Type":"ContainerStarted","Data":"156d7696fa7fb342f8d0052b57532f2be479956a4e315ae83aaa78ccf7ae2c3b"} Apr 22 14:15:49.405199 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:49.405180 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" event={"ID":"5880a8e9-777a-4921-b5f6-c6325c768bf2","Type":"ContainerStarted","Data":"78eabeb027424585c39ec2194dde8e1e70790186d0f70dad3c2232c0e0f1aa59"} Apr 22 14:15:49.405199 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:49.405204 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" event={"ID":"5880a8e9-777a-4921-b5f6-c6325c768bf2","Type":"ContainerStarted","Data":"44fd5328c6c459b406df4bc6de63b165077d72b8dde2c7101ccfcc485eb560ca"} Apr 22 14:15:49.406565 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:49.406507 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8h8cm" 
event={"ID":"fbf11e8c-40e6-4fe9-b51b-b3a87ff88577","Type":"ContainerStarted","Data":"2c8113b8ebe35298b33d3dd57ce8ed59e5abefbd8498ea03327a469bda5845eb"} Apr 22 14:15:49.407698 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:49.407673 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7mzfp" event={"ID":"3895a4bb-68d7-4a37-8937-3ce81c84a431","Type":"ContainerStarted","Data":"dcb2bc4acd1cd03a4a845ab1f02814caa2a51a458bca0fd76c0da28aacf22ea9"} Apr 22 14:15:49.408775 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:49.408758 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" event={"ID":"0d3f90ab-bb01-4735-b78f-250f7955b56f","Type":"ContainerStarted","Data":"7f50675236a505d173dc42e9ffaaefd6212c86f4507c4ec23f4ad730e671f97d"} Apr 22 14:15:49.409764 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:49.409748 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mj62z" event={"ID":"2b976276-fdc8-4595-a9a4-76cc1b34317e","Type":"ContainerStarted","Data":"8f9598b0442c133af45e460c084a6cb3c0822cb56569380241207a5aa97cc394"} Apr 22 14:15:49.417025 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:49.416990 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bzcbn" podStartSLOduration=10.850756813 podStartE2EDuration="19.41698044s" podCreationTimestamp="2026-04-22 14:15:30 +0000 UTC" firstStartedPulling="2026-04-22 14:15:32.831857345 +0000 UTC m=+3.157591739" lastFinishedPulling="2026-04-22 14:15:41.398080953 +0000 UTC m=+11.723815366" observedRunningTime="2026-04-22 14:15:49.416939846 +0000 UTC m=+19.742674417" watchObservedRunningTime="2026-04-22 14:15:49.41698044 +0000 UTC m=+19.742714854" Apr 22 14:15:49.455515 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:49.455483 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-node-tuning-operator/tuned-89qbq" podStartSLOduration=3.220438791 podStartE2EDuration="19.455474255s" podCreationTimestamp="2026-04-22 14:15:30 +0000 UTC" firstStartedPulling="2026-04-22 14:15:32.826062889 +0000 UTC m=+3.151797296" lastFinishedPulling="2026-04-22 14:15:49.061098364 +0000 UTC m=+19.386832760" observedRunningTime="2026-04-22 14:15:49.455289528 +0000 UTC m=+19.781023953" watchObservedRunningTime="2026-04-22 14:15:49.455474255 +0000 UTC m=+19.781208669" Apr 22 14:15:49.470392 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:49.470357 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-mj62z" podStartSLOduration=3.441466472 podStartE2EDuration="19.470343311s" podCreationTimestamp="2026-04-22 14:15:30 +0000 UTC" firstStartedPulling="2026-04-22 14:15:32.829568061 +0000 UTC m=+3.155302469" lastFinishedPulling="2026-04-22 14:15:48.85844491 +0000 UTC m=+19.184179308" observedRunningTime="2026-04-22 14:15:49.470205013 +0000 UTC m=+19.795939419" watchObservedRunningTime="2026-04-22 14:15:49.470343311 +0000 UTC m=+19.796077726" Apr 22 14:15:49.488201 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:49.488158 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7mzfp" podStartSLOduration=3.235062185 podStartE2EDuration="19.488144422s" podCreationTimestamp="2026-04-22 14:15:30 +0000 UTC" firstStartedPulling="2026-04-22 14:15:32.830963921 +0000 UTC m=+3.156698320" lastFinishedPulling="2026-04-22 14:15:49.084046149 +0000 UTC m=+19.409780557" observedRunningTime="2026-04-22 14:15:49.487208699 +0000 UTC m=+19.812943114" watchObservedRunningTime="2026-04-22 14:15:49.488144422 +0000 UTC m=+19.813878834" Apr 22 14:15:50.288677 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:50.288654 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:50.288789 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:50.288770 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r7psp" podUID="dbee0e27-41ea-4d42-84c7-681872bfcda1" Apr 22 14:15:50.288880 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:50.288860 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:50.288989 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:50.288970 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rbpm2" podUID="e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa" Apr 22 14:15:50.293992 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:50.293975 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 14:15:50.412318 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:50.412297 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" event={"ID":"0d3f90ab-bb01-4735-b78f-250f7955b56f","Type":"ContainerStarted","Data":"b37c3f6d1450a2495e19256fd16c22f7bed46860c2042ffea59b67a313de96f4"} Apr 22 14:15:50.413489 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:50.413468 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h5kfd" event={"ID":"88aefe02-de5c-45b6-a697-2d18d8ae2754","Type":"ContainerStarted","Data":"7bfbb421d1d9459ff4b81ca868571f372ded3f1c7ae572d831428412d676975a"} Apr 22 14:15:50.414707 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:50.414684 2576 generic.go:358] "Generic (PLEG): container finished" podID="110f68fd-5d58-411e-a7fc-980d5d6050e4" containerID="abaa1644cc3a1e000c1b4f4f75b209ffe21d69c6e5c25c873b9d755d347f30cd" exitCode=0 Apr 22 14:15:50.414774 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:50.414741 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8pqgl" event={"ID":"110f68fd-5d58-411e-a7fc-980d5d6050e4","Type":"ContainerDied","Data":"abaa1644cc3a1e000c1b4f4f75b209ffe21d69c6e5c25c873b9d755d347f30cd"} Apr 22 14:15:50.419744 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:50.419729 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 14:15:50.420041 ip-10-0-142-195 kubenswrapper[2576]: I0422 
14:15:50.420024 2576 generic.go:358] "Generic (PLEG): container finished" podID="5880a8e9-777a-4921-b5f6-c6325c768bf2" containerID="78eabeb027424585c39ec2194dde8e1e70790186d0f70dad3c2232c0e0f1aa59" exitCode=1 Apr 22 14:15:50.420148 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:50.420120 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" event={"ID":"5880a8e9-777a-4921-b5f6-c6325c768bf2","Type":"ContainerDied","Data":"78eabeb027424585c39ec2194dde8e1e70790186d0f70dad3c2232c0e0f1aa59"} Apr 22 14:15:50.420228 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:50.420156 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" event={"ID":"5880a8e9-777a-4921-b5f6-c6325c768bf2","Type":"ContainerStarted","Data":"e66b6437f41e272132f53e781d2c686cbc85b729b389b32c87d4beafd4098547"} Apr 22 14:15:50.420228 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:50.420167 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" event={"ID":"5880a8e9-777a-4921-b5f6-c6325c768bf2","Type":"ContainerStarted","Data":"d73ff8a624351d3d610ab3bf09fdc61a04b1874907b2cd302815b070405e93e9"} Apr 22 14:15:50.420228 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:50.420175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" event={"ID":"5880a8e9-777a-4921-b5f6-c6325c768bf2","Type":"ContainerStarted","Data":"6f82f8574bddc5b0430e3012329b1ce804a6a32d10b436de4ccd428a087dd757"} Apr 22 14:15:50.420228 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:50.420182 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" event={"ID":"5880a8e9-777a-4921-b5f6-c6325c768bf2","Type":"ContainerStarted","Data":"8be24bbcbb23417df96dfb26877dbdff75e5322daa45354cc69baa40bf190d69"} Apr 22 14:15:50.431975 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:50.431939 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8h8cm" podStartSLOduration=4.206163967 podStartE2EDuration="20.431930082s" podCreationTimestamp="2026-04-22 14:15:30 +0000 UTC" firstStartedPulling="2026-04-22 14:15:32.833540263 +0000 UTC m=+3.159274662" lastFinishedPulling="2026-04-22 14:15:49.059306369 +0000 UTC m=+19.385040777" observedRunningTime="2026-04-22 14:15:49.502991195 +0000 UTC m=+19.828725609" watchObservedRunningTime="2026-04-22 14:15:50.431930082 +0000 UTC m=+20.757664542" Apr 22 14:15:50.432282 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:50.432255 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-h5kfd" podStartSLOduration=4.208615591 podStartE2EDuration="20.43224971s" podCreationTimestamp="2026-04-22 14:15:30 +0000 UTC" firstStartedPulling="2026-04-22 14:15:32.823262281 +0000 UTC m=+3.148996674" lastFinishedPulling="2026-04-22 14:15:49.046896386 +0000 UTC m=+19.372630793" observedRunningTime="2026-04-22 14:15:50.431636015 +0000 UTC m=+20.757370429" watchObservedRunningTime="2026-04-22 14:15:50.43224971 +0000 UTC m=+20.757984123" Apr 22 14:15:51.208142 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:51.208015 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T14:15:50.29398866Z","UUID":"9e803d9e-d356-4d22-adf8-1e05c4d5769f","Handler":null,"Name":"","Endpoint":""} Apr 22 14:15:51.209735 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:51.209714 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 14:15:51.209735 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:51.209739 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at 
endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 14:15:52.288218 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:52.288048 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:52.288638 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:52.288098 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:52.288638 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:52.288342 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbpm2" podUID="e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa" Apr 22 14:15:52.288638 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:52.288410 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r7psp" podUID="dbee0e27-41ea-4d42-84c7-681872bfcda1" Apr 22 14:15:52.426634 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:52.426611 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 14:15:52.427048 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:52.427027 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" event={"ID":"5880a8e9-777a-4921-b5f6-c6325c768bf2","Type":"ContainerStarted","Data":"c6683573a6219d7937a55f347ec8fd27f7641ae909a3fde3bc6b8bd915eb93ec"} Apr 22 14:15:52.428921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:52.428891 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" event={"ID":"0d3f90ab-bb01-4735-b78f-250f7955b56f","Type":"ContainerStarted","Data":"c21edc27ca50cf44d7092c566fd259520f4ad3eb3d7b8f67c84ebc4af01dd006"} Apr 22 14:15:52.448661 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:52.448618 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rxlwr" podStartSLOduration=3.95463542 podStartE2EDuration="22.448605279s" podCreationTimestamp="2026-04-22 14:15:30 +0000 UTC" firstStartedPulling="2026-04-22 14:15:32.829920441 +0000 UTC m=+3.155654835" lastFinishedPulling="2026-04-22 14:15:51.323890297 +0000 UTC m=+21.649624694" observedRunningTime="2026-04-22 14:15:52.448514886 +0000 UTC m=+22.774249299" watchObservedRunningTime="2026-04-22 14:15:52.448605279 +0000 UTC m=+22.774339694" Apr 22 14:15:53.483504 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:53.483464 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-mj62z" Apr 22 14:15:53.484352 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:53.484334 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-mj62z" Apr 22 14:15:54.287911 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:54.287881 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:54.288047 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:54.287894 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:54.288047 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:54.287972 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbpm2" podUID="e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa" Apr 22 14:15:54.288141 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:54.288075 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r7psp" podUID="dbee0e27-41ea-4d42-84c7-681872bfcda1" Apr 22 14:15:54.436135 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:54.435985 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 14:15:54.436525 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:54.436319 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" event={"ID":"5880a8e9-777a-4921-b5f6-c6325c768bf2","Type":"ContainerStarted","Data":"7ed2859f982f2bfa9dee06db4d9691d2c702fa437482deb72febd85a051fbad0"} Apr 22 14:15:54.436970 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:54.436679 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-mj62z" Apr 22 14:15:54.436970 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:54.436791 2576 scope.go:117] "RemoveContainer" containerID="78eabeb027424585c39ec2194dde8e1e70790186d0f70dad3c2232c0e0f1aa59" Apr 22 14:15:54.437437 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:54.437258 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-mj62z" Apr 22 14:15:55.439295 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:55.439262 2576 generic.go:358] "Generic (PLEG): container finished" podID="110f68fd-5d58-411e-a7fc-980d5d6050e4" containerID="3a5e88bd56ec03c862eb340c0e5a8a98c68edcd142230f4ecc05a878a181b51a" exitCode=0 Apr 22 14:15:55.439894 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:55.439336 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8pqgl" event={"ID":"110f68fd-5d58-411e-a7fc-980d5d6050e4","Type":"ContainerDied","Data":"3a5e88bd56ec03c862eb340c0e5a8a98c68edcd142230f4ecc05a878a181b51a"} Apr 22 14:15:55.442400 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:55.442380 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 14:15:55.442716 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:55.442694 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" event={"ID":"5880a8e9-777a-4921-b5f6-c6325c768bf2","Type":"ContainerStarted","Data":"739abb6651cc5ab1721de6f58434c060bb6bed34dea1c9e1f1043b73c4426cee"} Apr 22 14:15:55.445364 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:55.443732 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:55.445364 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:55.443762 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:55.445364 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:55.443785 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:55.459458 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:55.459437 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:55.463179 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:55.463162 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:15:55.503337 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:55.503292 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" podStartSLOduration=9.214880872 podStartE2EDuration="25.503281536s" podCreationTimestamp="2026-04-22 14:15:30 +0000 UTC" firstStartedPulling="2026-04-22 14:15:32.824022188 +0000 UTC m=+3.149756590" lastFinishedPulling="2026-04-22 14:15:49.112422848 
+0000 UTC m=+19.438157254" observedRunningTime="2026-04-22 14:15:55.50279545 +0000 UTC m=+25.828529864" watchObservedRunningTime="2026-04-22 14:15:55.503281536 +0000 UTC m=+25.829015951" Apr 22 14:15:56.288471 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:56.288444 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:56.288633 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:56.288445 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:56.288633 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:56.288531 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbpm2" podUID="e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa" Apr 22 14:15:56.288633 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:56.288619 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r7psp" podUID="dbee0e27-41ea-4d42-84c7-681872bfcda1" Apr 22 14:15:56.324901 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:56.324873 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rbpm2"] Apr 22 14:15:56.325469 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:56.325447 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-r7psp"] Apr 22 14:15:56.444581 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:56.444520 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:56.444581 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:56.444519 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:56.445055 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:56.444679 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r7psp" podUID="dbee0e27-41ea-4d42-84c7-681872bfcda1" Apr 22 14:15:56.445055 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:56.444771 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rbpm2" podUID="e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa" Apr 22 14:15:57.447949 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:57.447756 2576 generic.go:358] "Generic (PLEG): container finished" podID="110f68fd-5d58-411e-a7fc-980d5d6050e4" containerID="715aec6978e034d48b0a6eec62442f4d6c473f789a01a8722d994955c562691c" exitCode=0 Apr 22 14:15:57.447949 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:57.447841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8pqgl" event={"ID":"110f68fd-5d58-411e-a7fc-980d5d6050e4","Type":"ContainerDied","Data":"715aec6978e034d48b0a6eec62442f4d6c473f789a01a8722d994955c562691c"} Apr 22 14:15:58.288821 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:58.288492 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:15:58.288821 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:58.288492 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:15:58.288821 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:58.288619 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbpm2" podUID="e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa" Apr 22 14:15:58.288821 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:15:58.288715 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r7psp" podUID="dbee0e27-41ea-4d42-84c7-681872bfcda1" Apr 22 14:15:59.452988 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:59.452951 2576 generic.go:358] "Generic (PLEG): container finished" podID="110f68fd-5d58-411e-a7fc-980d5d6050e4" containerID="96dd74887679e5db9594290e5d873b9b45365087a46bdcb7b54243c3a93645ea" exitCode=0 Apr 22 14:15:59.453479 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:15:59.453025 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8pqgl" event={"ID":"110f68fd-5d58-411e-a7fc-980d5d6050e4","Type":"ContainerDied","Data":"96dd74887679e5db9594290e5d873b9b45365087a46bdcb7b54243c3a93645ea"} Apr 22 14:16:00.288942 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:00.288908 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:16:00.289117 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:00.289014 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rbpm2" podUID="e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa" Apr 22 14:16:00.289117 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:00.289102 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:16:00.289261 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:00.289216 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r7psp" podUID="dbee0e27-41ea-4d42-84c7-681872bfcda1" Apr 22 14:16:02.048467 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.048438 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-195.ec2.internal" event="NodeReady" Apr 22 14:16:02.049037 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.048568 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 14:16:02.099503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.099425 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-v29ss"] Apr 22 14:16:02.105581 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.105559 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9cr92"] Apr 22 14:16:02.105741 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.105720 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-v29ss" Apr 22 14:16:02.108419 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.108401 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 14:16:02.108679 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.108661 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zpmwg\"" Apr 22 14:16:02.108679 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.108675 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 14:16:02.111576 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.111558 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9cr92"] Apr 22 14:16:02.111682 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.111661 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9cr92" Apr 22 14:16:02.114067 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.114039 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v29ss"] Apr 22 14:16:02.114684 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.114313 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 14:16:02.115124 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.115103 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lthk8\"" Apr 22 14:16:02.115230 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.115214 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 14:16:02.115294 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.115267 2576 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 14:16:02.193106 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.193077 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss" Apr 22 14:16:02.193106 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.193108 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert\") pod \"ingress-canary-9cr92\" (UID: \"e7bf6435-22e2-4377-a9e9-e35bf103f96a\") " pod="openshift-ingress-canary/ingress-canary-9cr92" Apr 22 14:16:02.193271 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.193139 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvpj4\" (UniqueName: \"kubernetes.io/projected/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-kube-api-access-xvpj4\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss" Apr 22 14:16:02.193271 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.193217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-config-volume\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss" Apr 22 14:16:02.193271 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.193259 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-tmp-dir\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss" Apr 22 14:16:02.193392 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.193286 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n58dt\" (UniqueName: \"kubernetes.io/projected/e7bf6435-22e2-4377-a9e9-e35bf103f96a-kube-api-access-n58dt\") pod \"ingress-canary-9cr92\" (UID: \"e7bf6435-22e2-4377-a9e9-e35bf103f96a\") " pod="openshift-ingress-canary/ingress-canary-9cr92" Apr 22 14:16:02.287951 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.287921 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:16:02.288144 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.288126 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:16:02.291182 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.291157 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mfkfw\"" Apr 22 14:16:02.291280 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.291189 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jjvkf\"" Apr 22 14:16:02.291280 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.291209 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 14:16:02.291404 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.291389 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 14:16:02.291601 ip-10-0-142-195 kubenswrapper[2576]: I0422 
14:16:02.291588 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 14:16:02.293760 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.293726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss"
Apr 22 14:16:02.293875 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.293759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert\") pod \"ingress-canary-9cr92\" (UID: \"e7bf6435-22e2-4377-a9e9-e35bf103f96a\") " pod="openshift-ingress-canary/ingress-canary-9cr92"
Apr 22 14:16:02.293875 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.293792 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvpj4\" (UniqueName: \"kubernetes.io/projected/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-kube-api-access-xvpj4\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss"
Apr 22 14:16:02.293875 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.293840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-config-volume\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss"
Apr 22 14:16:02.293875 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:02.293859 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:16:02.294070 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:02.293877 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:16:02.294070 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.293895 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-tmp-dir\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss"
Apr 22 14:16:02.294070 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:02.293922 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert podName:e7bf6435-22e2-4377-a9e9-e35bf103f96a nodeName:}" failed. No retries permitted until 2026-04-22 14:16:02.793903835 +0000 UTC m=+33.119638238 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert") pod "ingress-canary-9cr92" (UID: "e7bf6435-22e2-4377-a9e9-e35bf103f96a") : secret "canary-serving-cert" not found
Apr 22 14:16:02.294220 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:02.294080 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls podName:9eeb019a-e2e1-4f80-8bf1-18b7ce973747 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:02.794044768 +0000 UTC m=+33.119779161 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls") pod "dns-default-v29ss" (UID: "9eeb019a-e2e1-4f80-8bf1-18b7ce973747") : secret "dns-default-metrics-tls" not found
Apr 22 14:16:02.294220 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.294108 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n58dt\" (UniqueName: \"kubernetes.io/projected/e7bf6435-22e2-4377-a9e9-e35bf103f96a-kube-api-access-n58dt\") pod \"ingress-canary-9cr92\" (UID: \"e7bf6435-22e2-4377-a9e9-e35bf103f96a\") " pod="openshift-ingress-canary/ingress-canary-9cr92"
Apr 22 14:16:02.294355 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.294332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-tmp-dir\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss"
Apr 22 14:16:02.294612 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.294594 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-config-volume\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss"
Apr 22 14:16:02.308595 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.308570 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvpj4\" (UniqueName: \"kubernetes.io/projected/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-kube-api-access-xvpj4\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss"
Apr 22 14:16:02.308711 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.308673 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n58dt\" (UniqueName: \"kubernetes.io/projected/e7bf6435-22e2-4377-a9e9-e35bf103f96a-kube-api-access-n58dt\") pod \"ingress-canary-9cr92\" (UID: \"e7bf6435-22e2-4377-a9e9-e35bf103f96a\") " pod="openshift-ingress-canary/ingress-canary-9cr92"
Apr 22 14:16:02.796979 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.796945 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss"
Apr 22 14:16:02.796979 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:02.796985 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert\") pod \"ingress-canary-9cr92\" (UID: \"e7bf6435-22e2-4377-a9e9-e35bf103f96a\") " pod="openshift-ingress-canary/ingress-canary-9cr92"
Apr 22 14:16:02.797205 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:02.797093 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:16:02.797205 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:02.797099 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:16:02.797205 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:02.797157 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert podName:e7bf6435-22e2-4377-a9e9-e35bf103f96a nodeName:}" failed. No retries permitted until 2026-04-22 14:16:03.797138271 +0000 UTC m=+34.122872668 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert") pod "ingress-canary-9cr92" (UID: "e7bf6435-22e2-4377-a9e9-e35bf103f96a") : secret "canary-serving-cert" not found
Apr 22 14:16:02.797205 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:02.797175 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls podName:9eeb019a-e2e1-4f80-8bf1-18b7ce973747 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:03.79716694 +0000 UTC m=+34.122901334 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls") pod "dns-default-v29ss" (UID: "9eeb019a-e2e1-4f80-8bf1-18b7ce973747") : secret "dns-default-metrics-tls" not found
Apr 22 14:16:03.028902 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.028867 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"]
Apr 22 14:16:03.065106 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.065024 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"]
Apr 22 14:16:03.065106 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.065053 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"]
Apr 22 14:16:03.065694 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.065185 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"
Apr 22 14:16:03.069364 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.069341 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 14:16:03.069495 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.069341 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 14:16:03.069632 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.069615 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 14:16:03.069732 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.069720 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 22 14:16:03.081393 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.081370 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.085881 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.085848 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 22 14:16:03.086078 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.086061 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 22 14:16:03.086164 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.086068 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 22 14:16:03.087185 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.087143 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 22 14:16:03.087274 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.087220 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"]
Apr 22 14:16:03.200071 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.200044 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/cbe7e75b-bd42-4538-9392-41d086dc6e34-ca\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.200071 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.200079 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cbe7e75b-bd42-4538-9392-41d086dc6e34-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.200318 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.200115 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/cbe7e75b-bd42-4538-9392-41d086dc6e34-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.200318 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.200141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxsbc\" (UniqueName: \"kubernetes.io/projected/cbe7e75b-bd42-4538-9392-41d086dc6e34-kube-api-access-fxsbc\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.200318 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.200169 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2fa093c6-79ec-4683-a1e6-c0078c63b94b-tmp\") pod \"klusterlet-addon-workmgr-d7f49cbdc-fj6gv\" (UID: \"2fa093c6-79ec-4683-a1e6-c0078c63b94b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"
Apr 22 14:16:03.200318 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.200183 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/cbe7e75b-bd42-4538-9392-41d086dc6e34-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.200318 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.200283 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dds5\" (UniqueName: \"kubernetes.io/projected/2fa093c6-79ec-4683-a1e6-c0078c63b94b-kube-api-access-9dds5\") pod \"klusterlet-addon-workmgr-d7f49cbdc-fj6gv\" (UID: \"2fa093c6-79ec-4683-a1e6-c0078c63b94b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"
Apr 22 14:16:03.200540 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.200336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2fa093c6-79ec-4683-a1e6-c0078c63b94b-klusterlet-config\") pod \"klusterlet-addon-workmgr-d7f49cbdc-fj6gv\" (UID: \"2fa093c6-79ec-4683-a1e6-c0078c63b94b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"
Apr 22 14:16:03.200540 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.200370 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/cbe7e75b-bd42-4538-9392-41d086dc6e34-hub\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.300757 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.300726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/cbe7e75b-bd42-4538-9392-41d086dc6e34-ca\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.300757 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.300763 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cbe7e75b-bd42-4538-9392-41d086dc6e34-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.300987 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.300791 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/cbe7e75b-bd42-4538-9392-41d086dc6e34-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.300987 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.300911 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxsbc\" (UniqueName: \"kubernetes.io/projected/cbe7e75b-bd42-4538-9392-41d086dc6e34-kube-api-access-fxsbc\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.300987 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.300953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2fa093c6-79ec-4683-a1e6-c0078c63b94b-tmp\") pod \"klusterlet-addon-workmgr-d7f49cbdc-fj6gv\" (UID: \"2fa093c6-79ec-4683-a1e6-c0078c63b94b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"
Apr 22 14:16:03.300987 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.300979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/cbe7e75b-bd42-4538-9392-41d086dc6e34-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.301176 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.301050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dds5\" (UniqueName: \"kubernetes.io/projected/2fa093c6-79ec-4683-a1e6-c0078c63b94b-kube-api-access-9dds5\") pod \"klusterlet-addon-workmgr-d7f49cbdc-fj6gv\" (UID: \"2fa093c6-79ec-4683-a1e6-c0078c63b94b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"
Apr 22 14:16:03.301176 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.301079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2fa093c6-79ec-4683-a1e6-c0078c63b94b-klusterlet-config\") pod \"klusterlet-addon-workmgr-d7f49cbdc-fj6gv\" (UID: \"2fa093c6-79ec-4683-a1e6-c0078c63b94b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"
Apr 22 14:16:03.301176 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.301103 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/cbe7e75b-bd42-4538-9392-41d086dc6e34-hub\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.302020 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.301969 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/cbe7e75b-bd42-4538-9392-41d086dc6e34-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.302170 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.302145 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2fa093c6-79ec-4683-a1e6-c0078c63b94b-tmp\") pod \"klusterlet-addon-workmgr-d7f49cbdc-fj6gv\" (UID: \"2fa093c6-79ec-4683-a1e6-c0078c63b94b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"
Apr 22 14:16:03.303772 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.303753 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/cbe7e75b-bd42-4538-9392-41d086dc6e34-hub\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.304759 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.304726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2fa093c6-79ec-4683-a1e6-c0078c63b94b-klusterlet-config\") pod \"klusterlet-addon-workmgr-d7f49cbdc-fj6gv\" (UID: \"2fa093c6-79ec-4683-a1e6-c0078c63b94b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"
Apr 22 14:16:03.309368 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.309345 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dds5\" (UniqueName: \"kubernetes.io/projected/2fa093c6-79ec-4683-a1e6-c0078c63b94b-kube-api-access-9dds5\") pod \"klusterlet-addon-workmgr-d7f49cbdc-fj6gv\" (UID: \"2fa093c6-79ec-4683-a1e6-c0078c63b94b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"
Apr 22 14:16:03.316284 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.316211 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/cbe7e75b-bd42-4538-9392-41d086dc6e34-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.316284 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.316256 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/cbe7e75b-bd42-4538-9392-41d086dc6e34-ca\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.316468 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.316324 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cbe7e75b-bd42-4538-9392-41d086dc6e34-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.318102 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.318080 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxsbc\" (UniqueName: \"kubernetes.io/projected/cbe7e75b-bd42-4538-9392-41d086dc6e34-kube-api-access-fxsbc\") pod \"cluster-proxy-proxy-agent-db7dbbfc5-pxs4q\" (UID: \"cbe7e75b-bd42-4538-9392-41d086dc6e34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.378156 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.378121 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"
Apr 22 14:16:03.399017 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.398987 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"
Apr 22 14:16:03.804087 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.804046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss"
Apr 22 14:16:03.804087 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:03.804093 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert\") pod \"ingress-canary-9cr92\" (UID: \"e7bf6435-22e2-4377-a9e9-e35bf103f96a\") " pod="openshift-ingress-canary/ingress-canary-9cr92"
Apr 22 14:16:03.804305 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:03.804195 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:16:03.804305 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:03.804216 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:16:03.804305 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:03.804259 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert podName:e7bf6435-22e2-4377-a9e9-e35bf103f96a nodeName:}" failed. No retries permitted until 2026-04-22 14:16:05.804245357 +0000 UTC m=+36.129979750 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert") pod "ingress-canary-9cr92" (UID: "e7bf6435-22e2-4377-a9e9-e35bf103f96a") : secret "canary-serving-cert" not found
Apr 22 14:16:03.804305 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:03.804271 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls podName:9eeb019a-e2e1-4f80-8bf1-18b7ce973747 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:05.80426577 +0000 UTC m=+36.130000164 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls") pod "dns-default-v29ss" (UID: "9eeb019a-e2e1-4f80-8bf1-18b7ce973747") : secret "dns-default-metrics-tls" not found
Apr 22 14:16:04.005691 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:04.005653 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs\") pod \"network-metrics-daemon-r7psp\" (UID: \"dbee0e27-41ea-4d42-84c7-681872bfcda1\") " pod="openshift-multus/network-metrics-daemon-r7psp"
Apr 22 14:16:04.005894 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:04.005717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk6vz\" (UniqueName: \"kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz\") pod \"network-check-target-rbpm2\" (UID: \"e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa\") " pod="openshift-network-diagnostics/network-check-target-rbpm2"
Apr 22 14:16:04.005894 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:04.005787 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 14:16:04.005894 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:04.005879 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs podName:dbee0e27-41ea-4d42-84c7-681872bfcda1 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:36.00585842 +0000 UTC m=+66.331592816 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs") pod "network-metrics-daemon-r7psp" (UID: "dbee0e27-41ea-4d42-84c7-681872bfcda1") : secret "metrics-daemon-secret" not found
Apr 22 14:16:04.008472 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:04.008443 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk6vz\" (UniqueName: \"kubernetes.io/projected/e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa-kube-api-access-tk6vz\") pod \"network-check-target-rbpm2\" (UID: \"e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa\") " pod="openshift-network-diagnostics/network-check-target-rbpm2"
Apr 22 14:16:04.099677 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:04.099607 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rbpm2"
Apr 22 14:16:04.986446 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:04.986212 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rbpm2"]
Apr 22 14:16:04.986882 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:04.986864 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"]
Apr 22 14:16:04.991377 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:04.991359 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q"]
Apr 22 14:16:05.128412 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:16:05.128319 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0b90fcd_ae7e_46f8_83c0_b9bdec12c8aa.slice/crio-54f46d21b5ed17f40ab724a40f9654f6c72907f2f5a0a855140df16419cba11e WatchSource:0}: Error finding container 54f46d21b5ed17f40ab724a40f9654f6c72907f2f5a0a855140df16419cba11e: Status 404 returned error can't find the container with id 54f46d21b5ed17f40ab724a40f9654f6c72907f2f5a0a855140df16419cba11e
Apr 22 14:16:05.134186 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:16:05.134160 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fa093c6_79ec_4683_a1e6_c0078c63b94b.slice/crio-6145db945404ceabc61c3dd82351f496429fb16ce3a58e55b4c824ecc2e78c18 WatchSource:0}: Error finding container 6145db945404ceabc61c3dd82351f496429fb16ce3a58e55b4c824ecc2e78c18: Status 404 returned error can't find the container with id 6145db945404ceabc61c3dd82351f496429fb16ce3a58e55b4c824ecc2e78c18
Apr 22 14:16:05.135006 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:16:05.134984 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbe7e75b_bd42_4538_9392_41d086dc6e34.slice/crio-fbd1efc6bb9b31de24606a8aa8ee194e9af4e907883239ce5c4cd3153ee2fd60 WatchSource:0}: Error finding container fbd1efc6bb9b31de24606a8aa8ee194e9af4e907883239ce5c4cd3153ee2fd60: Status 404 returned error can't find the container with id fbd1efc6bb9b31de24606a8aa8ee194e9af4e907883239ce5c4cd3153ee2fd60
Apr 22 14:16:05.465405 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:05.465380 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q" event={"ID":"cbe7e75b-bd42-4538-9392-41d086dc6e34","Type":"ContainerStarted","Data":"fbd1efc6bb9b31de24606a8aa8ee194e9af4e907883239ce5c4cd3153ee2fd60"}
Apr 22 14:16:05.468012 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:05.467983 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8pqgl" event={"ID":"110f68fd-5d58-411e-a7fc-980d5d6050e4","Type":"ContainerStarted","Data":"ff575ae7450cc3923d24d7cd683c0ab1012b19a77e295b7d5b5f890943d69aca"}
Apr 22 14:16:05.469019 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:05.468999 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv" event={"ID":"2fa093c6-79ec-4683-a1e6-c0078c63b94b","Type":"ContainerStarted","Data":"6145db945404ceabc61c3dd82351f496429fb16ce3a58e55b4c824ecc2e78c18"}
Apr 22 14:16:05.469889 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:05.469872 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rbpm2" event={"ID":"e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa","Type":"ContainerStarted","Data":"54f46d21b5ed17f40ab724a40f9654f6c72907f2f5a0a855140df16419cba11e"}
Apr 22 14:16:05.823540 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:05.823488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss"
Apr 22 14:16:05.823540 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:05.823536 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert\") pod \"ingress-canary-9cr92\" (UID: \"e7bf6435-22e2-4377-a9e9-e35bf103f96a\") " pod="openshift-ingress-canary/ingress-canary-9cr92"
Apr 22 14:16:05.823803 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:05.823657 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:16:05.823803 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:05.823687 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:16:05.823803 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:05.823739 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls podName:9eeb019a-e2e1-4f80-8bf1-18b7ce973747 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:09.823714275 +0000 UTC m=+40.149448681 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls") pod "dns-default-v29ss" (UID: "9eeb019a-e2e1-4f80-8bf1-18b7ce973747") : secret "dns-default-metrics-tls" not found
Apr 22 14:16:05.823803 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:05.823759 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert podName:e7bf6435-22e2-4377-a9e9-e35bf103f96a nodeName:}" failed. No retries permitted until 2026-04-22 14:16:09.823748584 +0000 UTC m=+40.149482991 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert") pod "ingress-canary-9cr92" (UID: "e7bf6435-22e2-4377-a9e9-e35bf103f96a") : secret "canary-serving-cert" not found
Apr 22 14:16:06.478668 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:06.478629 2576 generic.go:358] "Generic (PLEG): container finished" podID="110f68fd-5d58-411e-a7fc-980d5d6050e4" containerID="ff575ae7450cc3923d24d7cd683c0ab1012b19a77e295b7d5b5f890943d69aca" exitCode=0
Apr 22 14:16:06.479173 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:06.478676 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8pqgl" event={"ID":"110f68fd-5d58-411e-a7fc-980d5d6050e4","Type":"ContainerDied","Data":"ff575ae7450cc3923d24d7cd683c0ab1012b19a77e295b7d5b5f890943d69aca"}
Apr 22 14:16:09.854063 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:09.854017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss"
Apr 22 14:16:09.854063 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:09.854066 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert\") pod \"ingress-canary-9cr92\" (UID: \"e7bf6435-22e2-4377-a9e9-e35bf103f96a\") " pod="openshift-ingress-canary/ingress-canary-9cr92"
Apr 22 14:16:09.854674 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:09.854170 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:16:09.854674 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:09.854190 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:16:09.854674 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:09.854240 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls podName:9eeb019a-e2e1-4f80-8bf1-18b7ce973747 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:17.854222192 +0000 UTC m=+48.179956589 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls") pod "dns-default-v29ss" (UID: "9eeb019a-e2e1-4f80-8bf1-18b7ce973747") : secret "dns-default-metrics-tls" not found
Apr 22 14:16:09.854674 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:09.854259 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert podName:e7bf6435-22e2-4377-a9e9-e35bf103f96a nodeName:}" failed. No retries permitted until 2026-04-22 14:16:17.85425131 +0000 UTC m=+48.179985709 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert") pod "ingress-canary-9cr92" (UID: "e7bf6435-22e2-4377-a9e9-e35bf103f96a") : secret "canary-serving-cert" not found
Apr 22 14:16:11.491718 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:11.491687 2576 generic.go:358] "Generic (PLEG): container finished" podID="110f68fd-5d58-411e-a7fc-980d5d6050e4" containerID="1f8af06074a445109d0ef55682a94a2aa040cab44012bb213c7b7da5fa2de775" exitCode=0
Apr 22 14:16:11.492179 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:11.491771 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8pqgl" event={"ID":"110f68fd-5d58-411e-a7fc-980d5d6050e4","Type":"ContainerDied","Data":"1f8af06074a445109d0ef55682a94a2aa040cab44012bb213c7b7da5fa2de775"}
Apr 22 14:16:11.493250 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:11.493220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv" event={"ID":"2fa093c6-79ec-4683-a1e6-c0078c63b94b","Type":"ContainerStarted","Data":"e3c2ebef436b03aca12e9903e0aca894ab328a35c4c34faec89016d9facd4809"}
Apr 22 14:16:11.493465 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:11.493430 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"
Apr 22 14:16:11.494655 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:11.494580 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rbpm2" event={"ID":"e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa","Type":"ContainerStarted","Data":"9fd4c9b476e61a3e7de0f3b4ab1199f70c2f2987a98c826dc0273c4bbb8d2c41"}
Apr 22 14:16:11.494834 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:11.494789 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready"
pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:16:11.495333 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:11.495313 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv" Apr 22 14:16:11.496164 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:11.496144 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q" event={"ID":"cbe7e75b-bd42-4538-9392-41d086dc6e34","Type":"ContainerStarted","Data":"a52e4cb44b199a5cc6662fe21b67f9e2985ae72c18df98f9cb9d7fa683e5b450"} Apr 22 14:16:11.530577 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:11.530535 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rbpm2" podStartSLOduration=36.217148182 podStartE2EDuration="41.530521792s" podCreationTimestamp="2026-04-22 14:15:30 +0000 UTC" firstStartedPulling="2026-04-22 14:16:05.130488776 +0000 UTC m=+35.456223170" lastFinishedPulling="2026-04-22 14:16:10.443862372 +0000 UTC m=+40.769596780" observedRunningTime="2026-04-22 14:16:11.530242716 +0000 UTC m=+41.855977131" watchObservedRunningTime="2026-04-22 14:16:11.530521792 +0000 UTC m=+41.856256207" Apr 22 14:16:12.500659 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:12.500626 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8pqgl" event={"ID":"110f68fd-5d58-411e-a7fc-980d5d6050e4","Type":"ContainerStarted","Data":"21c1c92b6c151231aa153aa55b89125b6cd35537a2d1e554f837bafa18c11062"} Apr 22 14:16:12.525617 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:12.525561 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8pqgl" podStartSLOduration=10.183626659 podStartE2EDuration="42.525543645s" podCreationTimestamp="2026-04-22 
14:15:30 +0000 UTC" firstStartedPulling="2026-04-22 14:15:32.828243706 +0000 UTC m=+3.153978101" lastFinishedPulling="2026-04-22 14:16:05.170160679 +0000 UTC m=+35.495895087" observedRunningTime="2026-04-22 14:16:12.52367079 +0000 UTC m=+42.849405203" watchObservedRunningTime="2026-04-22 14:16:12.525543645 +0000 UTC m=+42.851278060" Apr 22 14:16:12.525925 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:12.525898 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv" podStartSLOduration=4.209607881 podStartE2EDuration="9.525888915s" podCreationTimestamp="2026-04-22 14:16:03 +0000 UTC" firstStartedPulling="2026-04-22 14:16:05.14734964 +0000 UTC m=+35.473084033" lastFinishedPulling="2026-04-22 14:16:10.463630663 +0000 UTC m=+40.789365067" observedRunningTime="2026-04-22 14:16:11.545608484 +0000 UTC m=+41.871342904" watchObservedRunningTime="2026-04-22 14:16:12.525888915 +0000 UTC m=+42.851623322" Apr 22 14:16:13.504256 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:13.504220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q" event={"ID":"cbe7e75b-bd42-4538-9392-41d086dc6e34","Type":"ContainerStarted","Data":"7ba88e02680b72e476bff1f5f700e8344da29e8683529905d90f81ca2465b87f"} Apr 22 14:16:13.504256 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:13.504258 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q" event={"ID":"cbe7e75b-bd42-4538-9392-41d086dc6e34","Type":"ContainerStarted","Data":"16fde03fd172b2c23a6df9bc170e99edf6ccb742d16148e31b577a56c6b19b1e"} Apr 22 14:16:13.523372 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:13.523330 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q" 
podStartSLOduration=3.153595273 podStartE2EDuration="10.523316491s" podCreationTimestamp="2026-04-22 14:16:03 +0000 UTC" firstStartedPulling="2026-04-22 14:16:05.147347268 +0000 UTC m=+35.473081665" lastFinishedPulling="2026-04-22 14:16:12.517068489 +0000 UTC m=+42.842802883" observedRunningTime="2026-04-22 14:16:13.522938796 +0000 UTC m=+43.848673213" watchObservedRunningTime="2026-04-22 14:16:13.523316491 +0000 UTC m=+43.849050906" Apr 22 14:16:17.910760 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:17.910722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss" Apr 22 14:16:17.910760 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:17.910759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert\") pod \"ingress-canary-9cr92\" (UID: \"e7bf6435-22e2-4377-a9e9-e35bf103f96a\") " pod="openshift-ingress-canary/ingress-canary-9cr92" Apr 22 14:16:17.911292 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:17.910873 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:17.911292 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:17.910877 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:17.911292 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:17.910931 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert podName:e7bf6435-22e2-4377-a9e9-e35bf103f96a nodeName:}" failed. No retries permitted until 2026-04-22 14:16:33.910917768 +0000 UTC m=+64.236652161 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert") pod "ingress-canary-9cr92" (UID: "e7bf6435-22e2-4377-a9e9-e35bf103f96a") : secret "canary-serving-cert" not found Apr 22 14:16:17.911292 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:17.910943 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls podName:9eeb019a-e2e1-4f80-8bf1-18b7ce973747 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:33.910936847 +0000 UTC m=+64.236671240 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls") pod "dns-default-v29ss" (UID: "9eeb019a-e2e1-4f80-8bf1-18b7ce973747") : secret "dns-default-metrics-tls" not found Apr 22 14:16:27.462973 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:27.462938 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7g6p" Apr 22 14:16:33.913496 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:33.913454 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss" Apr 22 14:16:33.913496 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:33.913492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert\") pod \"ingress-canary-9cr92\" (UID: \"e7bf6435-22e2-4377-a9e9-e35bf103f96a\") " pod="openshift-ingress-canary/ingress-canary-9cr92" Apr 22 14:16:33.913964 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:33.913600 2576 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:33.913964 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:33.913639 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:33.913964 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:33.913664 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls podName:9eeb019a-e2e1-4f80-8bf1-18b7ce973747 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:05.913648517 +0000 UTC m=+96.239382910 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls") pod "dns-default-v29ss" (UID: "9eeb019a-e2e1-4f80-8bf1-18b7ce973747") : secret "dns-default-metrics-tls" not found Apr 22 14:16:33.913964 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:33.913689 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert podName:e7bf6435-22e2-4377-a9e9-e35bf103f96a nodeName:}" failed. No retries permitted until 2026-04-22 14:17:05.91367726 +0000 UTC m=+96.239411653 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert") pod "ingress-canary-9cr92" (UID: "e7bf6435-22e2-4377-a9e9-e35bf103f96a") : secret "canary-serving-cert" not found Apr 22 14:16:36.027983 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:36.027946 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs\") pod \"network-metrics-daemon-r7psp\" (UID: \"dbee0e27-41ea-4d42-84c7-681872bfcda1\") " pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:16:36.028351 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:36.028047 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 14:16:36.028351 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:16:36.028101 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs podName:dbee0e27-41ea-4d42-84c7-681872bfcda1 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:40.028086777 +0000 UTC m=+130.353821171 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs") pod "network-metrics-daemon-r7psp" (UID: "dbee0e27-41ea-4d42-84c7-681872bfcda1") : secret "metrics-daemon-secret" not found Apr 22 14:16:42.502915 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:16:42.502791 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rbpm2" Apr 22 14:17:05.930644 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:17:05.930606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss" Apr 22 14:17:05.930644 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:17:05.930647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert\") pod \"ingress-canary-9cr92\" (UID: \"e7bf6435-22e2-4377-a9e9-e35bf103f96a\") " pod="openshift-ingress-canary/ingress-canary-9cr92" Apr 22 14:17:05.931022 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:17:05.930802 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:17:05.931022 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:17:05.930894 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls podName:9eeb019a-e2e1-4f80-8bf1-18b7ce973747 nodeName:}" failed. No retries permitted until 2026-04-22 14:18:09.930875369 +0000 UTC m=+160.256609773 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls") pod "dns-default-v29ss" (UID: "9eeb019a-e2e1-4f80-8bf1-18b7ce973747") : secret "dns-default-metrics-tls" not found Apr 22 14:17:05.931022 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:17:05.930834 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:17:05.931022 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:17:05.930960 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert podName:e7bf6435-22e2-4377-a9e9-e35bf103f96a nodeName:}" failed. No retries permitted until 2026-04-22 14:18:09.93094885 +0000 UTC m=+160.256683256 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert") pod "ingress-canary-9cr92" (UID: "e7bf6435-22e2-4377-a9e9-e35bf103f96a") : secret "canary-serving-cert" not found Apr 22 14:17:40.060934 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:17:40.060878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs\") pod \"network-metrics-daemon-r7psp\" (UID: \"dbee0e27-41ea-4d42-84c7-681872bfcda1\") " pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:17:40.061411 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:17:40.061022 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 14:17:40.061411 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:17:40.061097 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs podName:dbee0e27-41ea-4d42-84c7-681872bfcda1 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:19:42.061078263 +0000 UTC m=+252.386812677 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs") pod "network-metrics-daemon-r7psp" (UID: "dbee0e27-41ea-4d42-84c7-681872bfcda1") : secret "metrics-daemon-secret" not found Apr 22 14:17:43.781369 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:17:43.781335 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8h8cm_fbf11e8c-40e6-4fe9-b51b-b3a87ff88577/dns-node-resolver/0.log" Apr 22 14:17:44.985266 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:17:44.985239 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bzcbn_b185de97-4d30-47da-bb08-402ac8989235/node-ca/0.log" Apr 22 14:18:05.119440 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:18:05.119393 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-v29ss" podUID="9eeb019a-e2e1-4f80-8bf1-18b7ce973747" Apr 22 14:18:05.125525 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:18:05.125497 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-9cr92" podUID="e7bf6435-22e2-4377-a9e9-e35bf103f96a" Apr 22 14:18:05.304869 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:18:05.304792 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-r7psp" podUID="dbee0e27-41ea-4d42-84c7-681872bfcda1" Apr 22 14:18:05.759213 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:05.759178 2576 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-v29ss" Apr 22 14:18:07.307847 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.307796 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-sgkpp"] Apr 22 14:18:07.310735 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.310714 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sgkpp" Apr 22 14:18:07.315937 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.315864 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 14:18:07.316055 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.316013 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 14:18:07.316055 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.316027 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 14:18:07.316411 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.316392 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zwk4n\"" Apr 22 14:18:07.316563 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.316537 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 14:18:07.334155 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.334127 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sgkpp"] Apr 22 14:18:07.378298 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.378263 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-74fb95cffd-6sw58"] Apr 
22 14:18:07.381029 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.381009 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74fb95cffd-6sw58" Apr 22 14:18:07.386553 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.386496 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 14:18:07.386790 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.386775 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 14:18:07.386919 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.386899 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 14:18:07.388123 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.388107 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-x72w4\"" Apr 22 14:18:07.398894 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.398876 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 14:18:07.429862 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.429833 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-74fb95cffd-6sw58"] Apr 22 14:18:07.450425 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.450386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/057deb2d-ff54-4f6d-b3f5-419c3f0e070e-crio-socket\") pod \"insights-runtime-extractor-sgkpp\" (UID: \"057deb2d-ff54-4f6d-b3f5-419c3f0e070e\") " pod="openshift-insights/insights-runtime-extractor-sgkpp" Apr 22 14:18:07.450425 ip-10-0-142-195 
kubenswrapper[2576]: I0422 14:18:07.450433 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/057deb2d-ff54-4f6d-b3f5-419c3f0e070e-data-volume\") pod \"insights-runtime-extractor-sgkpp\" (UID: \"057deb2d-ff54-4f6d-b3f5-419c3f0e070e\") " pod="openshift-insights/insights-runtime-extractor-sgkpp" Apr 22 14:18:07.450661 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.450493 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/057deb2d-ff54-4f6d-b3f5-419c3f0e070e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sgkpp\" (UID: \"057deb2d-ff54-4f6d-b3f5-419c3f0e070e\") " pod="openshift-insights/insights-runtime-extractor-sgkpp" Apr 22 14:18:07.450661 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.450529 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnh5t\" (UniqueName: \"kubernetes.io/projected/057deb2d-ff54-4f6d-b3f5-419c3f0e070e-kube-api-access-nnh5t\") pod \"insights-runtime-extractor-sgkpp\" (UID: \"057deb2d-ff54-4f6d-b3f5-419c3f0e070e\") " pod="openshift-insights/insights-runtime-extractor-sgkpp" Apr 22 14:18:07.450661 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.450591 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/057deb2d-ff54-4f6d-b3f5-419c3f0e070e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sgkpp\" (UID: \"057deb2d-ff54-4f6d-b3f5-419c3f0e070e\") " pod="openshift-insights/insights-runtime-extractor-sgkpp" Apr 22 14:18:07.551078 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.551044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/057deb2d-ff54-4f6d-b3f5-419c3f0e070e-data-volume\") pod \"insights-runtime-extractor-sgkpp\" (UID: \"057deb2d-ff54-4f6d-b3f5-419c3f0e070e\") " pod="openshift-insights/insights-runtime-extractor-sgkpp" Apr 22 14:18:07.551260 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.551100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/057deb2d-ff54-4f6d-b3f5-419c3f0e070e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sgkpp\" (UID: \"057deb2d-ff54-4f6d-b3f5-419c3f0e070e\") " pod="openshift-insights/insights-runtime-extractor-sgkpp" Apr 22 14:18:07.551260 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.551122 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdhsx\" (UniqueName: \"kubernetes.io/projected/2bb74d4a-dc01-4e62-87e5-7e08deed660e-kube-api-access-sdhsx\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58" Apr 22 14:18:07.551260 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.551147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2bb74d4a-dc01-4e62-87e5-7e08deed660e-image-registry-private-configuration\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58" Apr 22 14:18:07.551260 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.551168 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnh5t\" (UniqueName: \"kubernetes.io/projected/057deb2d-ff54-4f6d-b3f5-419c3f0e070e-kube-api-access-nnh5t\") pod \"insights-runtime-extractor-sgkpp\" (UID: 
\"057deb2d-ff54-4f6d-b3f5-419c3f0e070e\") " pod="openshift-insights/insights-runtime-extractor-sgkpp" Apr 22 14:18:07.551260 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.551188 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2bb74d4a-dc01-4e62-87e5-7e08deed660e-registry-certificates\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58" Apr 22 14:18:07.551260 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.551241 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2bb74d4a-dc01-4e62-87e5-7e08deed660e-installation-pull-secrets\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58" Apr 22 14:18:07.551532 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.551321 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2bb74d4a-dc01-4e62-87e5-7e08deed660e-registry-tls\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58" Apr 22 14:18:07.551532 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.551350 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2bb74d4a-dc01-4e62-87e5-7e08deed660e-ca-trust-extracted\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58" Apr 22 14:18:07.551532 ip-10-0-142-195 kubenswrapper[2576]: I0422 
14:18:07.551396 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/057deb2d-ff54-4f6d-b3f5-419c3f0e070e-data-volume\") pod \"insights-runtime-extractor-sgkpp\" (UID: \"057deb2d-ff54-4f6d-b3f5-419c3f0e070e\") " pod="openshift-insights/insights-runtime-extractor-sgkpp"
Apr 22 14:18:07.551532 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.551416 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/057deb2d-ff54-4f6d-b3f5-419c3f0e070e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sgkpp\" (UID: \"057deb2d-ff54-4f6d-b3f5-419c3f0e070e\") " pod="openshift-insights/insights-runtime-extractor-sgkpp"
Apr 22 14:18:07.551532 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.551454 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2bb74d4a-dc01-4e62-87e5-7e08deed660e-trusted-ca\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.551532 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.551499 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/057deb2d-ff54-4f6d-b3f5-419c3f0e070e-crio-socket\") pod \"insights-runtime-extractor-sgkpp\" (UID: \"057deb2d-ff54-4f6d-b3f5-419c3f0e070e\") " pod="openshift-insights/insights-runtime-extractor-sgkpp"
Apr 22 14:18:07.551532 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.551523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2bb74d4a-dc01-4e62-87e5-7e08deed660e-bound-sa-token\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.551773 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.551575 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/057deb2d-ff54-4f6d-b3f5-419c3f0e070e-crio-socket\") pod \"insights-runtime-extractor-sgkpp\" (UID: \"057deb2d-ff54-4f6d-b3f5-419c3f0e070e\") " pod="openshift-insights/insights-runtime-extractor-sgkpp"
Apr 22 14:18:07.551830 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.551791 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/057deb2d-ff54-4f6d-b3f5-419c3f0e070e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sgkpp\" (UID: \"057deb2d-ff54-4f6d-b3f5-419c3f0e070e\") " pod="openshift-insights/insights-runtime-extractor-sgkpp"
Apr 22 14:18:07.554570 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.554554 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/057deb2d-ff54-4f6d-b3f5-419c3f0e070e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sgkpp\" (UID: \"057deb2d-ff54-4f6d-b3f5-419c3f0e070e\") " pod="openshift-insights/insights-runtime-extractor-sgkpp"
Apr 22 14:18:07.561371 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.561319 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnh5t\" (UniqueName: \"kubernetes.io/projected/057deb2d-ff54-4f6d-b3f5-419c3f0e070e-kube-api-access-nnh5t\") pod \"insights-runtime-extractor-sgkpp\" (UID: \"057deb2d-ff54-4f6d-b3f5-419c3f0e070e\") " pod="openshift-insights/insights-runtime-extractor-sgkpp"
Apr 22 14:18:07.619450 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.619414 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sgkpp"
Apr 22 14:18:07.652158 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.652121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2bb74d4a-dc01-4e62-87e5-7e08deed660e-registry-certificates\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.652158 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.652159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2bb74d4a-dc01-4e62-87e5-7e08deed660e-installation-pull-secrets\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.652365 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.652181 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2bb74d4a-dc01-4e62-87e5-7e08deed660e-registry-tls\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.652414 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.652382 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2bb74d4a-dc01-4e62-87e5-7e08deed660e-ca-trust-extracted\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.652462 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.652442 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2bb74d4a-dc01-4e62-87e5-7e08deed660e-trusted-ca\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.652515 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.652486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2bb74d4a-dc01-4e62-87e5-7e08deed660e-bound-sa-token\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.652580 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.652561 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdhsx\" (UniqueName: \"kubernetes.io/projected/2bb74d4a-dc01-4e62-87e5-7e08deed660e-kube-api-access-sdhsx\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.652634 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.652609 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2bb74d4a-dc01-4e62-87e5-7e08deed660e-image-registry-private-configuration\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.652792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.652769 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2bb74d4a-dc01-4e62-87e5-7e08deed660e-ca-trust-extracted\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.653218 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.653192 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2bb74d4a-dc01-4e62-87e5-7e08deed660e-registry-certificates\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.654411 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.653920 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2bb74d4a-dc01-4e62-87e5-7e08deed660e-trusted-ca\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.655041 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.655012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2bb74d4a-dc01-4e62-87e5-7e08deed660e-registry-tls\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.655644 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.655625 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2bb74d4a-dc01-4e62-87e5-7e08deed660e-installation-pull-secrets\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.655792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.655774 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2bb74d4a-dc01-4e62-87e5-7e08deed660e-image-registry-private-configuration\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.661792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.661774 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2bb74d4a-dc01-4e62-87e5-7e08deed660e-bound-sa-token\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.662305 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.662290 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdhsx\" (UniqueName: \"kubernetes.io/projected/2bb74d4a-dc01-4e62-87e5-7e08deed660e-kube-api-access-sdhsx\") pod \"image-registry-74fb95cffd-6sw58\" (UID: \"2bb74d4a-dc01-4e62-87e5-7e08deed660e\") " pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.689145 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.689103 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:07.751045 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.747327 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sgkpp"]
Apr 22 14:18:07.765932 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.765889 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sgkpp" event={"ID":"057deb2d-ff54-4f6d-b3f5-419c3f0e070e","Type":"ContainerStarted","Data":"44a1be51826e6dee062f9f0c1880be8a577ffd3b269c392eece16b26b06298ec"}
Apr 22 14:18:07.816250 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:07.816230 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-74fb95cffd-6sw58"]
Apr 22 14:18:07.818372 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:18:07.818342 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bb74d4a_dc01_4e62_87e5_7e08deed660e.slice/crio-c3836fc30e5ae568c13bfd92345afa7178ce039be5a9e947f36129499baa5304 WatchSource:0}: Error finding container c3836fc30e5ae568c13bfd92345afa7178ce039be5a9e947f36129499baa5304: Status 404 returned error can't find the container with id c3836fc30e5ae568c13bfd92345afa7178ce039be5a9e947f36129499baa5304
Apr 22 14:18:08.769765 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:08.769729 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-74fb95cffd-6sw58" event={"ID":"2bb74d4a-dc01-4e62-87e5-7e08deed660e","Type":"ContainerStarted","Data":"1c4eefff092010e7eb205db9d66f89208e3365b73c567d712303ef945eabdb53"}
Apr 22 14:18:08.770159 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:08.769772 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-74fb95cffd-6sw58" event={"ID":"2bb74d4a-dc01-4e62-87e5-7e08deed660e","Type":"ContainerStarted","Data":"c3836fc30e5ae568c13bfd92345afa7178ce039be5a9e947f36129499baa5304"}
Apr 22 14:18:08.770159 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:08.769864 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:08.771651 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:08.771630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sgkpp" event={"ID":"057deb2d-ff54-4f6d-b3f5-419c3f0e070e","Type":"ContainerStarted","Data":"7cb79226fc1e8ce7b0cb2a67594bd2396cb122784609d9de78661fdaf3a4d08a"}
Apr 22 14:18:08.771651 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:08.771655 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sgkpp" event={"ID":"057deb2d-ff54-4f6d-b3f5-419c3f0e070e","Type":"ContainerStarted","Data":"99ca70f33c16fbb27973e9bbbd61dc7adb7d4f248a58b5d5315503f43e9135e4"}
Apr 22 14:18:08.796203 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:08.796158 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-74fb95cffd-6sw58" podStartSLOduration=1.79614667 podStartE2EDuration="1.79614667s" podCreationTimestamp="2026-04-22 14:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:18:08.794829746 +0000 UTC m=+159.120564155" watchObservedRunningTime="2026-04-22 14:18:08.79614667 +0000 UTC m=+159.121881085"
Apr 22 14:18:09.969216 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:09.969188 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss"
Apr 22 14:18:09.969568 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:09.969228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert\") pod \"ingress-canary-9cr92\" (UID: \"e7bf6435-22e2-4377-a9e9-e35bf103f96a\") " pod="openshift-ingress-canary/ingress-canary-9cr92"
Apr 22 14:18:09.971388 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:09.971361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9eeb019a-e2e1-4f80-8bf1-18b7ce973747-metrics-tls\") pod \"dns-default-v29ss\" (UID: \"9eeb019a-e2e1-4f80-8bf1-18b7ce973747\") " pod="openshift-dns/dns-default-v29ss"
Apr 22 14:18:09.971471 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:09.971404 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7bf6435-22e2-4377-a9e9-e35bf103f96a-cert\") pod \"ingress-canary-9cr92\" (UID: \"e7bf6435-22e2-4377-a9e9-e35bf103f96a\") " pod="openshift-ingress-canary/ingress-canary-9cr92"
Apr 22 14:18:10.262688 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:10.262659 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zpmwg\""
Apr 22 14:18:10.270908 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:10.270871 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-v29ss"
Apr 22 14:18:10.381998 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:10.381919 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v29ss"]
Apr 22 14:18:10.384309 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:18:10.384285 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9eeb019a_e2e1_4f80_8bf1_18b7ce973747.slice/crio-9d7206e697e35e9429acad59feea55cd5b19b046a743f04be51744d47352e330 WatchSource:0}: Error finding container 9d7206e697e35e9429acad59feea55cd5b19b046a743f04be51744d47352e330: Status 404 returned error can't find the container with id 9d7206e697e35e9429acad59feea55cd5b19b046a743f04be51744d47352e330
Apr 22 14:18:10.777855 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:10.777799 2576 generic.go:358] "Generic (PLEG): container finished" podID="2fa093c6-79ec-4683-a1e6-c0078c63b94b" containerID="e3c2ebef436b03aca12e9903e0aca894ab328a35c4c34faec89016d9facd4809" exitCode=1
Apr 22 14:18:10.778010 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:10.777847 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv" event={"ID":"2fa093c6-79ec-4683-a1e6-c0078c63b94b","Type":"ContainerDied","Data":"e3c2ebef436b03aca12e9903e0aca894ab328a35c4c34faec89016d9facd4809"}
Apr 22 14:18:10.778230 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:10.778214 2576 scope.go:117] "RemoveContainer" containerID="e3c2ebef436b03aca12e9903e0aca894ab328a35c4c34faec89016d9facd4809"
Apr 22 14:18:10.780168 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:10.780143 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sgkpp" event={"ID":"057deb2d-ff54-4f6d-b3f5-419c3f0e070e","Type":"ContainerStarted","Data":"555e9782e068297d9dc11ed56f17d9668554233979e5a101ee1edee1f5d5d188"}
Apr 22 14:18:10.781479 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:10.781456 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v29ss" event={"ID":"9eeb019a-e2e1-4f80-8bf1-18b7ce973747","Type":"ContainerStarted","Data":"9d7206e697e35e9429acad59feea55cd5b19b046a743f04be51744d47352e330"}
Apr 22 14:18:10.831580 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:10.831524 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-sgkpp" podStartSLOduration=1.834173633 podStartE2EDuration="3.831503907s" podCreationTimestamp="2026-04-22 14:18:07 +0000 UTC" firstStartedPulling="2026-04-22 14:18:07.821319002 +0000 UTC m=+158.147053396" lastFinishedPulling="2026-04-22 14:18:09.818649274 +0000 UTC m=+160.144383670" observedRunningTime="2026-04-22 14:18:10.831019545 +0000 UTC m=+161.156753960" watchObservedRunningTime="2026-04-22 14:18:10.831503907 +0000 UTC m=+161.157238324"
Apr 22 14:18:11.493794 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:11.493755 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"
Apr 22 14:18:11.784758 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:11.784730 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v29ss" event={"ID":"9eeb019a-e2e1-4f80-8bf1-18b7ce973747","Type":"ContainerStarted","Data":"bdcd7a4dc9294b9d95ecaf53ba60c6d43714ea4e01170c676e4e859d8034b06a"}
Apr 22 14:18:11.786250 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:11.786226 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv" event={"ID":"2fa093c6-79ec-4683-a1e6-c0078c63b94b","Type":"ContainerStarted","Data":"faff43f0250d8324cd5994f74e0c2e8e0260f6b51b17e5aee00983d68abe4f22"}
Apr 22 14:18:11.786594 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:11.786568 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"
Apr 22 14:18:11.787115 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:11.787097 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d7f49cbdc-fj6gv"
Apr 22 14:18:12.790218 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:12.790183 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v29ss" event={"ID":"9eeb019a-e2e1-4f80-8bf1-18b7ce973747","Type":"ContainerStarted","Data":"20850166422d10c831277ea8e75b187b31fa209b4de773f67e337e9569453628"}
Apr 22 14:18:12.790592 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:12.790320 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-v29ss"
Apr 22 14:18:12.810149 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:12.810105 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-v29ss" podStartSLOduration=129.597468283 podStartE2EDuration="2m10.810091425s" podCreationTimestamp="2026-04-22 14:16:02 +0000 UTC" firstStartedPulling="2026-04-22 14:18:10.38596054 +0000 UTC m=+160.711694934" lastFinishedPulling="2026-04-22 14:18:11.598583677 +0000 UTC m=+161.924318076" observedRunningTime="2026-04-22 14:18:12.808534465 +0000 UTC m=+163.134268880" watchObservedRunningTime="2026-04-22 14:18:12.810091425 +0000 UTC m=+163.135825897"
Apr 22 14:18:16.287878 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:16.287837 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp"
Apr 22 14:18:18.291390 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:18.291362 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9cr92"
Apr 22 14:18:18.294221 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:18.294200 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lthk8\""
Apr 22 14:18:18.301610 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:18.301592 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9cr92"
Apr 22 14:18:18.411507 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:18.411478 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9cr92"]
Apr 22 14:18:18.414389 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:18:18.414359 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7bf6435_22e2_4377_a9e9_e35bf103f96a.slice/crio-804a7094fc4252c76d9b68c95e423b9b87e6512b460d2db2a96adc29f4bf3805 WatchSource:0}: Error finding container 804a7094fc4252c76d9b68c95e423b9b87e6512b460d2db2a96adc29f4bf3805: Status 404 returned error can't find the container with id 804a7094fc4252c76d9b68c95e423b9b87e6512b460d2db2a96adc29f4bf3805
Apr 22 14:18:18.807622 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:18.807595 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9cr92" event={"ID":"e7bf6435-22e2-4377-a9e9-e35bf103f96a","Type":"ContainerStarted","Data":"804a7094fc4252c76d9b68c95e423b9b87e6512b460d2db2a96adc29f4bf3805"}
Apr 22 14:18:20.813656 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:20.813611 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9cr92" event={"ID":"e7bf6435-22e2-4377-a9e9-e35bf103f96a","Type":"ContainerStarted","Data":"bd09f802b830854be27d4e06ece0b65ffbc614db7a1e5fcbe64f4dd9ecda0381"}
Apr 22 14:18:20.830310 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:20.830256 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9cr92" podStartSLOduration=137.429641596 podStartE2EDuration="2m18.830239555s" podCreationTimestamp="2026-04-22 14:16:02 +0000 UTC" firstStartedPulling="2026-04-22 14:18:18.416256543 +0000 UTC m=+168.741990940" lastFinishedPulling="2026-04-22 14:18:19.816854493 +0000 UTC m=+170.142588899" observedRunningTime="2026-04-22 14:18:20.829474063 +0000 UTC m=+171.155208471" watchObservedRunningTime="2026-04-22 14:18:20.830239555 +0000 UTC m=+171.155973969"
Apr 22 14:18:22.795354 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:22.795324 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-v29ss"
Apr 22 14:18:29.778631 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:29.778605 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-74fb95cffd-6sw58"
Apr 22 14:18:30.563649 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.563619 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2bnpq"]
Apr 22 14:18:30.566771 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.566750 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.570123 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.570098 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 14:18:30.570270 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.570144 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 14:18:30.570270 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.570182 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 14:18:30.570270 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.570151 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 14:18:30.571263 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.571244 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 14:18:30.571359 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.571244 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-zgcjt\""
Apr 22 14:18:30.571359 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.571285 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 14:18:30.598538 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.598506 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/56ff2a83-14c9-44a2-99cb-54c501242f8a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.598538 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.598544 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/56ff2a83-14c9-44a2-99cb-54c501242f8a-node-exporter-tls\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.598763 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.598566 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/56ff2a83-14c9-44a2-99cb-54c501242f8a-node-exporter-textfile\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.598763 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.598613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/56ff2a83-14c9-44a2-99cb-54c501242f8a-node-exporter-wtmp\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.598763 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.598646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/56ff2a83-14c9-44a2-99cb-54c501242f8a-root\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.598763 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.598687 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/56ff2a83-14c9-44a2-99cb-54c501242f8a-sys\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.598763 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.598719 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56ff2a83-14c9-44a2-99cb-54c501242f8a-metrics-client-ca\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.598763 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.598747 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/56ff2a83-14c9-44a2-99cb-54c501242f8a-node-exporter-accelerators-collector-config\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.598763 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.598763 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrqb4\" (UniqueName: \"kubernetes.io/projected/56ff2a83-14c9-44a2-99cb-54c501242f8a-kube-api-access-hrqb4\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.699084 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.699044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/56ff2a83-14c9-44a2-99cb-54c501242f8a-node-exporter-textfile\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.699084 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.699089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/56ff2a83-14c9-44a2-99cb-54c501242f8a-node-exporter-wtmp\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.699342 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.699112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/56ff2a83-14c9-44a2-99cb-54c501242f8a-root\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.699342 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.699128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/56ff2a83-14c9-44a2-99cb-54c501242f8a-sys\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.699342 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.699143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56ff2a83-14c9-44a2-99cb-54c501242f8a-metrics-client-ca\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.699342 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.699172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/56ff2a83-14c9-44a2-99cb-54c501242f8a-node-exporter-accelerators-collector-config\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.699342 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.699190 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/56ff2a83-14c9-44a2-99cb-54c501242f8a-root\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.699342 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.699197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrqb4\" (UniqueName: \"kubernetes.io/projected/56ff2a83-14c9-44a2-99cb-54c501242f8a-kube-api-access-hrqb4\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.699342 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.699234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/56ff2a83-14c9-44a2-99cb-54c501242f8a-node-exporter-wtmp\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.699342 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.699203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/56ff2a83-14c9-44a2-99cb-54c501242f8a-sys\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.699342 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.699321 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/56ff2a83-14c9-44a2-99cb-54c501242f8a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.699342 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.699340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/56ff2a83-14c9-44a2-99cb-54c501242f8a-node-exporter-tls\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.699832 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.699493 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/56ff2a83-14c9-44a2-99cb-54c501242f8a-node-exporter-textfile\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.699832 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.699717 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/56ff2a83-14c9-44a2-99cb-54c501242f8a-node-exporter-accelerators-collector-config\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.699832 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.699759 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56ff2a83-14c9-44a2-99cb-54c501242f8a-metrics-client-ca\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.701585 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.701565 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/56ff2a83-14c9-44a2-99cb-54c501242f8a-node-exporter-tls\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.701665 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.701593 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/56ff2a83-14c9-44a2-99cb-54c501242f8a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.711380 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.711349 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrqb4\" (UniqueName: \"kubernetes.io/projected/56ff2a83-14c9-44a2-99cb-54c501242f8a-kube-api-access-hrqb4\") pod \"node-exporter-2bnpq\" (UID: \"56ff2a83-14c9-44a2-99cb-54c501242f8a\") " pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.875014 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:30.874928 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2bnpq"
Apr 22 14:18:30.882585 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:18:30.882557 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56ff2a83_14c9_44a2_99cb_54c501242f8a.slice/crio-c948a4922f2443ade62080e0c2270b29cf27922580a853fbb9be4db1270a6434 WatchSource:0}: Error finding container c948a4922f2443ade62080e0c2270b29cf27922580a853fbb9be4db1270a6434: Status 404 returned error can't find the container with id c948a4922f2443ade62080e0c2270b29cf27922580a853fbb9be4db1270a6434
Apr 22 14:18:31.841387 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:31.841361 2576 generic.go:358] "Generic (PLEG): container finished" podID="56ff2a83-14c9-44a2-99cb-54c501242f8a" containerID="4177ee91e597a3ca183729ac8ad7402f89231d11de23c7cc53016d4564e14172" exitCode=0
Apr 22 14:18:31.841508 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:31.841407 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2bnpq" event={"ID":"56ff2a83-14c9-44a2-99cb-54c501242f8a","Type":"ContainerDied","Data":"4177ee91e597a3ca183729ac8ad7402f89231d11de23c7cc53016d4564e14172"}
Apr 22 14:18:31.841508 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:31.841433 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2bnpq" event={"ID":"56ff2a83-14c9-44a2-99cb-54c501242f8a","Type":"ContainerStarted","Data":"c948a4922f2443ade62080e0c2270b29cf27922580a853fbb9be4db1270a6434"}
Apr 22 14:18:32.845453 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:32.845422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2bnpq" event={"ID":"56ff2a83-14c9-44a2-99cb-54c501242f8a","Type":"ContainerStarted","Data":"ff1e0819f574b8acd2cdaa93031532ca099615dee319a65734130a46565048ca"}
Apr 22 14:18:32.845453 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:32.845458 2576
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2bnpq" event={"ID":"56ff2a83-14c9-44a2-99cb-54c501242f8a","Type":"ContainerStarted","Data":"b4ab51bdd077b8ee60ce40ee985291a91064f42ce3ab3a630ff58512b73fa49f"} Apr 22 14:18:32.867982 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:18:32.867931 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2bnpq" podStartSLOduration=2.117492082 podStartE2EDuration="2.867916997s" podCreationTimestamp="2026-04-22 14:18:30 +0000 UTC" firstStartedPulling="2026-04-22 14:18:30.884364178 +0000 UTC m=+181.210098577" lastFinishedPulling="2026-04-22 14:18:31.634789098 +0000 UTC m=+181.960523492" observedRunningTime="2026-04-22 14:18:32.866749248 +0000 UTC m=+183.192483662" watchObservedRunningTime="2026-04-22 14:18:32.867916997 +0000 UTC m=+183.193651412" Apr 22 14:19:03.400232 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:03.400172 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q" podUID="cbe7e75b-bd42-4538-9392-41d086dc6e34" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 14:19:13.400637 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:13.400591 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q" podUID="cbe7e75b-bd42-4538-9392-41d086dc6e34" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 14:19:23.400213 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:23.400171 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q" podUID="cbe7e75b-bd42-4538-9392-41d086dc6e34" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 
22 14:19:23.400580 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:23.400238 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q" Apr 22 14:19:23.400696 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:23.400668 2576 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"7ba88e02680b72e476bff1f5f700e8344da29e8683529905d90f81ca2465b87f"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 14:19:23.400732 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:23.400716 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q" podUID="cbe7e75b-bd42-4538-9392-41d086dc6e34" containerName="service-proxy" containerID="cri-o://7ba88e02680b72e476bff1f5f700e8344da29e8683529905d90f81ca2465b87f" gracePeriod=30 Apr 22 14:19:23.971922 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:23.971887 2576 generic.go:358] "Generic (PLEG): container finished" podID="cbe7e75b-bd42-4538-9392-41d086dc6e34" containerID="7ba88e02680b72e476bff1f5f700e8344da29e8683529905d90f81ca2465b87f" exitCode=2 Apr 22 14:19:23.972099 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:23.971948 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q" event={"ID":"cbe7e75b-bd42-4538-9392-41d086dc6e34","Type":"ContainerDied","Data":"7ba88e02680b72e476bff1f5f700e8344da29e8683529905d90f81ca2465b87f"} Apr 22 14:19:23.972099 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:23.971973 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-db7dbbfc5-pxs4q" 
event={"ID":"cbe7e75b-bd42-4538-9392-41d086dc6e34","Type":"ContainerStarted","Data":"51553febd2278b827005fceb08b3be480270f16c3570415155b5df1a9da5a39d"} Apr 22 14:19:42.120446 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:42.120332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs\") pod \"network-metrics-daemon-r7psp\" (UID: \"dbee0e27-41ea-4d42-84c7-681872bfcda1\") " pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:19:42.122664 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:42.122640 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbee0e27-41ea-4d42-84c7-681872bfcda1-metrics-certs\") pod \"network-metrics-daemon-r7psp\" (UID: \"dbee0e27-41ea-4d42-84c7-681872bfcda1\") " pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:19:42.391437 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:42.391352 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mfkfw\"" Apr 22 14:19:42.398725 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:42.398708 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r7psp" Apr 22 14:19:42.507933 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:42.507895 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-r7psp"] Apr 22 14:19:42.510496 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:19:42.510463 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbee0e27_41ea_4d42_84c7_681872bfcda1.slice/crio-f9d5ae64ea62131022b90dfec924cbe6c32457a479d6f5df62be1febe9dd8792 WatchSource:0}: Error finding container f9d5ae64ea62131022b90dfec924cbe6c32457a479d6f5df62be1febe9dd8792: Status 404 returned error can't find the container with id f9d5ae64ea62131022b90dfec924cbe6c32457a479d6f5df62be1febe9dd8792 Apr 22 14:19:43.019192 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:43.019152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r7psp" event={"ID":"dbee0e27-41ea-4d42-84c7-681872bfcda1","Type":"ContainerStarted","Data":"f9d5ae64ea62131022b90dfec924cbe6c32457a479d6f5df62be1febe9dd8792"} Apr 22 14:19:44.023219 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:44.023178 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r7psp" event={"ID":"dbee0e27-41ea-4d42-84c7-681872bfcda1","Type":"ContainerStarted","Data":"0e2fc7087c5c218267287893a8496e2dd619d82b7d5759b2ca833001f41c3ca9"} Apr 22 14:19:44.023219 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:44.023221 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r7psp" event={"ID":"dbee0e27-41ea-4d42-84c7-681872bfcda1","Type":"ContainerStarted","Data":"a080e582ce53e75a2643020cf81d568bd10b8a3c89c2ee5953d493c7bcff7aaf"} Apr 22 14:19:44.041144 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:19:44.041091 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-r7psp" podStartSLOduration=253.020034195 podStartE2EDuration="4m14.041074826s" podCreationTimestamp="2026-04-22 14:15:30 +0000 UTC" firstStartedPulling="2026-04-22 14:19:42.512606673 +0000 UTC m=+252.838341069" lastFinishedPulling="2026-04-22 14:19:43.533647292 +0000 UTC m=+253.859381700" observedRunningTime="2026-04-22 14:19:44.040180022 +0000 UTC m=+254.365914437" watchObservedRunningTime="2026-04-22 14:19:44.041074826 +0000 UTC m=+254.366809241" Apr 22 14:20:09.836492 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:09.836455 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fmwsm"] Apr 22 14:20:09.839344 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:09.839325 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmwsm" Apr 22 14:20:09.841947 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:09.841929 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 14:20:09.847143 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:09.847122 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fmwsm"] Apr 22 14:20:09.911107 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:09.911074 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/634576c9-9317-4b50-810b-568419af85ba-original-pull-secret\") pod \"global-pull-secret-syncer-fmwsm\" (UID: \"634576c9-9317-4b50-810b-568419af85ba\") " pod="kube-system/global-pull-secret-syncer-fmwsm" Apr 22 14:20:09.911107 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:09.911110 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/634576c9-9317-4b50-810b-568419af85ba-dbus\") pod \"global-pull-secret-syncer-fmwsm\" (UID: \"634576c9-9317-4b50-810b-568419af85ba\") " pod="kube-system/global-pull-secret-syncer-fmwsm" Apr 22 14:20:09.911278 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:09.911180 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/634576c9-9317-4b50-810b-568419af85ba-kubelet-config\") pod \"global-pull-secret-syncer-fmwsm\" (UID: \"634576c9-9317-4b50-810b-568419af85ba\") " pod="kube-system/global-pull-secret-syncer-fmwsm" Apr 22 14:20:10.011860 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:10.011799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/634576c9-9317-4b50-810b-568419af85ba-original-pull-secret\") pod \"global-pull-secret-syncer-fmwsm\" (UID: \"634576c9-9317-4b50-810b-568419af85ba\") " pod="kube-system/global-pull-secret-syncer-fmwsm" Apr 22 14:20:10.011860 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:10.011865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/634576c9-9317-4b50-810b-568419af85ba-dbus\") pod \"global-pull-secret-syncer-fmwsm\" (UID: \"634576c9-9317-4b50-810b-568419af85ba\") " pod="kube-system/global-pull-secret-syncer-fmwsm" Apr 22 14:20:10.012060 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:10.011926 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/634576c9-9317-4b50-810b-568419af85ba-kubelet-config\") pod \"global-pull-secret-syncer-fmwsm\" (UID: \"634576c9-9317-4b50-810b-568419af85ba\") " pod="kube-system/global-pull-secret-syncer-fmwsm" Apr 22 14:20:10.012060 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:10.011995 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/634576c9-9317-4b50-810b-568419af85ba-kubelet-config\") pod \"global-pull-secret-syncer-fmwsm\" (UID: \"634576c9-9317-4b50-810b-568419af85ba\") " pod="kube-system/global-pull-secret-syncer-fmwsm" Apr 22 14:20:10.012060 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:10.012049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/634576c9-9317-4b50-810b-568419af85ba-dbus\") pod \"global-pull-secret-syncer-fmwsm\" (UID: \"634576c9-9317-4b50-810b-568419af85ba\") " pod="kube-system/global-pull-secret-syncer-fmwsm" Apr 22 14:20:10.014083 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:10.014065 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/634576c9-9317-4b50-810b-568419af85ba-original-pull-secret\") pod \"global-pull-secret-syncer-fmwsm\" (UID: \"634576c9-9317-4b50-810b-568419af85ba\") " pod="kube-system/global-pull-secret-syncer-fmwsm" Apr 22 14:20:10.148614 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:10.148533 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmwsm" Apr 22 14:20:10.257722 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:10.257696 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fmwsm"] Apr 22 14:20:10.260801 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:20:10.260776 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod634576c9_9317_4b50_810b_568419af85ba.slice/crio-1dd3f2631a3f390c16504a5d48d9c6e67aa4d9487d80bef271afad805d85e91c WatchSource:0}: Error finding container 1dd3f2631a3f390c16504a5d48d9c6e67aa4d9487d80bef271afad805d85e91c: Status 404 returned error can't find the container with id 1dd3f2631a3f390c16504a5d48d9c6e67aa4d9487d80bef271afad805d85e91c Apr 22 14:20:11.093468 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:11.093437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fmwsm" event={"ID":"634576c9-9317-4b50-810b-568419af85ba","Type":"ContainerStarted","Data":"1dd3f2631a3f390c16504a5d48d9c6e67aa4d9487d80bef271afad805d85e91c"} Apr 22 14:20:15.106557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:15.106473 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fmwsm" event={"ID":"634576c9-9317-4b50-810b-568419af85ba","Type":"ContainerStarted","Data":"1fdb4cc5786a3dc87f87a68d952cf3541a60a16327ebca6c466cba8d793747a7"} Apr 22 14:20:15.124013 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:15.123970 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fmwsm" podStartSLOduration=1.548714245 podStartE2EDuration="6.123957625s" podCreationTimestamp="2026-04-22 14:20:09 +0000 UTC" firstStartedPulling="2026-04-22 14:20:10.262207578 +0000 UTC m=+280.587941972" lastFinishedPulling="2026-04-22 14:20:14.837450959 +0000 UTC m=+285.163185352" 
observedRunningTime="2026-04-22 14:20:15.122744505 +0000 UTC m=+285.448478920" watchObservedRunningTime="2026-04-22 14:20:15.123957625 +0000 UTC m=+285.449692100" Apr 22 14:20:30.150750 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:30.150708 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 14:20:30.152028 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:30.152002 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 14:20:30.153710 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:20:30.153689 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 14:21:57.244942 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:57.244915 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56"] Apr 22 14:21:57.247645 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:57.247631 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56" Apr 22 14:21:57.251731 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:57.251533 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-gtznj\"" Apr 22 14:21:57.251731 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:57.251633 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 14:21:57.252784 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:57.252764 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 22 14:21:57.252953 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:57.252788 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 14:21:57.252953 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:57.252790 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 14:21:57.252953 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:57.252765 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 14:21:57.261000 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:57.260980 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56"] Apr 22 14:21:57.402660 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:57.402635 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/106d88f5-f80a-40e6-abea-d7fb8301da35-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2xq56\" (UID: \"106d88f5-f80a-40e6-abea-d7fb8301da35\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56" Apr 22 
14:21:57.402792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:57.402667 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/106d88f5-f80a-40e6-abea-d7fb8301da35-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-2xq56\" (UID: \"106d88f5-f80a-40e6-abea-d7fb8301da35\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56" Apr 22 14:21:57.402792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:57.402686 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vrqj\" (UniqueName: \"kubernetes.io/projected/106d88f5-f80a-40e6-abea-d7fb8301da35-kube-api-access-6vrqj\") pod \"keda-metrics-apiserver-7c9f485588-2xq56\" (UID: \"106d88f5-f80a-40e6-abea-d7fb8301da35\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56" Apr 22 14:21:57.503410 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:57.503339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/106d88f5-f80a-40e6-abea-d7fb8301da35-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2xq56\" (UID: \"106d88f5-f80a-40e6-abea-d7fb8301da35\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56" Apr 22 14:21:57.503410 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:57.503372 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/106d88f5-f80a-40e6-abea-d7fb8301da35-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-2xq56\" (UID: \"106d88f5-f80a-40e6-abea-d7fb8301da35\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56" Apr 22 14:21:57.503410 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:57.503389 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vrqj\" (UniqueName: 
\"kubernetes.io/projected/106d88f5-f80a-40e6-abea-d7fb8301da35-kube-api-access-6vrqj\") pod \"keda-metrics-apiserver-7c9f485588-2xq56\" (UID: \"106d88f5-f80a-40e6-abea-d7fb8301da35\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56" Apr 22 14:21:57.503638 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:21:57.503503 2576 secret.go:281] references non-existent secret key: tls.crt Apr 22 14:21:57.503638 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:21:57.503521 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 14:21:57.503638 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:21:57.503541 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56: references non-existent secret key: tls.crt Apr 22 14:21:57.503638 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:21:57.503617 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/106d88f5-f80a-40e6-abea-d7fb8301da35-certificates podName:106d88f5-f80a-40e6-abea-d7fb8301da35 nodeName:}" failed. No retries permitted until 2026-04-22 14:21:58.0035954 +0000 UTC m=+388.329329794 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/106d88f5-f80a-40e6-abea-d7fb8301da35-certificates") pod "keda-metrics-apiserver-7c9f485588-2xq56" (UID: "106d88f5-f80a-40e6-abea-d7fb8301da35") : references non-existent secret key: tls.crt Apr 22 14:21:57.503838 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:57.503696 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/106d88f5-f80a-40e6-abea-d7fb8301da35-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-2xq56\" (UID: \"106d88f5-f80a-40e6-abea-d7fb8301da35\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56" Apr 22 14:21:57.512399 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:57.512369 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vrqj\" (UniqueName: \"kubernetes.io/projected/106d88f5-f80a-40e6-abea-d7fb8301da35-kube-api-access-6vrqj\") pod \"keda-metrics-apiserver-7c9f485588-2xq56\" (UID: \"106d88f5-f80a-40e6-abea-d7fb8301da35\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56" Apr 22 14:21:58.007986 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:58.007967 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/106d88f5-f80a-40e6-abea-d7fb8301da35-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2xq56\" (UID: \"106d88f5-f80a-40e6-abea-d7fb8301da35\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56" Apr 22 14:21:58.008125 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:21:58.008074 2576 secret.go:281] references non-existent secret key: tls.crt Apr 22 14:21:58.008125 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:21:58.008084 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 14:21:58.008125 ip-10-0-142-195 kubenswrapper[2576]: E0422 
14:21:58.008099 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56: references non-existent secret key: tls.crt Apr 22 14:21:58.008228 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:21:58.008141 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/106d88f5-f80a-40e6-abea-d7fb8301da35-certificates podName:106d88f5-f80a-40e6-abea-d7fb8301da35 nodeName:}" failed. No retries permitted until 2026-04-22 14:21:59.008129669 +0000 UTC m=+389.333864062 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/106d88f5-f80a-40e6-abea-d7fb8301da35-certificates") pod "keda-metrics-apiserver-7c9f485588-2xq56" (UID: "106d88f5-f80a-40e6-abea-d7fb8301da35") : references non-existent secret key: tls.crt Apr 22 14:21:59.014699 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:21:59.014673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/106d88f5-f80a-40e6-abea-d7fb8301da35-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2xq56\" (UID: \"106d88f5-f80a-40e6-abea-d7fb8301da35\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56" Apr 22 14:21:59.015077 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:21:59.014781 2576 secret.go:281] references non-existent secret key: tls.crt Apr 22 14:21:59.015077 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:21:59.014793 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 14:21:59.015077 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:21:59.014825 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56: references non-existent secret key: tls.crt Apr 22 14:21:59.015077 ip-10-0-142-195 kubenswrapper[2576]: 
E0422 14:21:59.014887 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/106d88f5-f80a-40e6-abea-d7fb8301da35-certificates podName:106d88f5-f80a-40e6-abea-d7fb8301da35 nodeName:}" failed. No retries permitted until 2026-04-22 14:22:01.014870303 +0000 UTC m=+391.340604700 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/106d88f5-f80a-40e6-abea-d7fb8301da35-certificates") pod "keda-metrics-apiserver-7c9f485588-2xq56" (UID: "106d88f5-f80a-40e6-abea-d7fb8301da35") : references non-existent secret key: tls.crt Apr 22 14:22:01.026258 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:22:01.026226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/106d88f5-f80a-40e6-abea-d7fb8301da35-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2xq56\" (UID: \"106d88f5-f80a-40e6-abea-d7fb8301da35\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56" Apr 22 14:22:01.028594 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:22:01.028576 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/106d88f5-f80a-40e6-abea-d7fb8301da35-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2xq56\" (UID: \"106d88f5-f80a-40e6-abea-d7fb8301da35\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56" Apr 22 14:22:01.159489 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:22:01.159465 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56"
Apr 22 14:22:01.271247 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:22:01.271217 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56"]
Apr 22 14:22:01.274105 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:22:01.274081 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod106d88f5_f80a_40e6_abea_d7fb8301da35.slice/crio-997e91a9b05f73157bb39d987cd4c405cdaa459c9b038470d43d4b2729b73cf4 WatchSource:0}: Error finding container 997e91a9b05f73157bb39d987cd4c405cdaa459c9b038470d43d4b2729b73cf4: Status 404 returned error can't find the container with id 997e91a9b05f73157bb39d987cd4c405cdaa459c9b038470d43d4b2729b73cf4
Apr 22 14:22:01.275233 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:22:01.275217 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:22:01.359204 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:22:01.359179 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56" event={"ID":"106d88f5-f80a-40e6-abea-d7fb8301da35","Type":"ContainerStarted","Data":"997e91a9b05f73157bb39d987cd4c405cdaa459c9b038470d43d4b2729b73cf4"}
Apr 22 14:22:05.369301 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:22:05.369270 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56" event={"ID":"106d88f5-f80a-40e6-abea-d7fb8301da35","Type":"ContainerStarted","Data":"5d3225ef88a8a1c03066cd9046263b070603b27ef8663b28bfeb4d1c691e7258"}
Apr 22 14:22:05.369772 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:22:05.369421 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56"
Apr 22 14:22:05.386460 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:22:05.386419 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56" podStartSLOduration=5.319395013 podStartE2EDuration="8.386405236s" podCreationTimestamp="2026-04-22 14:21:57 +0000 UTC" firstStartedPulling="2026-04-22 14:22:01.275337687 +0000 UTC m=+391.601072080" lastFinishedPulling="2026-04-22 14:22:04.342347908 +0000 UTC m=+394.668082303" observedRunningTime="2026-04-22 14:22:05.385539432 +0000 UTC m=+395.711273853" watchObservedRunningTime="2026-04-22 14:22:05.386405236 +0000 UTC m=+395.712139650"
Apr 22 14:22:16.375868 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:22:16.375840 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2xq56"
Apr 22 14:23:13.422280 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.422244 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-9pjvw"]
Apr 22 14:23:13.425411 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.425391 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-9pjvw"
Apr 22 14:23:13.429397 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.429363 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 22 14:23:13.429397 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.429382 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 14:23:13.430177 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.430145 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 14:23:13.430177 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.430145 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-2cbcr\""
Apr 22 14:23:13.441058 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.441038 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-9pjvw"]
Apr 22 14:23:13.463197 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.463172 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-5wkgh"]
Apr 22 14:23:13.467303 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.467177 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-5wkgh"
Apr 22 14:23:13.470298 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.470263 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-m75pq\""
Apr 22 14:23:13.470405 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.470396 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 22 14:23:13.479762 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.479739 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-5wkgh"]
Apr 22 14:23:13.513133 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.513112 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/faa255cd-039a-42fe-8393-76525944bc1a-data\") pod \"seaweedfs-86cc847c5c-5wkgh\" (UID: \"faa255cd-039a-42fe-8393-76525944bc1a\") " pod="kserve/seaweedfs-86cc847c5c-5wkgh"
Apr 22 14:23:13.513236 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.513161 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stq9x\" (UniqueName: \"kubernetes.io/projected/95948c22-da6d-425e-b9ab-3ce2bcb73905-kube-api-access-stq9x\") pod \"llmisvc-controller-manager-68cc5db7c4-9pjvw\" (UID: \"95948c22-da6d-425e-b9ab-3ce2bcb73905\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-9pjvw"
Apr 22 14:23:13.513236 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.513229 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdxds\" (UniqueName: \"kubernetes.io/projected/faa255cd-039a-42fe-8393-76525944bc1a-kube-api-access-rdxds\") pod \"seaweedfs-86cc847c5c-5wkgh\" (UID: \"faa255cd-039a-42fe-8393-76525944bc1a\") " pod="kserve/seaweedfs-86cc847c5c-5wkgh"
Apr 22 14:23:13.513329 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.513261 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95948c22-da6d-425e-b9ab-3ce2bcb73905-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-9pjvw\" (UID: \"95948c22-da6d-425e-b9ab-3ce2bcb73905\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-9pjvw"
Apr 22 14:23:13.614584 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.614552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stq9x\" (UniqueName: \"kubernetes.io/projected/95948c22-da6d-425e-b9ab-3ce2bcb73905-kube-api-access-stq9x\") pod \"llmisvc-controller-manager-68cc5db7c4-9pjvw\" (UID: \"95948c22-da6d-425e-b9ab-3ce2bcb73905\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-9pjvw"
Apr 22 14:23:13.614747 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.614596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdxds\" (UniqueName: \"kubernetes.io/projected/faa255cd-039a-42fe-8393-76525944bc1a-kube-api-access-rdxds\") pod \"seaweedfs-86cc847c5c-5wkgh\" (UID: \"faa255cd-039a-42fe-8393-76525944bc1a\") " pod="kserve/seaweedfs-86cc847c5c-5wkgh"
Apr 22 14:23:13.614747 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.614616 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95948c22-da6d-425e-b9ab-3ce2bcb73905-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-9pjvw\" (UID: \"95948c22-da6d-425e-b9ab-3ce2bcb73905\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-9pjvw"
Apr 22 14:23:13.614747 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.614639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/faa255cd-039a-42fe-8393-76525944bc1a-data\") pod \"seaweedfs-86cc847c5c-5wkgh\" (UID: \"faa255cd-039a-42fe-8393-76525944bc1a\") " pod="kserve/seaweedfs-86cc847c5c-5wkgh"
Apr 22 14:23:13.615055 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.615034 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/faa255cd-039a-42fe-8393-76525944bc1a-data\") pod \"seaweedfs-86cc847c5c-5wkgh\" (UID: \"faa255cd-039a-42fe-8393-76525944bc1a\") " pod="kserve/seaweedfs-86cc847c5c-5wkgh"
Apr 22 14:23:13.617048 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.617028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95948c22-da6d-425e-b9ab-3ce2bcb73905-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-9pjvw\" (UID: \"95948c22-da6d-425e-b9ab-3ce2bcb73905\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-9pjvw"
Apr 22 14:23:13.626707 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.626676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdxds\" (UniqueName: \"kubernetes.io/projected/faa255cd-039a-42fe-8393-76525944bc1a-kube-api-access-rdxds\") pod \"seaweedfs-86cc847c5c-5wkgh\" (UID: \"faa255cd-039a-42fe-8393-76525944bc1a\") " pod="kserve/seaweedfs-86cc847c5c-5wkgh"
Apr 22 14:23:13.627090 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.627075 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stq9x\" (UniqueName: \"kubernetes.io/projected/95948c22-da6d-425e-b9ab-3ce2bcb73905-kube-api-access-stq9x\") pod \"llmisvc-controller-manager-68cc5db7c4-9pjvw\" (UID: \"95948c22-da6d-425e-b9ab-3ce2bcb73905\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-9pjvw"
Apr 22 14:23:13.734292 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.734272 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-9pjvw"
Apr 22 14:23:13.776169 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.776136 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-5wkgh"
Apr 22 14:23:13.855398 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.855367 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-9pjvw"]
Apr 22 14:23:13.858173 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:23:13.858150 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod95948c22_da6d_425e_b9ab_3ce2bcb73905.slice/crio-6c17909d68105d9cbaed4eea24192cad6e97a0afbab97a9a50c11e20094e53d9 WatchSource:0}: Error finding container 6c17909d68105d9cbaed4eea24192cad6e97a0afbab97a9a50c11e20094e53d9: Status 404 returned error can't find the container with id 6c17909d68105d9cbaed4eea24192cad6e97a0afbab97a9a50c11e20094e53d9
Apr 22 14:23:13.911670 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:13.911562 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-5wkgh"]
Apr 22 14:23:13.914037 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:23:13.914012 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaa255cd_039a_42fe_8393_76525944bc1a.slice/crio-57bb56e54a02043adc06be2a5f9d9781438517cb8de9432ace8c60dee530a126 WatchSource:0}: Error finding container 57bb56e54a02043adc06be2a5f9d9781438517cb8de9432ace8c60dee530a126: Status 404 returned error can't find the container with id 57bb56e54a02043adc06be2a5f9d9781438517cb8de9432ace8c60dee530a126
Apr 22 14:23:14.535405 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:14.535364 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-9pjvw" event={"ID":"95948c22-da6d-425e-b9ab-3ce2bcb73905","Type":"ContainerStarted","Data":"6c17909d68105d9cbaed4eea24192cad6e97a0afbab97a9a50c11e20094e53d9"}
Apr 22 14:23:14.536570 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:14.536510 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-5wkgh" event={"ID":"faa255cd-039a-42fe-8393-76525944bc1a","Type":"ContainerStarted","Data":"57bb56e54a02043adc06be2a5f9d9781438517cb8de9432ace8c60dee530a126"}
Apr 22 14:23:17.547216 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:17.547185 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-5wkgh" event={"ID":"faa255cd-039a-42fe-8393-76525944bc1a","Type":"ContainerStarted","Data":"53051bc90123cd829364561c077d8146c8cfdeca211c396b52a4fe0aefb37040"}
Apr 22 14:23:17.547671 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:17.547255 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-5wkgh"
Apr 22 14:23:17.548421 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:17.548398 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-9pjvw" event={"ID":"95948c22-da6d-425e-b9ab-3ce2bcb73905","Type":"ContainerStarted","Data":"7b6fbfb0310cde8658a202b3486ce1d457c49c331f38246e62b82d42eb825813"}
Apr 22 14:23:17.548541 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:17.548531 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-9pjvw"
Apr 22 14:23:17.591377 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:17.591337 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-9pjvw" podStartSLOduration=1.4035381120000001 podStartE2EDuration="4.591325858s" podCreationTimestamp="2026-04-22 14:23:13 +0000 UTC" firstStartedPulling="2026-04-22 14:23:13.859442374 +0000 UTC m=+464.185176781" lastFinishedPulling="2026-04-22 14:23:17.047230134 +0000 UTC m=+467.372964527" observedRunningTime="2026-04-22 14:23:17.590136566 +0000 UTC m=+467.915870980" watchObservedRunningTime="2026-04-22 14:23:17.591325858 +0000 UTC m=+467.917060273"
Apr 22 14:23:17.591585 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:17.591563 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-5wkgh" podStartSLOduration=1.408110531 podStartE2EDuration="4.59155757s" podCreationTimestamp="2026-04-22 14:23:13 +0000 UTC" firstStartedPulling="2026-04-22 14:23:13.915306584 +0000 UTC m=+464.241040977" lastFinishedPulling="2026-04-22 14:23:17.098753623 +0000 UTC m=+467.424488016" observedRunningTime="2026-04-22 14:23:17.568433709 +0000 UTC m=+467.894168146" watchObservedRunningTime="2026-04-22 14:23:17.59155757 +0000 UTC m=+467.917291988"
Apr 22 14:23:23.553284 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:23.553249 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-5wkgh"
Apr 22 14:23:48.553319 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:23:48.553294 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-9pjvw"
Apr 22 14:24:23.475163 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:23.475087 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-c2wh7"]
Apr 22 14:24:23.478097 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:23.478081 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-c2wh7"
Apr 22 14:24:23.481002 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:23.480979 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-xfc2j\""
Apr 22 14:24:23.481260 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:23.481246 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 22 14:24:23.488438 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:23.488419 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-c2wh7"]
Apr 22 14:24:23.664554 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:23.664528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5a1967a-f314-4ecc-81d9-16eb90609713-cert\") pod \"odh-model-controller-696fc77849-c2wh7\" (UID: \"e5a1967a-f314-4ecc-81d9-16eb90609713\") " pod="kserve/odh-model-controller-696fc77849-c2wh7"
Apr 22 14:24:23.664669 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:23.664558 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjxdr\" (UniqueName: \"kubernetes.io/projected/e5a1967a-f314-4ecc-81d9-16eb90609713-kube-api-access-cjxdr\") pod \"odh-model-controller-696fc77849-c2wh7\" (UID: \"e5a1967a-f314-4ecc-81d9-16eb90609713\") " pod="kserve/odh-model-controller-696fc77849-c2wh7"
Apr 22 14:24:23.765395 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:23.765335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5a1967a-f314-4ecc-81d9-16eb90609713-cert\") pod \"odh-model-controller-696fc77849-c2wh7\" (UID: \"e5a1967a-f314-4ecc-81d9-16eb90609713\") " pod="kserve/odh-model-controller-696fc77849-c2wh7"
Apr 22 14:24:23.765395 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:23.765368 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjxdr\" (UniqueName: \"kubernetes.io/projected/e5a1967a-f314-4ecc-81d9-16eb90609713-kube-api-access-cjxdr\") pod \"odh-model-controller-696fc77849-c2wh7\" (UID: \"e5a1967a-f314-4ecc-81d9-16eb90609713\") " pod="kserve/odh-model-controller-696fc77849-c2wh7"
Apr 22 14:24:23.765547 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:24:23.765460 2576 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 22 14:24:23.765547 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:24:23.765512 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5a1967a-f314-4ecc-81d9-16eb90609713-cert podName:e5a1967a-f314-4ecc-81d9-16eb90609713 nodeName:}" failed. No retries permitted until 2026-04-22 14:24:24.265496377 +0000 UTC m=+534.591230771 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5a1967a-f314-4ecc-81d9-16eb90609713-cert") pod "odh-model-controller-696fc77849-c2wh7" (UID: "e5a1967a-f314-4ecc-81d9-16eb90609713") : secret "odh-model-controller-webhook-cert" not found
Apr 22 14:24:23.775435 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:23.775408 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjxdr\" (UniqueName: \"kubernetes.io/projected/e5a1967a-f314-4ecc-81d9-16eb90609713-kube-api-access-cjxdr\") pod \"odh-model-controller-696fc77849-c2wh7\" (UID: \"e5a1967a-f314-4ecc-81d9-16eb90609713\") " pod="kserve/odh-model-controller-696fc77849-c2wh7"
Apr 22 14:24:24.269351 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:24.269319 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5a1967a-f314-4ecc-81d9-16eb90609713-cert\") pod \"odh-model-controller-696fc77849-c2wh7\" (UID: \"e5a1967a-f314-4ecc-81d9-16eb90609713\") " pod="kserve/odh-model-controller-696fc77849-c2wh7"
Apr 22 14:24:24.271590 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:24.271562 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5a1967a-f314-4ecc-81d9-16eb90609713-cert\") pod \"odh-model-controller-696fc77849-c2wh7\" (UID: \"e5a1967a-f314-4ecc-81d9-16eb90609713\") " pod="kserve/odh-model-controller-696fc77849-c2wh7"
Apr 22 14:24:24.387657 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:24.387626 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-c2wh7"
Apr 22 14:24:24.508914 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:24.508886 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-c2wh7"]
Apr 22 14:24:24.511922 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:24:24.511891 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5a1967a_f314_4ecc_81d9_16eb90609713.slice/crio-48ea12196798b55bce2957bb5cd6f87a7ea15a8a89a592be2221b5b23700f734 WatchSource:0}: Error finding container 48ea12196798b55bce2957bb5cd6f87a7ea15a8a89a592be2221b5b23700f734: Status 404 returned error can't find the container with id 48ea12196798b55bce2957bb5cd6f87a7ea15a8a89a592be2221b5b23700f734
Apr 22 14:24:24.709320 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:24.709262 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-c2wh7" event={"ID":"e5a1967a-f314-4ecc-81d9-16eb90609713","Type":"ContainerStarted","Data":"48ea12196798b55bce2957bb5cd6f87a7ea15a8a89a592be2221b5b23700f734"}
Apr 22 14:24:27.718713 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:27.718675 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-c2wh7" event={"ID":"e5a1967a-f314-4ecc-81d9-16eb90609713","Type":"ContainerStarted","Data":"fd22ae51632aa6c7373215b43dd8a6860d58ce3e31c867692c558789cdc422bd"}
Apr 22 14:24:27.719080 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:27.718825 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-c2wh7"
Apr 22 14:24:27.742492 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:27.742450 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-c2wh7" podStartSLOduration=2.433512066 podStartE2EDuration="4.742437964s" podCreationTimestamp="2026-04-22 14:24:23 +0000 UTC" firstStartedPulling="2026-04-22 14:24:24.51302574 +0000 UTC m=+534.838760133" lastFinishedPulling="2026-04-22 14:24:26.821951632 +0000 UTC m=+537.147686031" observedRunningTime="2026-04-22 14:24:27.74140088 +0000 UTC m=+538.067135304" watchObservedRunningTime="2026-04-22 14:24:27.742437964 +0000 UTC m=+538.068172377"
Apr 22 14:24:38.723154 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:38.723124 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-c2wh7"
Apr 22 14:24:58.753994 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:58.753959 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-4svkh"]
Apr 22 14:24:58.757049 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:58.757034 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-4svkh"
Apr 22 14:24:58.761212 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:58.761183 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 22 14:24:58.761390 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:58.761377 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\""
Apr 22 14:24:58.770137 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:58.770114 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-4svkh"]
Apr 22 14:24:58.794384 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:58.794357 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcnzf\" (UniqueName: \"kubernetes.io/projected/1ade1b13-4eaf-4e7b-ae1b-f5439819135b-kube-api-access-kcnzf\") pod \"seaweedfs-tls-custom-5c88b85bb7-4svkh\" (UID: \"1ade1b13-4eaf-4e7b-ae1b-f5439819135b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-4svkh"
Apr 22 14:24:58.794496 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:58.794389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/1ade1b13-4eaf-4e7b-ae1b-f5439819135b-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-4svkh\" (UID: \"1ade1b13-4eaf-4e7b-ae1b-f5439819135b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-4svkh"
Apr 22 14:24:58.794496 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:58.794413 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1ade1b13-4eaf-4e7b-ae1b-f5439819135b-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-4svkh\" (UID: \"1ade1b13-4eaf-4e7b-ae1b-f5439819135b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-4svkh"
Apr 22 14:24:58.895180 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:58.895149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcnzf\" (UniqueName: \"kubernetes.io/projected/1ade1b13-4eaf-4e7b-ae1b-f5439819135b-kube-api-access-kcnzf\") pod \"seaweedfs-tls-custom-5c88b85bb7-4svkh\" (UID: \"1ade1b13-4eaf-4e7b-ae1b-f5439819135b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-4svkh"
Apr 22 14:24:58.895180 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:58.895183 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/1ade1b13-4eaf-4e7b-ae1b-f5439819135b-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-4svkh\" (UID: \"1ade1b13-4eaf-4e7b-ae1b-f5439819135b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-4svkh"
Apr 22 14:24:58.895382 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:58.895200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1ade1b13-4eaf-4e7b-ae1b-f5439819135b-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-4svkh\" (UID: \"1ade1b13-4eaf-4e7b-ae1b-f5439819135b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-4svkh"
Apr 22 14:24:58.895603 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:58.895582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1ade1b13-4eaf-4e7b-ae1b-f5439819135b-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-4svkh\" (UID: \"1ade1b13-4eaf-4e7b-ae1b-f5439819135b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-4svkh"
Apr 22 14:24:58.897555 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:58.897533 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/1ade1b13-4eaf-4e7b-ae1b-f5439819135b-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-4svkh\" (UID: \"1ade1b13-4eaf-4e7b-ae1b-f5439819135b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-4svkh"
Apr 22 14:24:58.904383 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:58.904363 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcnzf\" (UniqueName: \"kubernetes.io/projected/1ade1b13-4eaf-4e7b-ae1b-f5439819135b-kube-api-access-kcnzf\") pod \"seaweedfs-tls-custom-5c88b85bb7-4svkh\" (UID: \"1ade1b13-4eaf-4e7b-ae1b-f5439819135b\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-4svkh"
Apr 22 14:24:59.065617 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:59.065540 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-4svkh"
Apr 22 14:24:59.185210 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:59.185176 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-4svkh"]
Apr 22 14:24:59.189403 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:24:59.189381 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ade1b13_4eaf_4e7b_ae1b_f5439819135b.slice/crio-ecfd3097abaca53d9f6f5692041b30e799e82e1f3cf1f62c7308b0e2d061006b WatchSource:0}: Error finding container ecfd3097abaca53d9f6f5692041b30e799e82e1f3cf1f62c7308b0e2d061006b: Status 404 returned error can't find the container with id ecfd3097abaca53d9f6f5692041b30e799e82e1f3cf1f62c7308b0e2d061006b
Apr 22 14:24:59.806080 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:59.806052 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-4svkh" event={"ID":"1ade1b13-4eaf-4e7b-ae1b-f5439819135b","Type":"ContainerStarted","Data":"7bb153331d5339650d1a09af2a7a58eab1a7f201681c10d11fea59e35031834d"}
Apr 22 14:24:59.806080 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:59.806082 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-4svkh" event={"ID":"1ade1b13-4eaf-4e7b-ae1b-f5439819135b","Type":"ContainerStarted","Data":"ecfd3097abaca53d9f6f5692041b30e799e82e1f3cf1f62c7308b0e2d061006b"}
Apr 22 14:24:59.824186 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:24:59.824145 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-4svkh" podStartSLOduration=1.547732566 podStartE2EDuration="1.824132336s" podCreationTimestamp="2026-04-22 14:24:58 +0000 UTC" firstStartedPulling="2026-04-22 14:24:59.190871364 +0000 UTC m=+569.516605770" lastFinishedPulling="2026-04-22 14:24:59.467271146 +0000 UTC m=+569.793005540" observedRunningTime="2026-04-22 14:24:59.821975148 +0000 UTC m=+570.147709563" watchObservedRunningTime="2026-04-22 14:24:59.824132336 +0000 UTC m=+570.149866773"
Apr 22 14:25:08.406621 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:08.406590 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m"]
Apr 22 14:25:08.409577 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:08.409562 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m"
Apr 22 14:25:08.415831 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:08.415596 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\""
Apr 22 14:25:08.416122 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:08.416106 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\""
Apr 22 14:25:08.424017 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:08.423997 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m"]
Apr 22 14:25:08.467820 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:08.467788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/ad315599-98f1-46ea-b529-63e21ca49370-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-hnw8m\" (UID: \"ad315599-98f1-46ea-b529-63e21ca49370\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m"
Apr 22 14:25:08.467943 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:08.467866 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ad315599-98f1-46ea-b529-63e21ca49370-data\") pod \"seaweedfs-tls-serving-7fd5766db9-hnw8m\" (UID: \"ad315599-98f1-46ea-b529-63e21ca49370\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m"
Apr 22 14:25:08.467943 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:08.467901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp9bx\" (UniqueName: \"kubernetes.io/projected/ad315599-98f1-46ea-b529-63e21ca49370-kube-api-access-wp9bx\") pod \"seaweedfs-tls-serving-7fd5766db9-hnw8m\" (UID: \"ad315599-98f1-46ea-b529-63e21ca49370\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m"
Apr 22 14:25:08.568372 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:08.568350 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ad315599-98f1-46ea-b529-63e21ca49370-data\") pod \"seaweedfs-tls-serving-7fd5766db9-hnw8m\" (UID: \"ad315599-98f1-46ea-b529-63e21ca49370\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m"
Apr 22 14:25:08.568483 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:08.568377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wp9bx\" (UniqueName: \"kubernetes.io/projected/ad315599-98f1-46ea-b529-63e21ca49370-kube-api-access-wp9bx\") pod \"seaweedfs-tls-serving-7fd5766db9-hnw8m\" (UID: \"ad315599-98f1-46ea-b529-63e21ca49370\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m"
Apr 22 14:25:08.568483 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:08.568428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/ad315599-98f1-46ea-b529-63e21ca49370-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-hnw8m\" (UID: \"ad315599-98f1-46ea-b529-63e21ca49370\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m"
Apr 22 14:25:08.568562 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:25:08.568530 2576 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found
Apr 22 14:25:08.568562 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:25:08.568543 2576 projected.go:194] Error preparing data for projected volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m: secret "seaweedfs-tls-serving" not found
Apr 22 14:25:08.568624 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:25:08.568608 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad315599-98f1-46ea-b529-63e21ca49370-seaweedfs-tls-serving podName:ad315599-98f1-46ea-b529-63e21ca49370 nodeName:}" failed. No retries permitted until 2026-04-22 14:25:09.068589359 +0000 UTC m=+579.394323752 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/ad315599-98f1-46ea-b529-63e21ca49370-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-hnw8m" (UID: "ad315599-98f1-46ea-b529-63e21ca49370") : secret "seaweedfs-tls-serving" not found
Apr 22 14:25:08.568669 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:08.568641 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ad315599-98f1-46ea-b529-63e21ca49370-data\") pod \"seaweedfs-tls-serving-7fd5766db9-hnw8m\" (UID: \"ad315599-98f1-46ea-b529-63e21ca49370\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m"
Apr 22 14:25:08.578795 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:08.578775 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp9bx\" (UniqueName: \"kubernetes.io/projected/ad315599-98f1-46ea-b529-63e21ca49370-kube-api-access-wp9bx\") pod \"seaweedfs-tls-serving-7fd5766db9-hnw8m\" (UID: \"ad315599-98f1-46ea-b529-63e21ca49370\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m"
Apr 22 14:25:09.071536 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:09.071490 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/ad315599-98f1-46ea-b529-63e21ca49370-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-hnw8m\" (UID: \"ad315599-98f1-46ea-b529-63e21ca49370\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m"
Apr 22 14:25:09.073706 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:09.073679 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/ad315599-98f1-46ea-b529-63e21ca49370-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-hnw8m\" (UID: \"ad315599-98f1-46ea-b529-63e21ca49370\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m"
Apr 22 14:25:09.318380 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:09.318349 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m"
Apr 22 14:25:09.433161 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:09.433137 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m"]
Apr 22 14:25:09.435127 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:25:09.435101 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad315599_98f1_46ea_b529_63e21ca49370.slice/crio-c9e002cb3207e696c33491b6bef3428c461c2d175b606b1ac746708666b0f568 WatchSource:0}: Error finding container c9e002cb3207e696c33491b6bef3428c461c2d175b606b1ac746708666b0f568: Status 404 returned error can't find the container with id c9e002cb3207e696c33491b6bef3428c461c2d175b606b1ac746708666b0f568
Apr 22 14:25:09.832950 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:09.832914 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m" event={"ID":"ad315599-98f1-46ea-b529-63e21ca49370","Type":"ContainerStarted","Data":"7d554454214711a13020d3c0ddb08e4f5055e18b86e66f8c81cce1b5f6283ef0"}
Apr 22 14:25:09.832950 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:09.832955 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m" event={"ID":"ad315599-98f1-46ea-b529-63e21ca49370","Type":"ContainerStarted","Data":"c9e002cb3207e696c33491b6bef3428c461c2d175b606b1ac746708666b0f568"}
Apr 22 14:25:09.850008 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:09.849889 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-hnw8m" podStartSLOduration=1.552400478 podStartE2EDuration="1.849872026s" podCreationTimestamp="2026-04-22 14:25:08 +0000 UTC" firstStartedPulling="2026-04-22 14:25:09.436288385 +0000 UTC m=+579.762022781" lastFinishedPulling="2026-04-22 14:25:09.733759932 +0000 UTC m=+580.059494329" observedRunningTime="2026-04-22 14:25:09.849003082 +0000 UTC m=+580.174737509" watchObservedRunningTime="2026-04-22 14:25:09.849872026 +0000 UTC m=+580.175606442"
Apr 22 14:25:27.373413 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:27.373378 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx"]
Apr 22 14:25:27.376898 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:27.376878 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" Apr 22 14:25:27.383525 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:27.383508 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-zp85x\"" Apr 22 14:25:27.389591 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:27.389569 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx"] Apr 22 14:25:27.508650 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:27.508621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cad75876-191d-4433-8f8c-1f65c63ada6e-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx\" (UID: \"cad75876-191d-4433-8f8c-1f65c63ada6e\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" Apr 22 14:25:27.609865 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:27.609839 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cad75876-191d-4433-8f8c-1f65c63ada6e-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx\" (UID: \"cad75876-191d-4433-8f8c-1f65c63ada6e\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" Apr 22 14:25:27.610161 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:27.610144 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cad75876-191d-4433-8f8c-1f65c63ada6e-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx\" (UID: \"cad75876-191d-4433-8f8c-1f65c63ada6e\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" Apr 22 14:25:27.686800 
ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:27.686723 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" Apr 22 14:25:27.811087 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:27.811032 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx"] Apr 22 14:25:27.813668 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:25:27.813640 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcad75876_191d_4433_8f8c_1f65c63ada6e.slice/crio-3b2d1feef419d1044e192b4b2f8734daa41092a13d06b7fca7ad2f73159b5f2e WatchSource:0}: Error finding container 3b2d1feef419d1044e192b4b2f8734daa41092a13d06b7fca7ad2f73159b5f2e: Status 404 returned error can't find the container with id 3b2d1feef419d1044e192b4b2f8734daa41092a13d06b7fca7ad2f73159b5f2e Apr 22 14:25:27.882510 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:27.882477 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" event={"ID":"cad75876-191d-4433-8f8c-1f65c63ada6e","Type":"ContainerStarted","Data":"3b2d1feef419d1044e192b4b2f8734daa41092a13d06b7fca7ad2f73159b5f2e"} Apr 22 14:25:30.913347 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:30.913318 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 14:25:30.913939 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:30.913717 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 14:25:31.896918 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:31.896883 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" event={"ID":"cad75876-191d-4433-8f8c-1f65c63ada6e","Type":"ContainerStarted","Data":"6f645a580c54036e3883516feed3777b13ed2fd75aa22d9a77f64d212dd8181f"} Apr 22 14:25:34.905613 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:34.905526 2576 generic.go:358] "Generic (PLEG): container finished" podID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerID="6f645a580c54036e3883516feed3777b13ed2fd75aa22d9a77f64d212dd8181f" exitCode=0 Apr 22 14:25:34.905988 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:34.905604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" event={"ID":"cad75876-191d-4433-8f8c-1f65c63ada6e","Type":"ContainerDied","Data":"6f645a580c54036e3883516feed3777b13ed2fd75aa22d9a77f64d212dd8181f"} Apr 22 14:25:47.949973 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:47.949895 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" event={"ID":"cad75876-191d-4433-8f8c-1f65c63ada6e","Type":"ContainerStarted","Data":"b694c22cf63b037ab07cdc3965116ad116dead7fd66f66805267d735b097f0e9"} Apr 22 14:25:50.959146 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:50.959111 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" event={"ID":"cad75876-191d-4433-8f8c-1f65c63ada6e","Type":"ContainerStarted","Data":"29fc68906bbb7355622f124559481791d1a95fa84f0fe7628fd47e3e951832c5"} Apr 22 14:25:50.959521 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:50.959335 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" Apr 22 14:25:50.960461 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:50.960435 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 22 14:25:50.977851 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:50.977776 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podStartSLOduration=1.805665254 podStartE2EDuration="23.977766082s" podCreationTimestamp="2026-04-22 14:25:27 +0000 UTC" firstStartedPulling="2026-04-22 14:25:27.815473441 +0000 UTC m=+598.141207834" lastFinishedPulling="2026-04-22 14:25:49.987574268 +0000 UTC m=+620.313308662" observedRunningTime="2026-04-22 14:25:50.97761608 +0000 UTC m=+621.303350493" watchObservedRunningTime="2026-04-22 14:25:50.977766082 +0000 UTC m=+621.303500496" Apr 22 14:25:51.962046 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:51.962013 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" Apr 22 14:25:51.962523 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:51.962103 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 22 14:25:51.963045 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:51.963025 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:25:52.965328 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:52.965287 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 22 14:25:52.965720 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:25:52.965689 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:26:02.965335 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:26:02.965297 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 22 14:26:02.965791 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:26:02.965623 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:26:12.966065 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:26:12.966022 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 22 14:26:13.031446 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:26:12.966468 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:26:22.966284 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:26:22.966237 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 22 14:26:22.966838 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:26:22.966639 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:26:32.965877 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:26:32.965835 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 22 14:26:32.966297 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:26:32.966241 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:26:42.966299 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:26:42.966251 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 22 14:26:42.966765 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:26:42.966740 2576 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:26:52.966005 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:26:52.965972 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" Apr 22 14:26:52.966391 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:26:52.966183 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" Apr 22 14:27:02.423132 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:02.423102 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx"] Apr 22 14:27:02.423563 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:02.423361 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="kserve-container" containerID="cri-o://b694c22cf63b037ab07cdc3965116ad116dead7fd66f66805267d735b097f0e9" gracePeriod=30 Apr 22 14:27:02.423563 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:02.423438 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="agent" containerID="cri-o://29fc68906bbb7355622f124559481791d1a95fa84f0fe7628fd47e3e951832c5" gracePeriod=30 Apr 22 14:27:02.507399 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:02.507376 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf"] Apr 22 14:27:02.510663 ip-10-0-142-195 
kubenswrapper[2576]: I0422 14:27:02.510643 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" Apr 22 14:27:02.525260 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:02.525238 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf"] Apr 22 14:27:02.667605 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:02.667579 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff2ccb8a-409b-442c-98fc-8f2417d6463b-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf\" (UID: \"ff2ccb8a-409b-442c-98fc-8f2417d6463b\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" Apr 22 14:27:02.767971 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:02.767947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff2ccb8a-409b-442c-98fc-8f2417d6463b-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf\" (UID: \"ff2ccb8a-409b-442c-98fc-8f2417d6463b\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" Apr 22 14:27:02.768231 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:02.768216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff2ccb8a-409b-442c-98fc-8f2417d6463b-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf\" (UID: \"ff2ccb8a-409b-442c-98fc-8f2417d6463b\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" Apr 22 14:27:02.825093 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:02.825071 2576 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" Apr 22 14:27:02.939066 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:02.939039 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf"] Apr 22 14:27:02.941057 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:27:02.941026 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff2ccb8a_409b_442c_98fc_8f2417d6463b.slice/crio-aaf008e5e322a526bf891cf34c8687df27e809e74c2044793588b241db33960d WatchSource:0}: Error finding container aaf008e5e322a526bf891cf34c8687df27e809e74c2044793588b241db33960d: Status 404 returned error can't find the container with id aaf008e5e322a526bf891cf34c8687df27e809e74c2044793588b241db33960d Apr 22 14:27:02.942889 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:02.942871 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:27:02.965473 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:02.965443 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 22 14:27:02.965722 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:02.965700 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:27:03.165770 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:03.165686 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" event={"ID":"ff2ccb8a-409b-442c-98fc-8f2417d6463b","Type":"ContainerStarted","Data":"1c3359e30642ce853f9c95ca75af19ba713fc04c63bcfbc86d7caef113258cb9"} Apr 22 14:27:03.165770 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:03.165724 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" event={"ID":"ff2ccb8a-409b-442c-98fc-8f2417d6463b","Type":"ContainerStarted","Data":"aaf008e5e322a526bf891cf34c8687df27e809e74c2044793588b241db33960d"} Apr 22 14:27:06.176840 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:06.176765 2576 generic.go:358] "Generic (PLEG): container finished" podID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerID="b694c22cf63b037ab07cdc3965116ad116dead7fd66f66805267d735b097f0e9" exitCode=0 Apr 22 14:27:06.177116 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:06.176839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" event={"ID":"cad75876-191d-4433-8f8c-1f65c63ada6e","Type":"ContainerDied","Data":"b694c22cf63b037ab07cdc3965116ad116dead7fd66f66805267d735b097f0e9"} Apr 22 14:27:07.180747 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:07.180684 2576 generic.go:358] "Generic (PLEG): container finished" podID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerID="1c3359e30642ce853f9c95ca75af19ba713fc04c63bcfbc86d7caef113258cb9" exitCode=0 Apr 22 14:27:07.181063 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:07.180755 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" event={"ID":"ff2ccb8a-409b-442c-98fc-8f2417d6463b","Type":"ContainerDied","Data":"1c3359e30642ce853f9c95ca75af19ba713fc04c63bcfbc86d7caef113258cb9"} Apr 22 14:27:08.185541 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:08.185510 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" event={"ID":"ff2ccb8a-409b-442c-98fc-8f2417d6463b","Type":"ContainerStarted","Data":"a29f5590f40c96ead24f1b5852ed7f08033fdbe66cc0e74c313006cf8c87a46d"} Apr 22 14:27:08.185942 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:08.185552 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" event={"ID":"ff2ccb8a-409b-442c-98fc-8f2417d6463b","Type":"ContainerStarted","Data":"b3362bdf83307c5d115d0b56ad19480a86eeffa817962c2cd6ca4c6dc23288d6"} Apr 22 14:27:08.185942 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:08.185836 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" Apr 22 14:27:08.187053 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:08.187030 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:5000: connect: connection refused" Apr 22 14:27:08.208874 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:08.208837 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podStartSLOduration=6.2088253 podStartE2EDuration="6.2088253s" podCreationTimestamp="2026-04-22 14:27:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:27:08.208080517 +0000 UTC m=+698.533814932" watchObservedRunningTime="2026-04-22 14:27:08.2088253 +0000 UTC m=+698.534559714" Apr 22 14:27:09.188795 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:09.188756 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" Apr 22 14:27:09.189201 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:09.188867 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:5000: connect: connection refused" Apr 22 14:27:09.189838 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:09.189799 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:27:10.191252 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:10.191215 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:5000: connect: connection refused" Apr 22 14:27:10.191619 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:10.191423 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:27:12.965940 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:12.965899 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 22 14:27:12.966351 ip-10-0-142-195 
kubenswrapper[2576]: I0422 14:27:12.966246 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:27:20.191689 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:20.191648 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:5000: connect: connection refused" Apr 22 14:27:20.192234 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:20.192211 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:27:22.965909 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:22.965870 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 22 14:27:22.966278 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:22.965995 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" Apr 22 14:27:22.966278 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:22.966139 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="agent" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Apr 22 14:27:22.966278 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:22.966265 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" Apr 22 14:27:30.192141 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:30.192100 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:5000: connect: connection refused" Apr 22 14:27:30.192562 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:30.192497 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:27:32.558533 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:32.558512 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" Apr 22 14:27:32.670675 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:32.670649 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cad75876-191d-4433-8f8c-1f65c63ada6e-kserve-provision-location\") pod \"cad75876-191d-4433-8f8c-1f65c63ada6e\" (UID: \"cad75876-191d-4433-8f8c-1f65c63ada6e\") " Apr 22 14:27:32.670862 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:32.670842 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cad75876-191d-4433-8f8c-1f65c63ada6e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cad75876-191d-4433-8f8c-1f65c63ada6e" (UID: "cad75876-191d-4433-8f8c-1f65c63ada6e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:27:32.771362 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:32.771335 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cad75876-191d-4433-8f8c-1f65c63ada6e-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:27:33.260787 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:33.260755 2576 generic.go:358] "Generic (PLEG): container finished" podID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerID="29fc68906bbb7355622f124559481791d1a95fa84f0fe7628fd47e3e951832c5" exitCode=0 Apr 22 14:27:33.261019 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:33.260823 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" event={"ID":"cad75876-191d-4433-8f8c-1f65c63ada6e","Type":"ContainerDied","Data":"29fc68906bbb7355622f124559481791d1a95fa84f0fe7628fd47e3e951832c5"} Apr 22 14:27:33.261019 ip-10-0-142-195 
kubenswrapper[2576]: I0422 14:27:33.260857 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" Apr 22 14:27:33.261019 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:33.260870 2576 scope.go:117] "RemoveContainer" containerID="29fc68906bbb7355622f124559481791d1a95fa84f0fe7628fd47e3e951832c5" Apr 22 14:27:33.261019 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:33.260858 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx" event={"ID":"cad75876-191d-4433-8f8c-1f65c63ada6e","Type":"ContainerDied","Data":"3b2d1feef419d1044e192b4b2f8734daa41092a13d06b7fca7ad2f73159b5f2e"} Apr 22 14:27:33.268350 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:33.268333 2576 scope.go:117] "RemoveContainer" containerID="b694c22cf63b037ab07cdc3965116ad116dead7fd66f66805267d735b097f0e9" Apr 22 14:27:33.274784 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:33.274767 2576 scope.go:117] "RemoveContainer" containerID="6f645a580c54036e3883516feed3777b13ed2fd75aa22d9a77f64d212dd8181f" Apr 22 14:27:33.281058 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:33.281036 2576 scope.go:117] "RemoveContainer" containerID="29fc68906bbb7355622f124559481791d1a95fa84f0fe7628fd47e3e951832c5" Apr 22 14:27:33.281416 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:27:33.281391 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29fc68906bbb7355622f124559481791d1a95fa84f0fe7628fd47e3e951832c5\": container with ID starting with 29fc68906bbb7355622f124559481791d1a95fa84f0fe7628fd47e3e951832c5 not found: ID does not exist" containerID="29fc68906bbb7355622f124559481791d1a95fa84f0fe7628fd47e3e951832c5" Apr 22 14:27:33.281496 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:33.281427 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"29fc68906bbb7355622f124559481791d1a95fa84f0fe7628fd47e3e951832c5"} err="failed to get container status \"29fc68906bbb7355622f124559481791d1a95fa84f0fe7628fd47e3e951832c5\": rpc error: code = NotFound desc = could not find container \"29fc68906bbb7355622f124559481791d1a95fa84f0fe7628fd47e3e951832c5\": container with ID starting with 29fc68906bbb7355622f124559481791d1a95fa84f0fe7628fd47e3e951832c5 not found: ID does not exist" Apr 22 14:27:33.281496 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:33.281449 2576 scope.go:117] "RemoveContainer" containerID="b694c22cf63b037ab07cdc3965116ad116dead7fd66f66805267d735b097f0e9" Apr 22 14:27:33.281702 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:27:33.281681 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b694c22cf63b037ab07cdc3965116ad116dead7fd66f66805267d735b097f0e9\": container with ID starting with b694c22cf63b037ab07cdc3965116ad116dead7fd66f66805267d735b097f0e9 not found: ID does not exist" containerID="b694c22cf63b037ab07cdc3965116ad116dead7fd66f66805267d735b097f0e9" Apr 22 14:27:33.281753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:33.281711 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b694c22cf63b037ab07cdc3965116ad116dead7fd66f66805267d735b097f0e9"} err="failed to get container status \"b694c22cf63b037ab07cdc3965116ad116dead7fd66f66805267d735b097f0e9\": rpc error: code = NotFound desc = could not find container \"b694c22cf63b037ab07cdc3965116ad116dead7fd66f66805267d735b097f0e9\": container with ID starting with b694c22cf63b037ab07cdc3965116ad116dead7fd66f66805267d735b097f0e9 not found: ID does not exist" Apr 22 14:27:33.281753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:33.281731 2576 scope.go:117] "RemoveContainer" containerID="6f645a580c54036e3883516feed3777b13ed2fd75aa22d9a77f64d212dd8181f" Apr 22 14:27:33.282039 ip-10-0-142-195 
kubenswrapper[2576]: E0422 14:27:33.282022 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f645a580c54036e3883516feed3777b13ed2fd75aa22d9a77f64d212dd8181f\": container with ID starting with 6f645a580c54036e3883516feed3777b13ed2fd75aa22d9a77f64d212dd8181f not found: ID does not exist" containerID="6f645a580c54036e3883516feed3777b13ed2fd75aa22d9a77f64d212dd8181f" Apr 22 14:27:33.282094 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:33.282046 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f645a580c54036e3883516feed3777b13ed2fd75aa22d9a77f64d212dd8181f"} err="failed to get container status \"6f645a580c54036e3883516feed3777b13ed2fd75aa22d9a77f64d212dd8181f\": rpc error: code = NotFound desc = could not find container \"6f645a580c54036e3883516feed3777b13ed2fd75aa22d9a77f64d212dd8181f\": container with ID starting with 6f645a580c54036e3883516feed3777b13ed2fd75aa22d9a77f64d212dd8181f not found: ID does not exist" Apr 22 14:27:33.282765 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:33.282746 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx"] Apr 22 14:27:33.286231 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:33.286212 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-tj8xx"] Apr 22 14:27:34.292002 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:34.291977 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" path="/var/lib/kubelet/pods/cad75876-191d-4433-8f8c-1f65c63ada6e/volumes" Apr 22 14:27:40.191463 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:40.191418 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" 
podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:5000: connect: connection refused" Apr 22 14:27:40.191950 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:40.191919 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:27:50.191960 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:50.191904 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:5000: connect: connection refused" Apr 22 14:27:50.192455 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:27:50.192431 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:28:00.191988 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:00.191937 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:5000: connect: connection refused" Apr 22 14:28:00.192453 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:00.192426 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 
503" Apr 22 14:28:10.192185 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:10.192154 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" Apr 22 14:28:10.192750 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:10.192345 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" Apr 22 14:28:17.596724 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:17.596687 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf"] Apr 22 14:28:17.597124 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:17.597033 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="kserve-container" containerID="cri-o://b3362bdf83307c5d115d0b56ad19480a86eeffa817962c2cd6ca4c6dc23288d6" gracePeriod=30 Apr 22 14:28:17.597309 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:17.597096 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="agent" containerID="cri-o://a29f5590f40c96ead24f1b5852ed7f08033fdbe66cc0e74c313006cf8c87a46d" gracePeriod=30 Apr 22 14:28:20.192080 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:20.192035 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:5000: connect: connection refused" Apr 22 14:28:20.192557 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:20.192501 2576 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:28:21.400983 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:21.400913 2576 generic.go:358] "Generic (PLEG): container finished" podID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerID="b3362bdf83307c5d115d0b56ad19480a86eeffa817962c2cd6ca4c6dc23288d6" exitCode=0 Apr 22 14:28:21.401295 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:21.400987 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" event={"ID":"ff2ccb8a-409b-442c-98fc-8f2417d6463b","Type":"ContainerDied","Data":"b3362bdf83307c5d115d0b56ad19480a86eeffa817962c2cd6ca4c6dc23288d6"} Apr 22 14:28:27.711385 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:27.711355 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6"] Apr 22 14:28:27.711732 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:27.711605 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="kserve-container" Apr 22 14:28:27.711732 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:27.711616 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="kserve-container" Apr 22 14:28:27.711732 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:27.711630 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="agent" Apr 22 14:28:27.711732 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:27.711635 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="agent" Apr 22 
14:28:27.711732 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:27.711658 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="storage-initializer" Apr 22 14:28:27.711732 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:27.711667 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="storage-initializer" Apr 22 14:28:27.712010 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:27.711760 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="agent" Apr 22 14:28:27.712010 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:27.711771 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cad75876-191d-4433-8f8c-1f65c63ada6e" containerName="kserve-container" Apr 22 14:28:27.714706 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:27.714691 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" Apr 22 14:28:27.725042 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:27.725021 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6"] Apr 22 14:28:27.736926 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:27.736880 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4440562f-7c3f-4e72-8d2d-0e6a39b81c14-kserve-provision-location\") pod \"isvc-logger-predictor-7db75b5b6d-lpmh6\" (UID: \"4440562f-7c3f-4e72-8d2d-0e6a39b81c14\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" Apr 22 14:28:27.838079 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:27.838057 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4440562f-7c3f-4e72-8d2d-0e6a39b81c14-kserve-provision-location\") pod \"isvc-logger-predictor-7db75b5b6d-lpmh6\" (UID: \"4440562f-7c3f-4e72-8d2d-0e6a39b81c14\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" Apr 22 14:28:27.838356 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:27.838340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4440562f-7c3f-4e72-8d2d-0e6a39b81c14-kserve-provision-location\") pod \"isvc-logger-predictor-7db75b5b6d-lpmh6\" (UID: \"4440562f-7c3f-4e72-8d2d-0e6a39b81c14\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" Apr 22 14:28:28.024110 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:28.024087 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" Apr 22 14:28:28.140061 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:28.140037 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6"] Apr 22 14:28:28.142126 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:28:28.142101 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4440562f_7c3f_4e72_8d2d_0e6a39b81c14.slice/crio-b2c749a2b551da38109657d38a2bdf55c4097c919295ae1c18e5074bbe116b16 WatchSource:0}: Error finding container b2c749a2b551da38109657d38a2bdf55c4097c919295ae1c18e5074bbe116b16: Status 404 returned error can't find the container with id b2c749a2b551da38109657d38a2bdf55c4097c919295ae1c18e5074bbe116b16 Apr 22 14:28:28.421718 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:28.421635 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" event={"ID":"4440562f-7c3f-4e72-8d2d-0e6a39b81c14","Type":"ContainerStarted","Data":"51c75b5b7e123704cdab7238d6e8e611b01b7127647507358f8b9a380ea0a124"} Apr 22 14:28:28.421718 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:28.421672 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" event={"ID":"4440562f-7c3f-4e72-8d2d-0e6a39b81c14","Type":"ContainerStarted","Data":"b2c749a2b551da38109657d38a2bdf55c4097c919295ae1c18e5074bbe116b16"} Apr 22 14:28:30.191369 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:30.191334 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:5000: connect: connection refused" Apr 22 14:28:30.191739 ip-10-0-142-195 kubenswrapper[2576]: 
I0422 14:28:30.191670 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:28:32.433455 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:32.433412 2576 generic.go:358] "Generic (PLEG): container finished" podID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerID="51c75b5b7e123704cdab7238d6e8e611b01b7127647507358f8b9a380ea0a124" exitCode=0 Apr 22 14:28:32.433793 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:32.433486 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" event={"ID":"4440562f-7c3f-4e72-8d2d-0e6a39b81c14","Type":"ContainerDied","Data":"51c75b5b7e123704cdab7238d6e8e611b01b7127647507358f8b9a380ea0a124"} Apr 22 14:28:33.438328 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:33.438292 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" event={"ID":"4440562f-7c3f-4e72-8d2d-0e6a39b81c14","Type":"ContainerStarted","Data":"f271e4dc8e161bc86fcd8b67db67077e940cc8562e7bb6890bf5a87db9652b8b"} Apr 22 14:28:33.438328 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:33.438330 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" event={"ID":"4440562f-7c3f-4e72-8d2d-0e6a39b81c14","Type":"ContainerStarted","Data":"5f67cc47fababb4ee257940f9bade0f2ebe28f8f6423123bf504d3248bb66665"} Apr 22 14:28:33.438852 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:33.438605 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" Apr 22 14:28:33.438852 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:33.438629 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" Apr 22 14:28:33.439783 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:33.439745 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 22 14:28:33.440489 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:33.440447 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:28:33.458380 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:33.458238 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podStartSLOduration=6.458224178 podStartE2EDuration="6.458224178s" podCreationTimestamp="2026-04-22 14:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:28:33.457768008 +0000 UTC m=+783.783502417" watchObservedRunningTime="2026-04-22 14:28:33.458224178 +0000 UTC m=+783.783958597" Apr 22 14:28:34.441848 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:34.441801 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 22 14:28:34.442267 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:34.442153 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" 
podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:28:40.191924 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:40.191782 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:5000: connect: connection refused" Apr 22 14:28:40.192438 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:40.191977 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" Apr 22 14:28:40.192438 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:40.192143 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:28:40.192438 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:40.192261 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" Apr 22 14:28:44.441835 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:44.441766 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 22 14:28:44.442318 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:44.442287 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:28:47.730839 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:47.730796 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" Apr 22 14:28:47.765455 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:47.765429 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff2ccb8a-409b-442c-98fc-8f2417d6463b-kserve-provision-location\") pod \"ff2ccb8a-409b-442c-98fc-8f2417d6463b\" (UID: \"ff2ccb8a-409b-442c-98fc-8f2417d6463b\") " Apr 22 14:28:47.765699 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:47.765679 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff2ccb8a-409b-442c-98fc-8f2417d6463b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ff2ccb8a-409b-442c-98fc-8f2417d6463b" (UID: "ff2ccb8a-409b-442c-98fc-8f2417d6463b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:28:47.866773 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:47.866725 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff2ccb8a-409b-442c-98fc-8f2417d6463b-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:28:48.484273 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:48.484241 2576 generic.go:358] "Generic (PLEG): container finished" podID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerID="a29f5590f40c96ead24f1b5852ed7f08033fdbe66cc0e74c313006cf8c87a46d" exitCode=0 Apr 22 14:28:48.484353 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:48.484280 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" event={"ID":"ff2ccb8a-409b-442c-98fc-8f2417d6463b","Type":"ContainerDied","Data":"a29f5590f40c96ead24f1b5852ed7f08033fdbe66cc0e74c313006cf8c87a46d"} Apr 22 14:28:48.484353 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:48.484309 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" event={"ID":"ff2ccb8a-409b-442c-98fc-8f2417d6463b","Type":"ContainerDied","Data":"aaf008e5e322a526bf891cf34c8687df27e809e74c2044793588b241db33960d"} Apr 22 14:28:48.484353 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:48.484329 2576 scope.go:117] "RemoveContainer" containerID="a29f5590f40c96ead24f1b5852ed7f08033fdbe66cc0e74c313006cf8c87a46d" Apr 22 14:28:48.484353 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:48.484339 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf" Apr 22 14:28:48.491447 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:48.491429 2576 scope.go:117] "RemoveContainer" containerID="b3362bdf83307c5d115d0b56ad19480a86eeffa817962c2cd6ca4c6dc23288d6" Apr 22 14:28:48.497703 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:48.497677 2576 scope.go:117] "RemoveContainer" containerID="1c3359e30642ce853f9c95ca75af19ba713fc04c63bcfbc86d7caef113258cb9" Apr 22 14:28:48.502112 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:48.502091 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf"] Apr 22 14:28:48.504917 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:48.504896 2576 scope.go:117] "RemoveContainer" containerID="a29f5590f40c96ead24f1b5852ed7f08033fdbe66cc0e74c313006cf8c87a46d" Apr 22 14:28:48.505025 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:48.505009 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-sx2cf"] Apr 22 14:28:48.505139 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:28:48.505124 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a29f5590f40c96ead24f1b5852ed7f08033fdbe66cc0e74c313006cf8c87a46d\": container with ID starting with a29f5590f40c96ead24f1b5852ed7f08033fdbe66cc0e74c313006cf8c87a46d not found: ID does not exist" containerID="a29f5590f40c96ead24f1b5852ed7f08033fdbe66cc0e74c313006cf8c87a46d" Apr 22 14:28:48.505201 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:48.505147 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29f5590f40c96ead24f1b5852ed7f08033fdbe66cc0e74c313006cf8c87a46d"} err="failed to get container status \"a29f5590f40c96ead24f1b5852ed7f08033fdbe66cc0e74c313006cf8c87a46d\": rpc error: code = 
NotFound desc = could not find container \"a29f5590f40c96ead24f1b5852ed7f08033fdbe66cc0e74c313006cf8c87a46d\": container with ID starting with a29f5590f40c96ead24f1b5852ed7f08033fdbe66cc0e74c313006cf8c87a46d not found: ID does not exist" Apr 22 14:28:48.505201 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:48.505167 2576 scope.go:117] "RemoveContainer" containerID="b3362bdf83307c5d115d0b56ad19480a86eeffa817962c2cd6ca4c6dc23288d6" Apr 22 14:28:48.505372 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:28:48.505358 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3362bdf83307c5d115d0b56ad19480a86eeffa817962c2cd6ca4c6dc23288d6\": container with ID starting with b3362bdf83307c5d115d0b56ad19480a86eeffa817962c2cd6ca4c6dc23288d6 not found: ID does not exist" containerID="b3362bdf83307c5d115d0b56ad19480a86eeffa817962c2cd6ca4c6dc23288d6" Apr 22 14:28:48.505422 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:48.505387 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3362bdf83307c5d115d0b56ad19480a86eeffa817962c2cd6ca4c6dc23288d6"} err="failed to get container status \"b3362bdf83307c5d115d0b56ad19480a86eeffa817962c2cd6ca4c6dc23288d6\": rpc error: code = NotFound desc = could not find container \"b3362bdf83307c5d115d0b56ad19480a86eeffa817962c2cd6ca4c6dc23288d6\": container with ID starting with b3362bdf83307c5d115d0b56ad19480a86eeffa817962c2cd6ca4c6dc23288d6 not found: ID does not exist" Apr 22 14:28:48.505422 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:48.505401 2576 scope.go:117] "RemoveContainer" containerID="1c3359e30642ce853f9c95ca75af19ba713fc04c63bcfbc86d7caef113258cb9" Apr 22 14:28:48.505609 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:28:48.505591 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1c3359e30642ce853f9c95ca75af19ba713fc04c63bcfbc86d7caef113258cb9\": container with ID starting with 1c3359e30642ce853f9c95ca75af19ba713fc04c63bcfbc86d7caef113258cb9 not found: ID does not exist" containerID="1c3359e30642ce853f9c95ca75af19ba713fc04c63bcfbc86d7caef113258cb9" Apr 22 14:28:48.505652 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:48.505615 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c3359e30642ce853f9c95ca75af19ba713fc04c63bcfbc86d7caef113258cb9"} err="failed to get container status \"1c3359e30642ce853f9c95ca75af19ba713fc04c63bcfbc86d7caef113258cb9\": rpc error: code = NotFound desc = could not find container \"1c3359e30642ce853f9c95ca75af19ba713fc04c63bcfbc86d7caef113258cb9\": container with ID starting with 1c3359e30642ce853f9c95ca75af19ba713fc04c63bcfbc86d7caef113258cb9 not found: ID does not exist" Apr 22 14:28:50.291545 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:50.291513 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" path="/var/lib/kubelet/pods/ff2ccb8a-409b-442c-98fc-8f2417d6463b/volumes" Apr 22 14:28:54.441831 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:54.441766 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 22 14:28:54.442338 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:28:54.442305 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:29:04.442267 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:04.442219 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 22 14:29:04.442770 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:04.442744 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:29:14.442094 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:14.442036 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 22 14:29:14.442658 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:14.442491 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:29:24.442302 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:24.442246 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 22 14:29:24.442776 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:24.442647 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="agent" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Apr 22 14:29:34.442685 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:34.442644 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" Apr 22 14:29:34.443221 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:34.442766 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" Apr 22 14:29:42.961784 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:42.961758 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6"] Apr 22 14:29:42.962132 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:42.962043 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="kserve-container" containerID="cri-o://5f67cc47fababb4ee257940f9bade0f2ebe28f8f6423123bf504d3248bb66665" gracePeriod=30 Apr 22 14:29:42.962171 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:42.962133 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="agent" containerID="cri-o://f271e4dc8e161bc86fcd8b67db67077e940cc8562e7bb6890bf5a87db9652b8b" gracePeriod=30 Apr 22 14:29:43.000768 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.000748 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk"] Apr 22 14:29:43.001026 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.001014 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="agent" Apr 22 14:29:43.001068 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.001028 2576 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="agent" Apr 22 14:29:43.001068 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.001039 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="kserve-container" Apr 22 14:29:43.001068 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.001044 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="kserve-container" Apr 22 14:29:43.001068 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.001054 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="storage-initializer" Apr 22 14:29:43.001068 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.001059 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="storage-initializer" Apr 22 14:29:43.001208 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.001099 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="kserve-container" Apr 22 14:29:43.001208 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.001110 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff2ccb8a-409b-442c-98fc-8f2417d6463b" containerName="agent" Apr 22 14:29:43.003845 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.003803 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" Apr 22 14:29:43.020129 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.020109 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk"] Apr 22 14:29:43.116484 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.116454 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89c658ca-70a9-4cc2-b34d-71aca3e32f33-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-4bshk\" (UID: \"89c658ca-70a9-4cc2-b34d-71aca3e32f33\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" Apr 22 14:29:43.217762 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.217713 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89c658ca-70a9-4cc2-b34d-71aca3e32f33-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-4bshk\" (UID: \"89c658ca-70a9-4cc2-b34d-71aca3e32f33\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" Apr 22 14:29:43.218030 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.218014 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89c658ca-70a9-4cc2-b34d-71aca3e32f33-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-4bshk\" (UID: \"89c658ca-70a9-4cc2-b34d-71aca3e32f33\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" Apr 22 14:29:43.313241 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.313219 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" Apr 22 14:29:43.422049 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.422027 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk"] Apr 22 14:29:43.424695 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:29:43.424669 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89c658ca_70a9_4cc2_b34d_71aca3e32f33.slice/crio-dc485738d6dfc1cf16c77966f4efb73826fb81e3b2c0394dffd91cd8fbbc5b4e WatchSource:0}: Error finding container dc485738d6dfc1cf16c77966f4efb73826fb81e3b2c0394dffd91cd8fbbc5b4e: Status 404 returned error can't find the container with id dc485738d6dfc1cf16c77966f4efb73826fb81e3b2c0394dffd91cd8fbbc5b4e Apr 22 14:29:43.639449 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.639419 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" event={"ID":"89c658ca-70a9-4cc2-b34d-71aca3e32f33","Type":"ContainerStarted","Data":"306048b2c9df9469880b0f66f6ae42d7557e927b1009c322e24574d0b03453c4"} Apr 22 14:29:43.639449 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:43.639450 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" event={"ID":"89c658ca-70a9-4cc2-b34d-71aca3e32f33","Type":"ContainerStarted","Data":"dc485738d6dfc1cf16c77966f4efb73826fb81e3b2c0394dffd91cd8fbbc5b4e"} Apr 22 14:29:44.442186 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:44.442149 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 22 14:29:44.442586 ip-10-0-142-195 kubenswrapper[2576]: I0422 
14:29:44.442473 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:29:46.648626 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:46.648599 2576 generic.go:358] "Generic (PLEG): container finished" podID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerID="5f67cc47fababb4ee257940f9bade0f2ebe28f8f6423123bf504d3248bb66665" exitCode=0 Apr 22 14:29:46.648932 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:46.648665 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" event={"ID":"4440562f-7c3f-4e72-8d2d-0e6a39b81c14","Type":"ContainerDied","Data":"5f67cc47fababb4ee257940f9bade0f2ebe28f8f6423123bf504d3248bb66665"} Apr 22 14:29:47.652930 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:47.652897 2576 generic.go:358] "Generic (PLEG): container finished" podID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" containerID="306048b2c9df9469880b0f66f6ae42d7557e927b1009c322e24574d0b03453c4" exitCode=0 Apr 22 14:29:47.653259 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:47.652973 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" event={"ID":"89c658ca-70a9-4cc2-b34d-71aca3e32f33","Type":"ContainerDied","Data":"306048b2c9df9469880b0f66f6ae42d7557e927b1009c322e24574d0b03453c4"} Apr 22 14:29:54.441960 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:54.441915 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 22 14:29:54.442497 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:54.442463 2576 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:29:56.678640 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:56.678609 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" event={"ID":"89c658ca-70a9-4cc2-b34d-71aca3e32f33","Type":"ContainerStarted","Data":"65cbd1e376702a1ead05df161400a4359364383bc5c9281fb779f05888905b79"} Apr 22 14:29:56.679008 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:56.678878 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" Apr 22 14:29:56.679986 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:56.679957 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" podUID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 22 14:29:56.697105 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:56.697064 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" podStartSLOduration=6.523979034 podStartE2EDuration="14.697051369s" podCreationTimestamp="2026-04-22 14:29:42 +0000 UTC" firstStartedPulling="2026-04-22 14:29:47.654259232 +0000 UTC m=+857.979993625" lastFinishedPulling="2026-04-22 14:29:55.827331568 +0000 UTC m=+866.153065960" observedRunningTime="2026-04-22 14:29:56.695851522 +0000 UTC m=+867.021585935" watchObservedRunningTime="2026-04-22 14:29:56.697051369 +0000 UTC m=+867.022785782" Apr 22 14:29:57.682229 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:29:57.682192 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" podUID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 22 14:30:04.442699 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:04.442651 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 22 14:30:04.443241 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:04.442825 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" Apr 22 14:30:04.443241 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:04.443045 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:30:04.443241 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:04.443118 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" Apr 22 14:30:07.682930 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:07.682885 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" podUID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 22 14:30:13.108460 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.108441 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" Apr 22 14:30:13.226780 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.226754 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4440562f-7c3f-4e72-8d2d-0e6a39b81c14-kserve-provision-location\") pod \"4440562f-7c3f-4e72-8d2d-0e6a39b81c14\" (UID: \"4440562f-7c3f-4e72-8d2d-0e6a39b81c14\") " Apr 22 14:30:13.227050 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.227026 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4440562f-7c3f-4e72-8d2d-0e6a39b81c14-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4440562f-7c3f-4e72-8d2d-0e6a39b81c14" (UID: "4440562f-7c3f-4e72-8d2d-0e6a39b81c14"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:30:13.327184 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.327163 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4440562f-7c3f-4e72-8d2d-0e6a39b81c14-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:30:13.731146 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.731110 2576 generic.go:358] "Generic (PLEG): container finished" podID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerID="f271e4dc8e161bc86fcd8b67db67077e940cc8562e7bb6890bf5a87db9652b8b" exitCode=137 Apr 22 14:30:13.731268 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.731183 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" Apr 22 14:30:13.731268 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.731185 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" event={"ID":"4440562f-7c3f-4e72-8d2d-0e6a39b81c14","Type":"ContainerDied","Data":"f271e4dc8e161bc86fcd8b67db67077e940cc8562e7bb6890bf5a87db9652b8b"} Apr 22 14:30:13.731268 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.731222 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6" event={"ID":"4440562f-7c3f-4e72-8d2d-0e6a39b81c14","Type":"ContainerDied","Data":"b2c749a2b551da38109657d38a2bdf55c4097c919295ae1c18e5074bbe116b16"} Apr 22 14:30:13.731268 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.731239 2576 scope.go:117] "RemoveContainer" containerID="f271e4dc8e161bc86fcd8b67db67077e940cc8562e7bb6890bf5a87db9652b8b" Apr 22 14:30:13.739627 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.739603 2576 scope.go:117] "RemoveContainer" containerID="5f67cc47fababb4ee257940f9bade0f2ebe28f8f6423123bf504d3248bb66665" Apr 22 14:30:13.746280 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.746260 2576 scope.go:117] "RemoveContainer" containerID="51c75b5b7e123704cdab7238d6e8e611b01b7127647507358f8b9a380ea0a124" Apr 22 14:30:13.753161 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.753128 2576 scope.go:117] "RemoveContainer" containerID="f271e4dc8e161bc86fcd8b67db67077e940cc8562e7bb6890bf5a87db9652b8b" Apr 22 14:30:13.753472 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:30:13.753450 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f271e4dc8e161bc86fcd8b67db67077e940cc8562e7bb6890bf5a87db9652b8b\": container with ID starting with f271e4dc8e161bc86fcd8b67db67077e940cc8562e7bb6890bf5a87db9652b8b not found: ID does not exist" 
containerID="f271e4dc8e161bc86fcd8b67db67077e940cc8562e7bb6890bf5a87db9652b8b" Apr 22 14:30:13.753540 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.753479 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f271e4dc8e161bc86fcd8b67db67077e940cc8562e7bb6890bf5a87db9652b8b"} err="failed to get container status \"f271e4dc8e161bc86fcd8b67db67077e940cc8562e7bb6890bf5a87db9652b8b\": rpc error: code = NotFound desc = could not find container \"f271e4dc8e161bc86fcd8b67db67077e940cc8562e7bb6890bf5a87db9652b8b\": container with ID starting with f271e4dc8e161bc86fcd8b67db67077e940cc8562e7bb6890bf5a87db9652b8b not found: ID does not exist" Apr 22 14:30:13.753540 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.753505 2576 scope.go:117] "RemoveContainer" containerID="5f67cc47fababb4ee257940f9bade0f2ebe28f8f6423123bf504d3248bb66665" Apr 22 14:30:13.753745 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:30:13.753725 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f67cc47fababb4ee257940f9bade0f2ebe28f8f6423123bf504d3248bb66665\": container with ID starting with 5f67cc47fababb4ee257940f9bade0f2ebe28f8f6423123bf504d3248bb66665 not found: ID does not exist" containerID="5f67cc47fababb4ee257940f9bade0f2ebe28f8f6423123bf504d3248bb66665" Apr 22 14:30:13.753840 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.753755 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f67cc47fababb4ee257940f9bade0f2ebe28f8f6423123bf504d3248bb66665"} err="failed to get container status \"5f67cc47fababb4ee257940f9bade0f2ebe28f8f6423123bf504d3248bb66665\": rpc error: code = NotFound desc = could not find container \"5f67cc47fababb4ee257940f9bade0f2ebe28f8f6423123bf504d3248bb66665\": container with ID starting with 5f67cc47fababb4ee257940f9bade0f2ebe28f8f6423123bf504d3248bb66665 not found: ID does not exist" Apr 22 
14:30:13.753840 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.753777 2576 scope.go:117] "RemoveContainer" containerID="51c75b5b7e123704cdab7238d6e8e611b01b7127647507358f8b9a380ea0a124" Apr 22 14:30:13.754068 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:30:13.754051 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c75b5b7e123704cdab7238d6e8e611b01b7127647507358f8b9a380ea0a124\": container with ID starting with 51c75b5b7e123704cdab7238d6e8e611b01b7127647507358f8b9a380ea0a124 not found: ID does not exist" containerID="51c75b5b7e123704cdab7238d6e8e611b01b7127647507358f8b9a380ea0a124" Apr 22 14:30:13.754111 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.754075 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6"] Apr 22 14:30:13.754111 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.754072 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c75b5b7e123704cdab7238d6e8e611b01b7127647507358f8b9a380ea0a124"} err="failed to get container status \"51c75b5b7e123704cdab7238d6e8e611b01b7127647507358f8b9a380ea0a124\": rpc error: code = NotFound desc = could not find container \"51c75b5b7e123704cdab7238d6e8e611b01b7127647507358f8b9a380ea0a124\": container with ID starting with 51c75b5b7e123704cdab7238d6e8e611b01b7127647507358f8b9a380ea0a124 not found: ID does not exist" Apr 22 14:30:13.757641 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:13.757619 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-lpmh6"] Apr 22 14:30:14.291562 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:14.291534 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" path="/var/lib/kubelet/pods/4440562f-7c3f-4e72-8d2d-0e6a39b81c14/volumes" Apr 22 14:30:17.682651 ip-10-0-142-195 
kubenswrapper[2576]: I0422 14:30:17.682613 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" podUID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 22 14:30:27.682600 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:27.682563 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" podUID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 22 14:30:30.941522 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:30.941495 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 14:30:30.942144 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:30.942123 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 14:30:37.683059 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:37.683017 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" podUID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 22 14:30:47.682734 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:47.682692 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" podUID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 22 14:30:57.682616 ip-10-0-142-195 
kubenswrapper[2576]: I0422 14:30:57.682579 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" podUID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 22 14:30:58.288980 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:30:58.288945 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" podUID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 22 14:31:08.291800 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:08.291770 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" Apr 22 14:31:13.155734 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.155708 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk"] Apr 22 14:31:13.156995 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.155970 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" podUID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" containerName="kserve-container" containerID="cri-o://65cbd1e376702a1ead05df161400a4359364383bc5c9281fb779f05888905b79" gracePeriod=30 Apr 22 14:31:13.239662 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.239634 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8"] Apr 22 14:31:13.239982 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.239968 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="agent" Apr 22 14:31:13.240066 
ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.239985 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="agent" Apr 22 14:31:13.240066 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.240003 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="storage-initializer" Apr 22 14:31:13.240066 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.240011 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="storage-initializer" Apr 22 14:31:13.240066 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.240021 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="kserve-container" Apr 22 14:31:13.240066 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.240033 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="kserve-container" Apr 22 14:31:13.240272 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.240144 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="kserve-container" Apr 22 14:31:13.240272 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.240162 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4440562f-7c3f-4e72-8d2d-0e6a39b81c14" containerName="agent" Apr 22 14:31:13.242074 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.242055 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" Apr 22 14:31:13.250316 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.250292 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8"] Apr 22 14:31:13.320157 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.320131 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bee471a-52ae-4181-93bb-bd953988823e-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8\" (UID: \"6bee471a-52ae-4181-93bb-bd953988823e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" Apr 22 14:31:13.421054 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.420995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bee471a-52ae-4181-93bb-bd953988823e-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8\" (UID: \"6bee471a-52ae-4181-93bb-bd953988823e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" Apr 22 14:31:13.421325 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.421308 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bee471a-52ae-4181-93bb-bd953988823e-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8\" (UID: \"6bee471a-52ae-4181-93bb-bd953988823e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" Apr 22 14:31:13.552096 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.552071 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" Apr 22 14:31:13.659461 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.659438 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8"] Apr 22 14:31:13.661455 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:31:13.661429 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bee471a_52ae_4181_93bb_bd953988823e.slice/crio-e48b1d50848703c43e2920625e862348a4759f278cc3c165b20ad1694d614bed WatchSource:0}: Error finding container e48b1d50848703c43e2920625e862348a4759f278cc3c165b20ad1694d614bed: Status 404 returned error can't find the container with id e48b1d50848703c43e2920625e862348a4759f278cc3c165b20ad1694d614bed Apr 22 14:31:13.894756 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.894716 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" event={"ID":"6bee471a-52ae-4181-93bb-bd953988823e","Type":"ContainerStarted","Data":"c4d03f86d573c6232e83b3da97cf07752f0e26028d2913bf426344327e41e35b"} Apr 22 14:31:13.894756 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:13.894750 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" event={"ID":"6bee471a-52ae-4181-93bb-bd953988823e","Type":"ContainerStarted","Data":"e48b1d50848703c43e2920625e862348a4759f278cc3c165b20ad1694d614bed"} Apr 22 14:31:17.084841 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.084803 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk"
Apr 22 14:31:17.149520 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.149496 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89c658ca-70a9-4cc2-b34d-71aca3e32f33-kserve-provision-location\") pod \"89c658ca-70a9-4cc2-b34d-71aca3e32f33\" (UID: \"89c658ca-70a9-4cc2-b34d-71aca3e32f33\") "
Apr 22 14:31:17.149779 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.149758 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89c658ca-70a9-4cc2-b34d-71aca3e32f33-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "89c658ca-70a9-4cc2-b34d-71aca3e32f33" (UID: "89c658ca-70a9-4cc2-b34d-71aca3e32f33"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:31:17.250337 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.250315 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89c658ca-70a9-4cc2-b34d-71aca3e32f33-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\""
Apr 22 14:31:17.908407 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.908372 2576 generic.go:358] "Generic (PLEG): container finished" podID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" containerID="65cbd1e376702a1ead05df161400a4359364383bc5c9281fb779f05888905b79" exitCode=0
Apr 22 14:31:17.908604 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.908457 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk"
Apr 22 14:31:17.908604 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.908456 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" event={"ID":"89c658ca-70a9-4cc2-b34d-71aca3e32f33","Type":"ContainerDied","Data":"65cbd1e376702a1ead05df161400a4359364383bc5c9281fb779f05888905b79"}
Apr 22 14:31:17.908604 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.908506 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk" event={"ID":"89c658ca-70a9-4cc2-b34d-71aca3e32f33","Type":"ContainerDied","Data":"dc485738d6dfc1cf16c77966f4efb73826fb81e3b2c0394dffd91cd8fbbc5b4e"}
Apr 22 14:31:17.908604 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.908531 2576 scope.go:117] "RemoveContainer" containerID="65cbd1e376702a1ead05df161400a4359364383bc5c9281fb779f05888905b79"
Apr 22 14:31:17.910078 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.910052 2576 generic.go:358] "Generic (PLEG): container finished" podID="6bee471a-52ae-4181-93bb-bd953988823e" containerID="c4d03f86d573c6232e83b3da97cf07752f0e26028d2913bf426344327e41e35b" exitCode=0
Apr 22 14:31:17.910191 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.910135 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" event={"ID":"6bee471a-52ae-4181-93bb-bd953988823e","Type":"ContainerDied","Data":"c4d03f86d573c6232e83b3da97cf07752f0e26028d2913bf426344327e41e35b"}
Apr 22 14:31:17.917976 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.917709 2576 scope.go:117] "RemoveContainer" containerID="306048b2c9df9469880b0f66f6ae42d7557e927b1009c322e24574d0b03453c4"
Apr 22 14:31:17.925115 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.925053 2576 scope.go:117] "RemoveContainer" containerID="65cbd1e376702a1ead05df161400a4359364383bc5c9281fb779f05888905b79"
Apr 22 14:31:17.925347 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:31:17.925329 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65cbd1e376702a1ead05df161400a4359364383bc5c9281fb779f05888905b79\": container with ID starting with 65cbd1e376702a1ead05df161400a4359364383bc5c9281fb779f05888905b79 not found: ID does not exist" containerID="65cbd1e376702a1ead05df161400a4359364383bc5c9281fb779f05888905b79"
Apr 22 14:31:17.925411 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.925355 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65cbd1e376702a1ead05df161400a4359364383bc5c9281fb779f05888905b79"} err="failed to get container status \"65cbd1e376702a1ead05df161400a4359364383bc5c9281fb779f05888905b79\": rpc error: code = NotFound desc = could not find container \"65cbd1e376702a1ead05df161400a4359364383bc5c9281fb779f05888905b79\": container with ID starting with 65cbd1e376702a1ead05df161400a4359364383bc5c9281fb779f05888905b79 not found: ID does not exist"
Apr 22 14:31:17.925411 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.925390 2576 scope.go:117] "RemoveContainer" containerID="306048b2c9df9469880b0f66f6ae42d7557e927b1009c322e24574d0b03453c4"
Apr 22 14:31:17.925657 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:31:17.925642 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"306048b2c9df9469880b0f66f6ae42d7557e927b1009c322e24574d0b03453c4\": container with ID starting with 306048b2c9df9469880b0f66f6ae42d7557e927b1009c322e24574d0b03453c4 not found: ID does not exist" containerID="306048b2c9df9469880b0f66f6ae42d7557e927b1009c322e24574d0b03453c4"
Apr 22 14:31:17.925701 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.925660 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"306048b2c9df9469880b0f66f6ae42d7557e927b1009c322e24574d0b03453c4"} err="failed to get container status \"306048b2c9df9469880b0f66f6ae42d7557e927b1009c322e24574d0b03453c4\": rpc error: code = NotFound desc = could not find container \"306048b2c9df9469880b0f66f6ae42d7557e927b1009c322e24574d0b03453c4\": container with ID starting with 306048b2c9df9469880b0f66f6ae42d7557e927b1009c322e24574d0b03453c4 not found: ID does not exist"
Apr 22 14:31:17.943509 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.943485 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk"]
Apr 22 14:31:17.947318 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:17.947296 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-4bshk"]
Apr 22 14:31:18.291586 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:18.291560 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" path="/var/lib/kubelet/pods/89c658ca-70a9-4cc2-b34d-71aca3e32f33/volumes"
Apr 22 14:31:18.915782 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:18.915753 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" event={"ID":"6bee471a-52ae-4181-93bb-bd953988823e","Type":"ContainerStarted","Data":"38fa34c6a96be3dad0bf08872c3b22cc60ec9968168a9b1935b8835b8ec2a717"}
Apr 22 14:31:18.916088 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:18.916065 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8"
Apr 22 14:31:18.917331 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:18.917303 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" podUID="6bee471a-52ae-4181-93bb-bd953988823e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 22 14:31:18.932973 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:18.932919 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" podStartSLOduration=5.932901185 podStartE2EDuration="5.932901185s" podCreationTimestamp="2026-04-22 14:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:31:18.93142036 +0000 UTC m=+949.257154793" watchObservedRunningTime="2026-04-22 14:31:18.932901185 +0000 UTC m=+949.258635601"
Apr 22 14:31:19.918751 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:19.918723 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" podUID="6bee471a-52ae-4181-93bb-bd953988823e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 22 14:31:29.919500 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:29.919454 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" podUID="6bee471a-52ae-4181-93bb-bd953988823e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 22 14:31:39.919885 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:39.919771 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" podUID="6bee471a-52ae-4181-93bb-bd953988823e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 22 14:31:49.919337 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:49.919290 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" podUID="6bee471a-52ae-4181-93bb-bd953988823e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 22 14:31:59.918940 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:31:59.918895 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" podUID="6bee471a-52ae-4181-93bb-bd953988823e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 22 14:32:09.919424 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:09.919376 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" podUID="6bee471a-52ae-4181-93bb-bd953988823e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 22 14:32:19.919178 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:19.919137 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" podUID="6bee471a-52ae-4181-93bb-bd953988823e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 22 14:32:26.293453 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:26.293423 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8"
Apr 22 14:32:33.464900 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:33.464867 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8"]
Apr 22 14:32:33.465254 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:33.465085 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" podUID="6bee471a-52ae-4181-93bb-bd953988823e" containerName="kserve-container" containerID="cri-o://38fa34c6a96be3dad0bf08872c3b22cc60ec9968168a9b1935b8835b8ec2a717" gracePeriod=30
Apr 22 14:32:33.535701 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:33.535675 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb"]
Apr 22 14:32:33.535955 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:33.535941 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" containerName="storage-initializer"
Apr 22 14:32:33.536002 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:33.535958 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" containerName="storage-initializer"
Apr 22 14:32:33.536002 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:33.535974 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" containerName="kserve-container"
Apr 22 14:32:33.536002 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:33.535981 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" containerName="kserve-container"
Apr 22 14:32:33.536092 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:33.536054 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="89c658ca-70a9-4cc2-b34d-71aca3e32f33" containerName="kserve-container"
Apr 22 14:32:33.538741 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:33.538727 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb"
Apr 22 14:32:33.545426 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:33.545407 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb"]
Apr 22 14:32:33.657803 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:33.657776 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e023b377-12a1-45e8-af89-d6b4351ea563-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb\" (UID: \"e023b377-12a1-45e8-af89-d6b4351ea563\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb"
Apr 22 14:32:33.758665 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:33.758639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e023b377-12a1-45e8-af89-d6b4351ea563-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb\" (UID: \"e023b377-12a1-45e8-af89-d6b4351ea563\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb"
Apr 22 14:32:33.758983 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:33.758968 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e023b377-12a1-45e8-af89-d6b4351ea563-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb\" (UID: \"e023b377-12a1-45e8-af89-d6b4351ea563\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb"
Apr 22 14:32:33.848056 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:33.848033 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb"
Apr 22 14:32:33.958437 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:33.958414 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb"]
Apr 22 14:32:33.959986 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:32:33.959959 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode023b377_12a1_45e8_af89_d6b4351ea563.slice/crio-c02bce5882ffb3a95bfaa3b96fab436b2d8a936bf54a4844baf98243f561fbd8 WatchSource:0}: Error finding container c02bce5882ffb3a95bfaa3b96fab436b2d8a936bf54a4844baf98243f561fbd8: Status 404 returned error can't find the container with id c02bce5882ffb3a95bfaa3b96fab436b2d8a936bf54a4844baf98243f561fbd8
Apr 22 14:32:33.961674 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:33.961655 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:32:34.126902 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:34.126831 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb" event={"ID":"e023b377-12a1-45e8-af89-d6b4351ea563","Type":"ContainerStarted","Data":"9e3cd25ccca0b223e6208db7fbb2d14a930ede9584dbb193535469a0ce442f79"}
Apr 22 14:32:34.126902 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:34.126865 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb" event={"ID":"e023b377-12a1-45e8-af89-d6b4351ea563","Type":"ContainerStarted","Data":"c02bce5882ffb3a95bfaa3b96fab436b2d8a936bf54a4844baf98243f561fbd8"}
Apr 22 14:32:36.289635 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:36.289587 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" podUID="6bee471a-52ae-4181-93bb-bd953988823e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 22 14:32:37.399675 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:37.399655 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8"
Apr 22 14:32:37.484997 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:37.484971 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bee471a-52ae-4181-93bb-bd953988823e-kserve-provision-location\") pod \"6bee471a-52ae-4181-93bb-bd953988823e\" (UID: \"6bee471a-52ae-4181-93bb-bd953988823e\") "
Apr 22 14:32:37.485280 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:37.485236 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bee471a-52ae-4181-93bb-bd953988823e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6bee471a-52ae-4181-93bb-bd953988823e" (UID: "6bee471a-52ae-4181-93bb-bd953988823e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:32:37.586155 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:37.586132 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bee471a-52ae-4181-93bb-bd953988823e-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\""
Apr 22 14:32:38.139082 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:38.139049 2576 generic.go:358] "Generic (PLEG): container finished" podID="e023b377-12a1-45e8-af89-d6b4351ea563" containerID="9e3cd25ccca0b223e6208db7fbb2d14a930ede9584dbb193535469a0ce442f79" exitCode=0
Apr 22 14:32:38.139225 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:38.139126 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb" event={"ID":"e023b377-12a1-45e8-af89-d6b4351ea563","Type":"ContainerDied","Data":"9e3cd25ccca0b223e6208db7fbb2d14a930ede9584dbb193535469a0ce442f79"}
Apr 22 14:32:38.140735 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:38.140710 2576 generic.go:358] "Generic (PLEG): container finished" podID="6bee471a-52ae-4181-93bb-bd953988823e" containerID="38fa34c6a96be3dad0bf08872c3b22cc60ec9968168a9b1935b8835b8ec2a717" exitCode=0
Apr 22 14:32:38.140856 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:38.140740 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" event={"ID":"6bee471a-52ae-4181-93bb-bd953988823e","Type":"ContainerDied","Data":"38fa34c6a96be3dad0bf08872c3b22cc60ec9968168a9b1935b8835b8ec2a717"}
Apr 22 14:32:38.140856 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:38.140777 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8" event={"ID":"6bee471a-52ae-4181-93bb-bd953988823e","Type":"ContainerDied","Data":"e48b1d50848703c43e2920625e862348a4759f278cc3c165b20ad1694d614bed"}
Apr 22 14:32:38.140856 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:38.140827 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8"
Apr 22 14:32:38.140967 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:38.140827 2576 scope.go:117] "RemoveContainer" containerID="38fa34c6a96be3dad0bf08872c3b22cc60ec9968168a9b1935b8835b8ec2a717"
Apr 22 14:32:38.149996 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:38.149976 2576 scope.go:117] "RemoveContainer" containerID="c4d03f86d573c6232e83b3da97cf07752f0e26028d2913bf426344327e41e35b"
Apr 22 14:32:38.157514 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:38.157496 2576 scope.go:117] "RemoveContainer" containerID="38fa34c6a96be3dad0bf08872c3b22cc60ec9968168a9b1935b8835b8ec2a717"
Apr 22 14:32:38.157802 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:32:38.157780 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38fa34c6a96be3dad0bf08872c3b22cc60ec9968168a9b1935b8835b8ec2a717\": container with ID starting with 38fa34c6a96be3dad0bf08872c3b22cc60ec9968168a9b1935b8835b8ec2a717 not found: ID does not exist" containerID="38fa34c6a96be3dad0bf08872c3b22cc60ec9968168a9b1935b8835b8ec2a717"
Apr 22 14:32:38.157870 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:38.157826 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38fa34c6a96be3dad0bf08872c3b22cc60ec9968168a9b1935b8835b8ec2a717"} err="failed to get container status \"38fa34c6a96be3dad0bf08872c3b22cc60ec9968168a9b1935b8835b8ec2a717\": rpc error: code = NotFound desc = could not find container \"38fa34c6a96be3dad0bf08872c3b22cc60ec9968168a9b1935b8835b8ec2a717\": container with ID starting with 38fa34c6a96be3dad0bf08872c3b22cc60ec9968168a9b1935b8835b8ec2a717 not found: ID does not exist"
Apr 22 14:32:38.157870 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:38.157847 2576 scope.go:117] "RemoveContainer" containerID="c4d03f86d573c6232e83b3da97cf07752f0e26028d2913bf426344327e41e35b"
Apr 22 14:32:38.158096 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:32:38.158077 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4d03f86d573c6232e83b3da97cf07752f0e26028d2913bf426344327e41e35b\": container with ID starting with c4d03f86d573c6232e83b3da97cf07752f0e26028d2913bf426344327e41e35b not found: ID does not exist" containerID="c4d03f86d573c6232e83b3da97cf07752f0e26028d2913bf426344327e41e35b"
Apr 22 14:32:38.158152 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:38.158105 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4d03f86d573c6232e83b3da97cf07752f0e26028d2913bf426344327e41e35b"} err="failed to get container status \"c4d03f86d573c6232e83b3da97cf07752f0e26028d2913bf426344327e41e35b\": rpc error: code = NotFound desc = could not find container \"c4d03f86d573c6232e83b3da97cf07752f0e26028d2913bf426344327e41e35b\": container with ID starting with c4d03f86d573c6232e83b3da97cf07752f0e26028d2913bf426344327e41e35b not found: ID does not exist"
Apr 22 14:32:38.169044 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:38.169019 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8"]
Apr 22 14:32:38.174469 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:38.174449 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-hdbp8"]
Apr 22 14:32:38.291764 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:32:38.291740 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bee471a-52ae-4181-93bb-bd953988823e" path="/var/lib/kubelet/pods/6bee471a-52ae-4181-93bb-bd953988823e/volumes"
Apr 22 14:34:55.530146 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:34:55.530112 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb" event={"ID":"e023b377-12a1-45e8-af89-d6b4351ea563","Type":"ContainerStarted","Data":"1dcae323b7b750dc240b8f0877a1908807b690d7a41320fde963f378eb5c4c7a"}
Apr 22 14:34:55.530594 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:34:55.530168 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb"
Apr 22 14:34:55.557157 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:34:55.557115 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb" podStartSLOduration=5.492124696 podStartE2EDuration="2m22.557098998s" podCreationTimestamp="2026-04-22 14:32:33 +0000 UTC" firstStartedPulling="2026-04-22 14:32:38.140493447 +0000 UTC m=+1028.466227841" lastFinishedPulling="2026-04-22 14:34:55.205467742 +0000 UTC m=+1165.531202143" observedRunningTime="2026-04-22 14:34:55.5567436 +0000 UTC m=+1165.882478019" watchObservedRunningTime="2026-04-22 14:34:55.557098998 +0000 UTC m=+1165.882833418"
Apr 22 14:35:26.538734 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:26.538700 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb"
Apr 22 14:35:30.960636 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:30.960604 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log"
Apr 22 14:35:30.962182 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:30.962157 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log"
Apr 22 14:35:33.736831 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:33.736792 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb"]
Apr 22 14:35:33.737263 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:33.737056 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb" podUID="e023b377-12a1-45e8-af89-d6b4351ea563" containerName="kserve-container" containerID="cri-o://1dcae323b7b750dc240b8f0877a1908807b690d7a41320fde963f378eb5c4c7a" gracePeriod=30
Apr 22 14:35:33.831446 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:33.831417 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r"]
Apr 22 14:35:33.831691 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:33.831679 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bee471a-52ae-4181-93bb-bd953988823e" containerName="storage-initializer"
Apr 22 14:35:33.831732 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:33.831693 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bee471a-52ae-4181-93bb-bd953988823e" containerName="storage-initializer"
Apr 22 14:35:33.831732 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:33.831700 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bee471a-52ae-4181-93bb-bd953988823e" containerName="kserve-container"
Apr 22 14:35:33.831732 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:33.831705 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bee471a-52ae-4181-93bb-bd953988823e" containerName="kserve-container"
Apr 22 14:35:33.831827 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:33.831750 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6bee471a-52ae-4181-93bb-bd953988823e" containerName="kserve-container"
Apr 22 14:35:33.853647 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:33.853612 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r"]
Apr 22 14:35:33.853789 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:33.853754 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r"
Apr 22 14:35:33.922359 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:33.922325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee853580-d5f3-4b66-bb8d-30463980f664-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r\" (UID: \"ee853580-d5f3-4b66-bb8d-30463980f664\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r"
Apr 22 14:35:34.023742 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:34.023669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee853580-d5f3-4b66-bb8d-30463980f664-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r\" (UID: \"ee853580-d5f3-4b66-bb8d-30463980f664\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r"
Apr 22 14:35:34.024058 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:34.024039 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee853580-d5f3-4b66-bb8d-30463980f664-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r\" (UID: \"ee853580-d5f3-4b66-bb8d-30463980f664\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r"
Apr 22 14:35:34.162973 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:34.162947 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r"
Apr 22 14:35:34.270031 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:34.270002 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r"]
Apr 22 14:35:34.272788 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:35:34.272760 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee853580_d5f3_4b66_bb8d_30463980f664.slice/crio-308a1658fac97a9a07f17505a2410afefed59d6303880244bf3d4dfee0143967 WatchSource:0}: Error finding container 308a1658fac97a9a07f17505a2410afefed59d6303880244bf3d4dfee0143967: Status 404 returned error can't find the container with id 308a1658fac97a9a07f17505a2410afefed59d6303880244bf3d4dfee0143967
Apr 22 14:35:34.632466 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:34.632427 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r" event={"ID":"ee853580-d5f3-4b66-bb8d-30463980f664","Type":"ContainerStarted","Data":"e00ce51ecdad8ca24cb36bb4713f361466efdddaec78de8d11ce37fc2058cd16"}
Apr 22 14:35:34.632616 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:34.632476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r" event={"ID":"ee853580-d5f3-4b66-bb8d-30463980f664","Type":"ContainerStarted","Data":"308a1658fac97a9a07f17505a2410afefed59d6303880244bf3d4dfee0143967"}
Apr 22 14:35:34.634237 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:34.634212 2576 generic.go:358] "Generic (PLEG): container finished" podID="e023b377-12a1-45e8-af89-d6b4351ea563" containerID="1dcae323b7b750dc240b8f0877a1908807b690d7a41320fde963f378eb5c4c7a" exitCode=0
Apr 22 14:35:34.634345 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:34.634285 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb" event={"ID":"e023b377-12a1-45e8-af89-d6b4351ea563","Type":"ContainerDied","Data":"1dcae323b7b750dc240b8f0877a1908807b690d7a41320fde963f378eb5c4c7a"}
Apr 22 14:35:34.688071 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:34.688050 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb"
Apr 22 14:35:34.729560 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:34.729538 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e023b377-12a1-45e8-af89-d6b4351ea563-kserve-provision-location\") pod \"e023b377-12a1-45e8-af89-d6b4351ea563\" (UID: \"e023b377-12a1-45e8-af89-d6b4351ea563\") "
Apr 22 14:35:34.729800 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:34.729777 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e023b377-12a1-45e8-af89-d6b4351ea563-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e023b377-12a1-45e8-af89-d6b4351ea563" (UID: "e023b377-12a1-45e8-af89-d6b4351ea563"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:35:34.830588 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:34.830537 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e023b377-12a1-45e8-af89-d6b4351ea563-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\""
Apr 22 14:35:35.638662 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:35.638630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb" event={"ID":"e023b377-12a1-45e8-af89-d6b4351ea563","Type":"ContainerDied","Data":"c02bce5882ffb3a95bfaa3b96fab436b2d8a936bf54a4844baf98243f561fbd8"}
Apr 22 14:35:35.638828 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:35.638675 2576 scope.go:117] "RemoveContainer" containerID="1dcae323b7b750dc240b8f0877a1908807b690d7a41320fde963f378eb5c4c7a"
Apr 22 14:35:35.638828 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:35.638645 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb"
Apr 22 14:35:35.646497 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:35.646475 2576 scope.go:117] "RemoveContainer" containerID="9e3cd25ccca0b223e6208db7fbb2d14a930ede9584dbb193535469a0ce442f79"
Apr 22 14:35:35.659862 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:35.659838 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb"]
Apr 22 14:35:35.663014 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:35.662995 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-tzrbb"]
Apr 22 14:35:36.291665 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:36.291636 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e023b377-12a1-45e8-af89-d6b4351ea563" path="/var/lib/kubelet/pods/e023b377-12a1-45e8-af89-d6b4351ea563/volumes"
Apr 22 14:35:38.648546 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:38.648520 2576 generic.go:358] "Generic (PLEG): container finished" podID="ee853580-d5f3-4b66-bb8d-30463980f664" containerID="e00ce51ecdad8ca24cb36bb4713f361466efdddaec78de8d11ce37fc2058cd16" exitCode=0
Apr 22 14:35:38.648899 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:38.648599 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r" event={"ID":"ee853580-d5f3-4b66-bb8d-30463980f664","Type":"ContainerDied","Data":"e00ce51ecdad8ca24cb36bb4713f361466efdddaec78de8d11ce37fc2058cd16"}
Apr 22 14:35:39.652625 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:39.652593 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r" event={"ID":"ee853580-d5f3-4b66-bb8d-30463980f664","Type":"ContainerStarted","Data":"cacfa4a0b82ec4e179208106967ea6882fde8a654d11ceda97838e526163ddd0"}
Apr 22 14:35:39.653095 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:39.652922 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r"
Apr 22 14:35:39.654341 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:39.654313 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r" podUID="ee853580-d5f3-4b66-bb8d-30463980f664" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 22 14:35:39.671694 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:39.671647 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r" podStartSLOduration=6.671630342 podStartE2EDuration="6.671630342s" podCreationTimestamp="2026-04-22 14:35:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:35:39.670083449 +0000 UTC m=+1209.995817874" watchObservedRunningTime="2026-04-22 14:35:39.671630342 +0000 UTC m=+1209.997364757"
Apr 22 14:35:40.655602 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:40.655570 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r" podUID="ee853580-d5f3-4b66-bb8d-30463980f664" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 22 14:35:50.656727 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:50.656687 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r"
Apr 22 14:35:53.888908 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:53.888871 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r"]
Apr 22 14:35:53.889340 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:53.889106 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r" podUID="ee853580-d5f3-4b66-bb8d-30463980f664" containerName="kserve-container" containerID="cri-o://cacfa4a0b82ec4e179208106967ea6882fde8a654d11ceda97838e526163ddd0" gracePeriod=30
Apr 22 14:35:53.972328 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:53.972299 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn"]
Apr 22 14:35:53.972546 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:53.972535 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e023b377-12a1-45e8-af89-d6b4351ea563" containerName="storage-initializer"
Apr 22 14:35:53.972592 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:53.972548 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e023b377-12a1-45e8-af89-d6b4351ea563" containerName="storage-initializer"
Apr 22 14:35:53.972592 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:53.972561 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e023b377-12a1-45e8-af89-d6b4351ea563" containerName="kserve-container"
Apr 22 14:35:53.972592 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:53.972566 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e023b377-12a1-45e8-af89-d6b4351ea563" containerName="kserve-container"
Apr 22 14:35:53.972684 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:53.972614 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e023b377-12a1-45e8-af89-d6b4351ea563" containerName="kserve-container"
Apr 22 14:35:53.974586 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:53.974566 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn" Apr 22 14:35:53.988939 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:53.988921 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn"] Apr 22 14:35:54.061318 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.061295 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2da5352d-b38f-4cb3-9a44-1261dff54aa7-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn\" (UID: \"2da5352d-b38f-4cb3-9a44-1261dff54aa7\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn" Apr 22 14:35:54.161914 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.161830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2da5352d-b38f-4cb3-9a44-1261dff54aa7-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn\" (UID: \"2da5352d-b38f-4cb3-9a44-1261dff54aa7\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn" Apr 22 14:35:54.162175 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.162158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2da5352d-b38f-4cb3-9a44-1261dff54aa7-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn\" (UID: \"2da5352d-b38f-4cb3-9a44-1261dff54aa7\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn" Apr 22 14:35:54.283677 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.283645 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn" Apr 22 14:35:54.404687 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.404665 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn"] Apr 22 14:35:54.406858 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:35:54.406828 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2da5352d_b38f_4cb3_9a44_1261dff54aa7.slice/crio-273bec5ba797dd3716bd948145e2df4744ce1dc8c938909511683a14802701c4 WatchSource:0}: Error finding container 273bec5ba797dd3716bd948145e2df4744ce1dc8c938909511683a14802701c4: Status 404 returned error can't find the container with id 273bec5ba797dd3716bd948145e2df4744ce1dc8c938909511683a14802701c4 Apr 22 14:35:54.517582 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.517565 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r" Apr 22 14:35:54.665707 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.665679 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee853580-d5f3-4b66-bb8d-30463980f664-kserve-provision-location\") pod \"ee853580-d5f3-4b66-bb8d-30463980f664\" (UID: \"ee853580-d5f3-4b66-bb8d-30463980f664\") " Apr 22 14:35:54.665990 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.665970 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee853580-d5f3-4b66-bb8d-30463980f664-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ee853580-d5f3-4b66-bb8d-30463980f664" (UID: "ee853580-d5f3-4b66-bb8d-30463980f664"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:35:54.692903 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.692841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn" event={"ID":"2da5352d-b38f-4cb3-9a44-1261dff54aa7","Type":"ContainerStarted","Data":"8be3568ea6c67ec51efcdf4902a4b041c1e4ee4daf4fbec96007c54161cd3772"} Apr 22 14:35:54.692903 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.692882 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn" event={"ID":"2da5352d-b38f-4cb3-9a44-1261dff54aa7","Type":"ContainerStarted","Data":"273bec5ba797dd3716bd948145e2df4744ce1dc8c938909511683a14802701c4"} Apr 22 14:35:54.694097 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.694078 2576 generic.go:358] "Generic (PLEG): container finished" podID="ee853580-d5f3-4b66-bb8d-30463980f664" containerID="cacfa4a0b82ec4e179208106967ea6882fde8a654d11ceda97838e526163ddd0" exitCode=0 Apr 22 14:35:54.694191 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.694130 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r" Apr 22 14:35:54.694191 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.694158 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r" event={"ID":"ee853580-d5f3-4b66-bb8d-30463980f664","Type":"ContainerDied","Data":"cacfa4a0b82ec4e179208106967ea6882fde8a654d11ceda97838e526163ddd0"} Apr 22 14:35:54.694256 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.694203 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r" event={"ID":"ee853580-d5f3-4b66-bb8d-30463980f664","Type":"ContainerDied","Data":"308a1658fac97a9a07f17505a2410afefed59d6303880244bf3d4dfee0143967"} Apr 22 14:35:54.694256 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.694223 2576 scope.go:117] "RemoveContainer" containerID="cacfa4a0b82ec4e179208106967ea6882fde8a654d11ceda97838e526163ddd0" Apr 22 14:35:54.701661 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.701645 2576 scope.go:117] "RemoveContainer" containerID="e00ce51ecdad8ca24cb36bb4713f361466efdddaec78de8d11ce37fc2058cd16" Apr 22 14:35:54.707868 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.707852 2576 scope.go:117] "RemoveContainer" containerID="cacfa4a0b82ec4e179208106967ea6882fde8a654d11ceda97838e526163ddd0" Apr 22 14:35:54.708134 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:35:54.708115 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cacfa4a0b82ec4e179208106967ea6882fde8a654d11ceda97838e526163ddd0\": container with ID starting with cacfa4a0b82ec4e179208106967ea6882fde8a654d11ceda97838e526163ddd0 not found: ID does not exist" containerID="cacfa4a0b82ec4e179208106967ea6882fde8a654d11ceda97838e526163ddd0" Apr 22 14:35:54.708204 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.708139 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cacfa4a0b82ec4e179208106967ea6882fde8a654d11ceda97838e526163ddd0"} err="failed to get container status \"cacfa4a0b82ec4e179208106967ea6882fde8a654d11ceda97838e526163ddd0\": rpc error: code = NotFound desc = could not find container \"cacfa4a0b82ec4e179208106967ea6882fde8a654d11ceda97838e526163ddd0\": container with ID starting with cacfa4a0b82ec4e179208106967ea6882fde8a654d11ceda97838e526163ddd0 not found: ID does not exist" Apr 22 14:35:54.708204 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.708155 2576 scope.go:117] "RemoveContainer" containerID="e00ce51ecdad8ca24cb36bb4713f361466efdddaec78de8d11ce37fc2058cd16" Apr 22 14:35:54.708380 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:35:54.708366 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e00ce51ecdad8ca24cb36bb4713f361466efdddaec78de8d11ce37fc2058cd16\": container with ID starting with e00ce51ecdad8ca24cb36bb4713f361466efdddaec78de8d11ce37fc2058cd16 not found: ID does not exist" containerID="e00ce51ecdad8ca24cb36bb4713f361466efdddaec78de8d11ce37fc2058cd16" Apr 22 14:35:54.708419 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.708384 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e00ce51ecdad8ca24cb36bb4713f361466efdddaec78de8d11ce37fc2058cd16"} err="failed to get container status \"e00ce51ecdad8ca24cb36bb4713f361466efdddaec78de8d11ce37fc2058cd16\": rpc error: code = NotFound desc = could not find container \"e00ce51ecdad8ca24cb36bb4713f361466efdddaec78de8d11ce37fc2058cd16\": container with ID starting with e00ce51ecdad8ca24cb36bb4713f361466efdddaec78de8d11ce37fc2058cd16 not found: ID does not exist" Apr 22 14:35:54.748308 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.748284 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r"] Apr 22 14:35:54.756915 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.756895 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-88x6r"] Apr 22 14:35:54.767031 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:54.767013 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee853580-d5f3-4b66-bb8d-30463980f664-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:35:56.292442 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:56.292400 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee853580-d5f3-4b66-bb8d-30463980f664" path="/var/lib/kubelet/pods/ee853580-d5f3-4b66-bb8d-30463980f664/volumes" Apr 22 14:35:58.709836 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:58.709752 2576 generic.go:358] "Generic (PLEG): container finished" podID="2da5352d-b38f-4cb3-9a44-1261dff54aa7" containerID="8be3568ea6c67ec51efcdf4902a4b041c1e4ee4daf4fbec96007c54161cd3772" exitCode=0 Apr 22 14:35:58.710237 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:58.709844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn" event={"ID":"2da5352d-b38f-4cb3-9a44-1261dff54aa7","Type":"ContainerDied","Data":"8be3568ea6c67ec51efcdf4902a4b041c1e4ee4daf4fbec96007c54161cd3772"} Apr 22 14:35:59.714498 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:59.714467 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn" event={"ID":"2da5352d-b38f-4cb3-9a44-1261dff54aa7","Type":"ContainerStarted","Data":"0516ba5f51fa0bfddc32e26d9edac55a76ef2fb308ce0e00a2384e33537ae096"} Apr 22 14:35:59.714872 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:59.714677 2576 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn" Apr 22 14:35:59.733404 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:35:59.733366 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn" podStartSLOduration=6.733354759 podStartE2EDuration="6.733354759s" podCreationTimestamp="2026-04-22 14:35:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:35:59.731438697 +0000 UTC m=+1230.057173112" watchObservedRunningTime="2026-04-22 14:35:59.733354759 +0000 UTC m=+1230.059089172" Apr 22 14:36:30.727916 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:30.727885 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn" Apr 22 14:36:34.128441 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:34.128409 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt"] Apr 22 14:36:34.128765 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:34.128649 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee853580-d5f3-4b66-bb8d-30463980f664" containerName="storage-initializer" Apr 22 14:36:34.128765 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:34.128659 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee853580-d5f3-4b66-bb8d-30463980f664" containerName="storage-initializer" Apr 22 14:36:34.128765 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:34.128674 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee853580-d5f3-4b66-bb8d-30463980f664" containerName="kserve-container" Apr 22 14:36:34.128765 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:34.128680 2576 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ee853580-d5f3-4b66-bb8d-30463980f664" containerName="kserve-container" Apr 22 14:36:34.128765 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:34.128734 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee853580-d5f3-4b66-bb8d-30463980f664" containerName="kserve-container" Apr 22 14:36:34.132837 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:34.132820 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" Apr 22 14:36:34.141823 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:34.141788 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt"] Apr 22 14:36:34.168975 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:34.168954 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn"] Apr 22 14:36:34.169180 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:34.169163 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn" podUID="2da5352d-b38f-4cb3-9a44-1261dff54aa7" containerName="kserve-container" containerID="cri-o://0516ba5f51fa0bfddc32e26d9edac55a76ef2fb308ce0e00a2384e33537ae096" gracePeriod=30 Apr 22 14:36:34.227443 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:34.227402 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e93c4801-7f45-49b0-a797-f1533ffb53f9-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-bd659dd86-8w4vt\" (UID: \"e93c4801-7f45-49b0-a797-f1533ffb53f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" Apr 22 14:36:34.328205 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:34.328169 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e93c4801-7f45-49b0-a797-f1533ffb53f9-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-bd659dd86-8w4vt\" (UID: \"e93c4801-7f45-49b0-a797-f1533ffb53f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" Apr 22 14:36:34.328530 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:34.328513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e93c4801-7f45-49b0-a797-f1533ffb53f9-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-bd659dd86-8w4vt\" (UID: \"e93c4801-7f45-49b0-a797-f1533ffb53f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" Apr 22 14:36:34.442317 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:34.442227 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" Apr 22 14:36:34.557482 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:34.557450 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt"] Apr 22 14:36:34.560032 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:36:34.560007 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode93c4801_7f45_49b0_a797_f1533ffb53f9.slice/crio-3b2cec548a1d2c0bd355e337e6d30260c7e0d7524fc65f06208a1c3e0728c73c WatchSource:0}: Error finding container 3b2cec548a1d2c0bd355e337e6d30260c7e0d7524fc65f06208a1c3e0728c73c: Status 404 returned error can't find the container with id 3b2cec548a1d2c0bd355e337e6d30260c7e0d7524fc65f06208a1c3e0728c73c Apr 22 14:36:34.813090 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:34.813049 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" event={"ID":"e93c4801-7f45-49b0-a797-f1533ffb53f9","Type":"ContainerStarted","Data":"9532be5a409eacf28af288f91ac056acd51354ff2a47105620bc18a8009e9b90"} Apr 22 14:36:34.813090 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:34.813095 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" event={"ID":"e93c4801-7f45-49b0-a797-f1533ffb53f9","Type":"ContainerStarted","Data":"3b2cec548a1d2c0bd355e337e6d30260c7e0d7524fc65f06208a1c3e0728c73c"} Apr 22 14:36:35.282412 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:35.282391 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn" Apr 22 14:36:35.437775 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:35.437717 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2da5352d-b38f-4cb3-9a44-1261dff54aa7-kserve-provision-location\") pod \"2da5352d-b38f-4cb3-9a44-1261dff54aa7\" (UID: \"2da5352d-b38f-4cb3-9a44-1261dff54aa7\") " Apr 22 14:36:35.438048 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:35.438026 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2da5352d-b38f-4cb3-9a44-1261dff54aa7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2da5352d-b38f-4cb3-9a44-1261dff54aa7" (UID: "2da5352d-b38f-4cb3-9a44-1261dff54aa7"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:36:35.538503 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:35.538479 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2da5352d-b38f-4cb3-9a44-1261dff54aa7-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:36:35.817404 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:35.817373 2576 generic.go:358] "Generic (PLEG): container finished" podID="2da5352d-b38f-4cb3-9a44-1261dff54aa7" containerID="0516ba5f51fa0bfddc32e26d9edac55a76ef2fb308ce0e00a2384e33537ae096" exitCode=0 Apr 22 14:36:35.817543 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:35.817444 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn" Apr 22 14:36:35.817543 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:35.817467 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn" event={"ID":"2da5352d-b38f-4cb3-9a44-1261dff54aa7","Type":"ContainerDied","Data":"0516ba5f51fa0bfddc32e26d9edac55a76ef2fb308ce0e00a2384e33537ae096"} Apr 22 14:36:35.817543 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:35.817511 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn" event={"ID":"2da5352d-b38f-4cb3-9a44-1261dff54aa7","Type":"ContainerDied","Data":"273bec5ba797dd3716bd948145e2df4744ce1dc8c938909511683a14802701c4"} Apr 22 14:36:35.817543 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:35.817527 2576 scope.go:117] "RemoveContainer" containerID="0516ba5f51fa0bfddc32e26d9edac55a76ef2fb308ce0e00a2384e33537ae096" Apr 22 14:36:35.826267 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:35.826156 2576 scope.go:117] "RemoveContainer" 
containerID="8be3568ea6c67ec51efcdf4902a4b041c1e4ee4daf4fbec96007c54161cd3772" Apr 22 14:36:35.833033 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:35.833010 2576 scope.go:117] "RemoveContainer" containerID="0516ba5f51fa0bfddc32e26d9edac55a76ef2fb308ce0e00a2384e33537ae096" Apr 22 14:36:35.833318 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:36:35.833299 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0516ba5f51fa0bfddc32e26d9edac55a76ef2fb308ce0e00a2384e33537ae096\": container with ID starting with 0516ba5f51fa0bfddc32e26d9edac55a76ef2fb308ce0e00a2384e33537ae096 not found: ID does not exist" containerID="0516ba5f51fa0bfddc32e26d9edac55a76ef2fb308ce0e00a2384e33537ae096" Apr 22 14:36:35.833383 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:35.833325 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0516ba5f51fa0bfddc32e26d9edac55a76ef2fb308ce0e00a2384e33537ae096"} err="failed to get container status \"0516ba5f51fa0bfddc32e26d9edac55a76ef2fb308ce0e00a2384e33537ae096\": rpc error: code = NotFound desc = could not find container \"0516ba5f51fa0bfddc32e26d9edac55a76ef2fb308ce0e00a2384e33537ae096\": container with ID starting with 0516ba5f51fa0bfddc32e26d9edac55a76ef2fb308ce0e00a2384e33537ae096 not found: ID does not exist" Apr 22 14:36:35.833383 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:35.833342 2576 scope.go:117] "RemoveContainer" containerID="8be3568ea6c67ec51efcdf4902a4b041c1e4ee4daf4fbec96007c54161cd3772" Apr 22 14:36:35.833586 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:36:35.833565 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be3568ea6c67ec51efcdf4902a4b041c1e4ee4daf4fbec96007c54161cd3772\": container with ID starting with 8be3568ea6c67ec51efcdf4902a4b041c1e4ee4daf4fbec96007c54161cd3772 not found: ID does not exist" 
containerID="8be3568ea6c67ec51efcdf4902a4b041c1e4ee4daf4fbec96007c54161cd3772" Apr 22 14:36:35.833654 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:35.833594 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be3568ea6c67ec51efcdf4902a4b041c1e4ee4daf4fbec96007c54161cd3772"} err="failed to get container status \"8be3568ea6c67ec51efcdf4902a4b041c1e4ee4daf4fbec96007c54161cd3772\": rpc error: code = NotFound desc = could not find container \"8be3568ea6c67ec51efcdf4902a4b041c1e4ee4daf4fbec96007c54161cd3772\": container with ID starting with 8be3568ea6c67ec51efcdf4902a4b041c1e4ee4daf4fbec96007c54161cd3772 not found: ID does not exist" Apr 22 14:36:35.839384 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:35.839359 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn"] Apr 22 14:36:35.845618 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:35.845592 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-jkvwn"] Apr 22 14:36:36.292156 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:36.292130 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2da5352d-b38f-4cb3-9a44-1261dff54aa7" path="/var/lib/kubelet/pods/2da5352d-b38f-4cb3-9a44-1261dff54aa7/volumes" Apr 22 14:36:38.828526 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:38.828496 2576 generic.go:358] "Generic (PLEG): container finished" podID="e93c4801-7f45-49b0-a797-f1533ffb53f9" containerID="9532be5a409eacf28af288f91ac056acd51354ff2a47105620bc18a8009e9b90" exitCode=0 Apr 22 14:36:38.828867 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:38.828572 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" 
event={"ID":"e93c4801-7f45-49b0-a797-f1533ffb53f9","Type":"ContainerDied","Data":"9532be5a409eacf28af288f91ac056acd51354ff2a47105620bc18a8009e9b90"} Apr 22 14:36:39.834222 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:39.834182 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" event={"ID":"e93c4801-7f45-49b0-a797-f1533ffb53f9","Type":"ContainerStarted","Data":"13c71892dbf4673c93c2f4e22507b49577b41c2bb81648c6645f379bc69e54fb"} Apr 22 14:36:42.844338 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:42.844301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" event={"ID":"e93c4801-7f45-49b0-a797-f1533ffb53f9","Type":"ContainerStarted","Data":"95bc6821e88e46776718dcc88ab84c0ff5f437940f6eab5e2e510c05c17c91fd"} Apr 22 14:36:42.844667 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:42.844554 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" Apr 22 14:36:42.844667 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:42.844594 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" Apr 22 14:36:42.862629 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:36:42.862590 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" podStartSLOduration=5.92448208 podStartE2EDuration="8.86257706s" podCreationTimestamp="2026-04-22 14:36:34 +0000 UTC" firstStartedPulling="2026-04-22 14:36:38.885694742 +0000 UTC m=+1269.211429141" lastFinishedPulling="2026-04-22 14:36:41.823789723 +0000 UTC m=+1272.149524121" observedRunningTime="2026-04-22 14:36:42.861530687 +0000 UTC m=+1273.187265100" watchObservedRunningTime="2026-04-22 14:36:42.86257706 +0000 UTC m=+1273.188311473" Apr 22 
14:37:13.850862 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:13.850803 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" Apr 22 14:37:43.851967 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:43.851869 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" Apr 22 14:37:44.229844 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:44.229791 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt"] Apr 22 14:37:44.230101 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:44.230074 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" podUID="e93c4801-7f45-49b0-a797-f1533ffb53f9" containerName="kserve-container" containerID="cri-o://13c71892dbf4673c93c2f4e22507b49577b41c2bb81648c6645f379bc69e54fb" gracePeriod=30 Apr 22 14:37:44.230199 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:44.230109 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" podUID="e93c4801-7f45-49b0-a797-f1533ffb53f9" containerName="kserve-agent" containerID="cri-o://95bc6821e88e46776718dcc88ab84c0ff5f437940f6eab5e2e510c05c17c91fd" gracePeriod=30 Apr 22 14:37:44.293391 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:44.293367 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7"] Apr 22 14:37:44.293667 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:44.293652 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2da5352d-b38f-4cb3-9a44-1261dff54aa7" containerName="storage-initializer" Apr 22 14:37:44.293734 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:44.293670 2576 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2da5352d-b38f-4cb3-9a44-1261dff54aa7" containerName="storage-initializer" Apr 22 14:37:44.293734 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:44.293692 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2da5352d-b38f-4cb3-9a44-1261dff54aa7" containerName="kserve-container" Apr 22 14:37:44.293734 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:44.293703 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da5352d-b38f-4cb3-9a44-1261dff54aa7" containerName="kserve-container" Apr 22 14:37:44.293902 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:44.293778 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2da5352d-b38f-4cb3-9a44-1261dff54aa7" containerName="kserve-container" Apr 22 14:37:44.298064 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:44.298048 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" Apr 22 14:37:44.306009 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:44.305990 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7"] Apr 22 14:37:44.386694 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:44.386671 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc138f3c-910c-474e-bba4-11286b7fc3d9-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-lq2n7\" (UID: \"bc138f3c-910c-474e-bba4-11286b7fc3d9\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" Apr 22 14:37:44.487779 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:44.487716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/bc138f3c-910c-474e-bba4-11286b7fc3d9-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-lq2n7\" (UID: \"bc138f3c-910c-474e-bba4-11286b7fc3d9\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" Apr 22 14:37:44.488063 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:44.488048 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc138f3c-910c-474e-bba4-11286b7fc3d9-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-lq2n7\" (UID: \"bc138f3c-910c-474e-bba4-11286b7fc3d9\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" Apr 22 14:37:44.607627 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:44.607603 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" Apr 22 14:37:44.714339 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:44.714312 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7"] Apr 22 14:37:44.717737 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:37:44.717709 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc138f3c_910c_474e_bba4_11286b7fc3d9.slice/crio-b90bcd91bce75353f4c132e62135139e3d1d3db6e5ca73112fe03a49571df617 WatchSource:0}: Error finding container b90bcd91bce75353f4c132e62135139e3d1d3db6e5ca73112fe03a49571df617: Status 404 returned error can't find the container with id b90bcd91bce75353f4c132e62135139e3d1d3db6e5ca73112fe03a49571df617 Apr 22 14:37:44.719845 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:44.719831 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:37:45.013073 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:45.013003 2576 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" event={"ID":"bc138f3c-910c-474e-bba4-11286b7fc3d9","Type":"ContainerStarted","Data":"a6d7ae6e77e36495f85155982c33eb7a99fdf9b092fedc1be67e32c07e56a64b"} Apr 22 14:37:45.013073 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:45.013039 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" event={"ID":"bc138f3c-910c-474e-bba4-11286b7fc3d9","Type":"ContainerStarted","Data":"b90bcd91bce75353f4c132e62135139e3d1d3db6e5ca73112fe03a49571df617"} Apr 22 14:37:46.017057 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:46.017032 2576 generic.go:358] "Generic (PLEG): container finished" podID="e93c4801-7f45-49b0-a797-f1533ffb53f9" containerID="13c71892dbf4673c93c2f4e22507b49577b41c2bb81648c6645f379bc69e54fb" exitCode=0 Apr 22 14:37:46.017339 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:46.017105 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" event={"ID":"e93c4801-7f45-49b0-a797-f1533ffb53f9","Type":"ContainerDied","Data":"13c71892dbf4673c93c2f4e22507b49577b41c2bb81648c6645f379bc69e54fb"} Apr 22 14:37:50.035951 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:50.035914 2576 generic.go:358] "Generic (PLEG): container finished" podID="bc138f3c-910c-474e-bba4-11286b7fc3d9" containerID="a6d7ae6e77e36495f85155982c33eb7a99fdf9b092fedc1be67e32c07e56a64b" exitCode=0 Apr 22 14:37:50.036299 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:50.035989 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" event={"ID":"bc138f3c-910c-474e-bba4-11286b7fc3d9","Type":"ContainerDied","Data":"a6d7ae6e77e36495f85155982c33eb7a99fdf9b092fedc1be67e32c07e56a64b"} Apr 22 14:37:53.848378 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:37:53.848340 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" podUID="e93c4801-7f45-49b0-a797-f1533ffb53f9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.26:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.26:8080: connect: connection refused" Apr 22 14:38:03.075382 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:03.075350 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" event={"ID":"bc138f3c-910c-474e-bba4-11286b7fc3d9","Type":"ContainerStarted","Data":"0b55a46aabc11623943ad26f7075ae21f27ce6ce8b0c05307456488c927a1fc1"} Apr 22 14:38:03.075781 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:03.075674 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" Apr 22 14:38:03.076888 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:03.076861 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" podUID="bc138f3c-910c-474e-bba4-11286b7fc3d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 22 14:38:03.092271 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:03.092224 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" podStartSLOduration=7.017263109 podStartE2EDuration="19.092212947s" podCreationTimestamp="2026-04-22 14:37:44 +0000 UTC" firstStartedPulling="2026-04-22 14:37:50.037251673 +0000 UTC m=+1340.362986065" lastFinishedPulling="2026-04-22 14:38:02.112201495 +0000 UTC m=+1352.437935903" observedRunningTime="2026-04-22 14:38:03.091646367 +0000 UTC m=+1353.417380782" watchObservedRunningTime="2026-04-22 14:38:03.092212947 +0000 UTC m=+1353.417947360" Apr 22 14:38:03.848531 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:03.848503 2576 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" podUID="e93c4801-7f45-49b0-a797-f1533ffb53f9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.26:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.26:8080: connect: connection refused" Apr 22 14:38:04.078743 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:04.078716 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" podUID="bc138f3c-910c-474e-bba4-11286b7fc3d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 22 14:38:13.848425 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:13.848379 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" podUID="e93c4801-7f45-49b0-a797-f1533ffb53f9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.26:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.26:8080: connect: connection refused" Apr 22 14:38:13.848829 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:13.848506 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" Apr 22 14:38:14.079686 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:14.079644 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" podUID="bc138f3c-910c-474e-bba4-11286b7fc3d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 22 14:38:14.403691 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:14.403671 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" Apr 22 14:38:14.500851 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:14.500780 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e93c4801-7f45-49b0-a797-f1533ffb53f9-kserve-provision-location\") pod \"e93c4801-7f45-49b0-a797-f1533ffb53f9\" (UID: \"e93c4801-7f45-49b0-a797-f1533ffb53f9\") " Apr 22 14:38:14.501064 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:14.501044 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93c4801-7f45-49b0-a797-f1533ffb53f9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e93c4801-7f45-49b0-a797-f1533ffb53f9" (UID: "e93c4801-7f45-49b0-a797-f1533ffb53f9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:38:14.601153 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:14.601133 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e93c4801-7f45-49b0-a797-f1533ffb53f9-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:38:15.108522 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:15.108499 2576 generic.go:358] "Generic (PLEG): container finished" podID="e93c4801-7f45-49b0-a797-f1533ffb53f9" containerID="95bc6821e88e46776718dcc88ab84c0ff5f437940f6eab5e2e510c05c17c91fd" exitCode=0 Apr 22 14:38:15.108870 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:15.108575 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" Apr 22 14:38:15.108870 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:15.108592 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" event={"ID":"e93c4801-7f45-49b0-a797-f1533ffb53f9","Type":"ContainerDied","Data":"95bc6821e88e46776718dcc88ab84c0ff5f437940f6eab5e2e510c05c17c91fd"} Apr 22 14:38:15.108870 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:15.108632 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt" event={"ID":"e93c4801-7f45-49b0-a797-f1533ffb53f9","Type":"ContainerDied","Data":"3b2cec548a1d2c0bd355e337e6d30260c7e0d7524fc65f06208a1c3e0728c73c"} Apr 22 14:38:15.108870 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:15.108648 2576 scope.go:117] "RemoveContainer" containerID="95bc6821e88e46776718dcc88ab84c0ff5f437940f6eab5e2e510c05c17c91fd" Apr 22 14:38:15.116120 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:15.115894 2576 scope.go:117] "RemoveContainer" containerID="13c71892dbf4673c93c2f4e22507b49577b41c2bb81648c6645f379bc69e54fb" Apr 22 14:38:15.122324 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:15.122306 2576 scope.go:117] "RemoveContainer" containerID="9532be5a409eacf28af288f91ac056acd51354ff2a47105620bc18a8009e9b90" Apr 22 14:38:15.128718 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:15.128695 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt"] Apr 22 14:38:15.129266 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:15.129246 2576 scope.go:117] "RemoveContainer" containerID="95bc6821e88e46776718dcc88ab84c0ff5f437940f6eab5e2e510c05c17c91fd" Apr 22 14:38:15.129502 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:38:15.129483 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"95bc6821e88e46776718dcc88ab84c0ff5f437940f6eab5e2e510c05c17c91fd\": container with ID starting with 95bc6821e88e46776718dcc88ab84c0ff5f437940f6eab5e2e510c05c17c91fd not found: ID does not exist" containerID="95bc6821e88e46776718dcc88ab84c0ff5f437940f6eab5e2e510c05c17c91fd" Apr 22 14:38:15.129565 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:15.129509 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95bc6821e88e46776718dcc88ab84c0ff5f437940f6eab5e2e510c05c17c91fd"} err="failed to get container status \"95bc6821e88e46776718dcc88ab84c0ff5f437940f6eab5e2e510c05c17c91fd\": rpc error: code = NotFound desc = could not find container \"95bc6821e88e46776718dcc88ab84c0ff5f437940f6eab5e2e510c05c17c91fd\": container with ID starting with 95bc6821e88e46776718dcc88ab84c0ff5f437940f6eab5e2e510c05c17c91fd not found: ID does not exist" Apr 22 14:38:15.129565 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:15.129526 2576 scope.go:117] "RemoveContainer" containerID="13c71892dbf4673c93c2f4e22507b49577b41c2bb81648c6645f379bc69e54fb" Apr 22 14:38:15.129742 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:38:15.129726 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13c71892dbf4673c93c2f4e22507b49577b41c2bb81648c6645f379bc69e54fb\": container with ID starting with 13c71892dbf4673c93c2f4e22507b49577b41c2bb81648c6645f379bc69e54fb not found: ID does not exist" containerID="13c71892dbf4673c93c2f4e22507b49577b41c2bb81648c6645f379bc69e54fb" Apr 22 14:38:15.129781 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:15.129746 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13c71892dbf4673c93c2f4e22507b49577b41c2bb81648c6645f379bc69e54fb"} err="failed to get container status \"13c71892dbf4673c93c2f4e22507b49577b41c2bb81648c6645f379bc69e54fb\": rpc error: code = NotFound desc = could not find container 
\"13c71892dbf4673c93c2f4e22507b49577b41c2bb81648c6645f379bc69e54fb\": container with ID starting with 13c71892dbf4673c93c2f4e22507b49577b41c2bb81648c6645f379bc69e54fb not found: ID does not exist" Apr 22 14:38:15.129781 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:15.129758 2576 scope.go:117] "RemoveContainer" containerID="9532be5a409eacf28af288f91ac056acd51354ff2a47105620bc18a8009e9b90" Apr 22 14:38:15.129970 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:38:15.129951 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9532be5a409eacf28af288f91ac056acd51354ff2a47105620bc18a8009e9b90\": container with ID starting with 9532be5a409eacf28af288f91ac056acd51354ff2a47105620bc18a8009e9b90 not found: ID does not exist" containerID="9532be5a409eacf28af288f91ac056acd51354ff2a47105620bc18a8009e9b90" Apr 22 14:38:15.130008 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:15.129975 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9532be5a409eacf28af288f91ac056acd51354ff2a47105620bc18a8009e9b90"} err="failed to get container status \"9532be5a409eacf28af288f91ac056acd51354ff2a47105620bc18a8009e9b90\": rpc error: code = NotFound desc = could not find container \"9532be5a409eacf28af288f91ac056acd51354ff2a47105620bc18a8009e9b90\": container with ID starting with 9532be5a409eacf28af288f91ac056acd51354ff2a47105620bc18a8009e9b90 not found: ID does not exist" Apr 22 14:38:15.132487 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:15.132463 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-8w4vt"] Apr 22 14:38:16.291973 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:16.291941 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e93c4801-7f45-49b0-a797-f1533ffb53f9" path="/var/lib/kubelet/pods/e93c4801-7f45-49b0-a797-f1533ffb53f9/volumes" Apr 22 14:38:24.078873 
ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:24.078839 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" podUID="bc138f3c-910c-474e-bba4-11286b7fc3d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 22 14:38:34.079309 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:34.079265 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" podUID="bc138f3c-910c-474e-bba4-11286b7fc3d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 22 14:38:44.079786 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:44.079749 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" Apr 22 14:38:45.663291 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:45.663252 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7"] Apr 22 14:38:45.663639 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:45.663506 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" podUID="bc138f3c-910c-474e-bba4-11286b7fc3d9" containerName="kserve-container" containerID="cri-o://0b55a46aabc11623943ad26f7075ae21f27ce6ce8b0c05307456488c927a1fc1" gracePeriod=30 Apr 22 14:38:45.758657 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:45.758623 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h"] Apr 22 14:38:45.758920 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:45.758907 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e93c4801-7f45-49b0-a797-f1533ffb53f9" containerName="kserve-agent" Apr 22 
14:38:45.758974 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:45.758921 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93c4801-7f45-49b0-a797-f1533ffb53f9" containerName="kserve-agent" Apr 22 14:38:45.758974 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:45.758937 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e93c4801-7f45-49b0-a797-f1533ffb53f9" containerName="storage-initializer" Apr 22 14:38:45.758974 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:45.758945 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93c4801-7f45-49b0-a797-f1533ffb53f9" containerName="storage-initializer" Apr 22 14:38:45.758974 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:45.758955 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e93c4801-7f45-49b0-a797-f1533ffb53f9" containerName="kserve-container" Apr 22 14:38:45.758974 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:45.758961 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93c4801-7f45-49b0-a797-f1533ffb53f9" containerName="kserve-container" Apr 22 14:38:45.759113 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:45.759002 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e93c4801-7f45-49b0-a797-f1533ffb53f9" containerName="kserve-container" Apr 22 14:38:45.759113 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:45.759010 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e93c4801-7f45-49b0-a797-f1533ffb53f9" containerName="kserve-agent" Apr 22 14:38:45.761874 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:45.761859 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" Apr 22 14:38:45.772962 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:45.772940 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h"] Apr 22 14:38:45.899156 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:45.899130 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dac6118-1bb6-4152-af05-b056b8a86cf8-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-fxq6h\" (UID: \"0dac6118-1bb6-4152-af05-b056b8a86cf8\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" Apr 22 14:38:45.999846 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:45.999755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dac6118-1bb6-4152-af05-b056b8a86cf8-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-fxq6h\" (UID: \"0dac6118-1bb6-4152-af05-b056b8a86cf8\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" Apr 22 14:38:46.000081 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:46.000063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dac6118-1bb6-4152-af05-b056b8a86cf8-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-fxq6h\" (UID: \"0dac6118-1bb6-4152-af05-b056b8a86cf8\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" Apr 22 14:38:46.070406 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:46.070387 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" Apr 22 14:38:46.180113 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:46.180091 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h"] Apr 22 14:38:46.182553 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:38:46.182524 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dac6118_1bb6_4152_af05_b056b8a86cf8.slice/crio-bd2d626619f9e6b698163d38f162f6e55d2735cd99ad14a2a7e816eda3b2e229 WatchSource:0}: Error finding container bd2d626619f9e6b698163d38f162f6e55d2735cd99ad14a2a7e816eda3b2e229: Status 404 returned error can't find the container with id bd2d626619f9e6b698163d38f162f6e55d2735cd99ad14a2a7e816eda3b2e229 Apr 22 14:38:46.191799 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:46.191775 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" event={"ID":"0dac6118-1bb6-4152-af05-b056b8a86cf8","Type":"ContainerStarted","Data":"bd2d626619f9e6b698163d38f162f6e55d2735cd99ad14a2a7e816eda3b2e229"} Apr 22 14:38:47.196199 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:47.196161 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" event={"ID":"0dac6118-1bb6-4152-af05-b056b8a86cf8","Type":"ContainerStarted","Data":"33b78b0cccd27ce8bb78662bb2b701ac55ac1478b1be0662562dbebfae818b6c"} Apr 22 14:38:47.994734 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:47.994714 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" Apr 22 14:38:48.119780 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:48.119725 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc138f3c-910c-474e-bba4-11286b7fc3d9-kserve-provision-location\") pod \"bc138f3c-910c-474e-bba4-11286b7fc3d9\" (UID: \"bc138f3c-910c-474e-bba4-11286b7fc3d9\") " Apr 22 14:38:48.128902 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:48.128879 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc138f3c-910c-474e-bba4-11286b7fc3d9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bc138f3c-910c-474e-bba4-11286b7fc3d9" (UID: "bc138f3c-910c-474e-bba4-11286b7fc3d9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:38:48.200312 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:48.200290 2576 generic.go:358] "Generic (PLEG): container finished" podID="bc138f3c-910c-474e-bba4-11286b7fc3d9" containerID="0b55a46aabc11623943ad26f7075ae21f27ce6ce8b0c05307456488c927a1fc1" exitCode=0 Apr 22 14:38:48.200586 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:48.200348 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" Apr 22 14:38:48.200586 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:48.200378 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" event={"ID":"bc138f3c-910c-474e-bba4-11286b7fc3d9","Type":"ContainerDied","Data":"0b55a46aabc11623943ad26f7075ae21f27ce6ce8b0c05307456488c927a1fc1"} Apr 22 14:38:48.200586 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:48.200420 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7" event={"ID":"bc138f3c-910c-474e-bba4-11286b7fc3d9","Type":"ContainerDied","Data":"b90bcd91bce75353f4c132e62135139e3d1d3db6e5ca73112fe03a49571df617"} Apr 22 14:38:48.200586 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:48.200435 2576 scope.go:117] "RemoveContainer" containerID="0b55a46aabc11623943ad26f7075ae21f27ce6ce8b0c05307456488c927a1fc1" Apr 22 14:38:48.207925 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:48.207907 2576 scope.go:117] "RemoveContainer" containerID="a6d7ae6e77e36495f85155982c33eb7a99fdf9b092fedc1be67e32c07e56a64b" Apr 22 14:38:48.214410 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:48.214393 2576 scope.go:117] "RemoveContainer" containerID="0b55a46aabc11623943ad26f7075ae21f27ce6ce8b0c05307456488c927a1fc1" Apr 22 14:38:48.214652 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:38:48.214635 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b55a46aabc11623943ad26f7075ae21f27ce6ce8b0c05307456488c927a1fc1\": container with ID starting with 0b55a46aabc11623943ad26f7075ae21f27ce6ce8b0c05307456488c927a1fc1 not found: ID does not exist" containerID="0b55a46aabc11623943ad26f7075ae21f27ce6ce8b0c05307456488c927a1fc1" Apr 22 14:38:48.214695 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:48.214659 2576 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"0b55a46aabc11623943ad26f7075ae21f27ce6ce8b0c05307456488c927a1fc1"} err="failed to get container status \"0b55a46aabc11623943ad26f7075ae21f27ce6ce8b0c05307456488c927a1fc1\": rpc error: code = NotFound desc = could not find container \"0b55a46aabc11623943ad26f7075ae21f27ce6ce8b0c05307456488c927a1fc1\": container with ID starting with 0b55a46aabc11623943ad26f7075ae21f27ce6ce8b0c05307456488c927a1fc1 not found: ID does not exist" Apr 22 14:38:48.214695 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:48.214677 2576 scope.go:117] "RemoveContainer" containerID="a6d7ae6e77e36495f85155982c33eb7a99fdf9b092fedc1be67e32c07e56a64b" Apr 22 14:38:48.214880 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:38:48.214862 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6d7ae6e77e36495f85155982c33eb7a99fdf9b092fedc1be67e32c07e56a64b\": container with ID starting with a6d7ae6e77e36495f85155982c33eb7a99fdf9b092fedc1be67e32c07e56a64b not found: ID does not exist" containerID="a6d7ae6e77e36495f85155982c33eb7a99fdf9b092fedc1be67e32c07e56a64b" Apr 22 14:38:48.214937 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:48.214889 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d7ae6e77e36495f85155982c33eb7a99fdf9b092fedc1be67e32c07e56a64b"} err="failed to get container status \"a6d7ae6e77e36495f85155982c33eb7a99fdf9b092fedc1be67e32c07e56a64b\": rpc error: code = NotFound desc = could not find container \"a6d7ae6e77e36495f85155982c33eb7a99fdf9b092fedc1be67e32c07e56a64b\": container with ID starting with a6d7ae6e77e36495f85155982c33eb7a99fdf9b092fedc1be67e32c07e56a64b not found: ID does not exist" Apr 22 14:38:48.219202 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:48.219181 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7"] Apr 22 14:38:48.221041 
ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:48.221025 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc138f3c-910c-474e-bba4-11286b7fc3d9-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:38:48.223231 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:48.223210 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-lq2n7"] Apr 22 14:38:48.291082 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:48.291063 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc138f3c-910c-474e-bba4-11286b7fc3d9" path="/var/lib/kubelet/pods/bc138f3c-910c-474e-bba4-11286b7fc3d9/volumes" Apr 22 14:38:51.210836 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:51.210746 2576 generic.go:358] "Generic (PLEG): container finished" podID="0dac6118-1bb6-4152-af05-b056b8a86cf8" containerID="33b78b0cccd27ce8bb78662bb2b701ac55ac1478b1be0662562dbebfae818b6c" exitCode=0 Apr 22 14:38:51.210836 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:51.210785 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" event={"ID":"0dac6118-1bb6-4152-af05-b056b8a86cf8","Type":"ContainerDied","Data":"33b78b0cccd27ce8bb78662bb2b701ac55ac1478b1be0662562dbebfae818b6c"} Apr 22 14:38:52.215278 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:52.215242 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" event={"ID":"0dac6118-1bb6-4152-af05-b056b8a86cf8","Type":"ContainerStarted","Data":"5214875860fa8bc02925c4dc6fe74a9f0e204406f1a6f01cf52c376ae69153a1"} Apr 22 14:38:52.215716 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:52.215610 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" Apr 22 14:38:52.216934 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:52.216905 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" podUID="0dac6118-1bb6-4152-af05-b056b8a86cf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 22 14:38:52.231823 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:52.231759 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" podStartSLOduration=7.231744315 podStartE2EDuration="7.231744315s" podCreationTimestamp="2026-04-22 14:38:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:38:52.2307278 +0000 UTC m=+1402.556462213" watchObservedRunningTime="2026-04-22 14:38:52.231744315 +0000 UTC m=+1402.557478816" Apr 22 14:38:53.218011 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:38:53.217977 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" podUID="0dac6118-1bb6-4152-af05-b056b8a86cf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 22 14:39:03.218066 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:03.218024 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" podUID="0dac6118-1bb6-4152-af05-b056b8a86cf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 22 14:39:13.218603 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:13.218499 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" podUID="0dac6118-1bb6-4152-af05-b056b8a86cf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 22 14:39:23.218468 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:23.218426 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" podUID="0dac6118-1bb6-4152-af05-b056b8a86cf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 22 14:39:33.219006 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:33.218972 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" Apr 22 14:39:37.202456 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:37.202419 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h"] Apr 22 14:39:37.203333 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:37.203279 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" podUID="0dac6118-1bb6-4152-af05-b056b8a86cf8" containerName="kserve-container" containerID="cri-o://5214875860fa8bc02925c4dc6fe74a9f0e204406f1a6f01cf52c376ae69153a1" gracePeriod=30 Apr 22 14:39:37.250183 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:37.250157 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf"] Apr 22 14:39:37.250419 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:37.250408 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc138f3c-910c-474e-bba4-11286b7fc3d9" containerName="kserve-container" Apr 22 14:39:37.250462 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:37.250420 2576 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bc138f3c-910c-474e-bba4-11286b7fc3d9" containerName="kserve-container" Apr 22 14:39:37.250462 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:37.250427 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc138f3c-910c-474e-bba4-11286b7fc3d9" containerName="storage-initializer" Apr 22 14:39:37.250462 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:37.250433 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc138f3c-910c-474e-bba4-11286b7fc3d9" containerName="storage-initializer" Apr 22 14:39:37.250549 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:37.250483 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc138f3c-910c-474e-bba4-11286b7fc3d9" containerName="kserve-container" Apr 22 14:39:37.254635 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:37.254617 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" Apr 22 14:39:37.263594 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:37.263554 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf"] Apr 22 14:39:37.444416 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:37.444386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9187c1b5-55e0-48c8-88e7-d1abe4694251-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf\" (UID: \"9187c1b5-55e0-48c8-88e7-d1abe4694251\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" Apr 22 14:39:37.545585 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:37.545559 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/9187c1b5-55e0-48c8-88e7-d1abe4694251-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf\" (UID: \"9187c1b5-55e0-48c8-88e7-d1abe4694251\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" Apr 22 14:39:37.545869 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:37.545853 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9187c1b5-55e0-48c8-88e7-d1abe4694251-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf\" (UID: \"9187c1b5-55e0-48c8-88e7-d1abe4694251\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" Apr 22 14:39:37.563855 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:37.563838 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" Apr 22 14:39:37.688701 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:37.688674 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf"] Apr 22 14:39:37.691609 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:39:37.691583 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9187c1b5_55e0_48c8_88e7_d1abe4694251.slice/crio-76008399f5e1fd7e064f1df0af57309c6f0bb8a5f02d5b4bd46f3adbe665df2e WatchSource:0}: Error finding container 76008399f5e1fd7e064f1df0af57309c6f0bb8a5f02d5b4bd46f3adbe665df2e: Status 404 returned error can't find the container with id 76008399f5e1fd7e064f1df0af57309c6f0bb8a5f02d5b4bd46f3adbe665df2e Apr 22 14:39:38.339298 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:38.339266 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" 
event={"ID":"9187c1b5-55e0-48c8-88e7-d1abe4694251","Type":"ContainerStarted","Data":"575609123d4ea612fde71e6bcdac8bd9d51c3cb31dbb3a78a0a2aeff48f987e1"} Apr 22 14:39:38.339298 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:38.339299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" event={"ID":"9187c1b5-55e0-48c8-88e7-d1abe4694251","Type":"ContainerStarted","Data":"76008399f5e1fd7e064f1df0af57309c6f0bb8a5f02d5b4bd46f3adbe665df2e"} Apr 22 14:39:39.533165 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:39.533147 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" Apr 22 14:39:39.661042 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:39.660978 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dac6118-1bb6-4152-af05-b056b8a86cf8-kserve-provision-location\") pod \"0dac6118-1bb6-4152-af05-b056b8a86cf8\" (UID: \"0dac6118-1bb6-4152-af05-b056b8a86cf8\") " Apr 22 14:39:39.668596 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:39.668573 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dac6118-1bb6-4152-af05-b056b8a86cf8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0dac6118-1bb6-4152-af05-b056b8a86cf8" (UID: "0dac6118-1bb6-4152-af05-b056b8a86cf8"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:39:39.762390 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:39.762358 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dac6118-1bb6-4152-af05-b056b8a86cf8-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:39:40.345530 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:40.345505 2576 generic.go:358] "Generic (PLEG): container finished" podID="0dac6118-1bb6-4152-af05-b056b8a86cf8" containerID="5214875860fa8bc02925c4dc6fe74a9f0e204406f1a6f01cf52c376ae69153a1" exitCode=0 Apr 22 14:39:40.345620 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:40.345557 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" event={"ID":"0dac6118-1bb6-4152-af05-b056b8a86cf8","Type":"ContainerDied","Data":"5214875860fa8bc02925c4dc6fe74a9f0e204406f1a6f01cf52c376ae69153a1"} Apr 22 14:39:40.345620 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:40.345566 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" Apr 22 14:39:40.345620 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:40.345576 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h" event={"ID":"0dac6118-1bb6-4152-af05-b056b8a86cf8","Type":"ContainerDied","Data":"bd2d626619f9e6b698163d38f162f6e55d2735cd99ad14a2a7e816eda3b2e229"} Apr 22 14:39:40.345620 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:40.345592 2576 scope.go:117] "RemoveContainer" containerID="5214875860fa8bc02925c4dc6fe74a9f0e204406f1a6f01cf52c376ae69153a1" Apr 22 14:39:40.353094 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:40.353072 2576 scope.go:117] "RemoveContainer" containerID="33b78b0cccd27ce8bb78662bb2b701ac55ac1478b1be0662562dbebfae818b6c" Apr 22 14:39:40.359738 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:40.359718 2576 scope.go:117] "RemoveContainer" containerID="5214875860fa8bc02925c4dc6fe74a9f0e204406f1a6f01cf52c376ae69153a1" Apr 22 14:39:40.360018 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:39:40.359999 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5214875860fa8bc02925c4dc6fe74a9f0e204406f1a6f01cf52c376ae69153a1\": container with ID starting with 5214875860fa8bc02925c4dc6fe74a9f0e204406f1a6f01cf52c376ae69153a1 not found: ID does not exist" containerID="5214875860fa8bc02925c4dc6fe74a9f0e204406f1a6f01cf52c376ae69153a1" Apr 22 14:39:40.360122 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:40.360035 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5214875860fa8bc02925c4dc6fe74a9f0e204406f1a6f01cf52c376ae69153a1"} err="failed to get container status \"5214875860fa8bc02925c4dc6fe74a9f0e204406f1a6f01cf52c376ae69153a1\": rpc error: code = NotFound desc = could not find container 
\"5214875860fa8bc02925c4dc6fe74a9f0e204406f1a6f01cf52c376ae69153a1\": container with ID starting with 5214875860fa8bc02925c4dc6fe74a9f0e204406f1a6f01cf52c376ae69153a1 not found: ID does not exist" Apr 22 14:39:40.360122 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:40.360053 2576 scope.go:117] "RemoveContainer" containerID="33b78b0cccd27ce8bb78662bb2b701ac55ac1478b1be0662562dbebfae818b6c" Apr 22 14:39:40.360355 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:39:40.360327 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b78b0cccd27ce8bb78662bb2b701ac55ac1478b1be0662562dbebfae818b6c\": container with ID starting with 33b78b0cccd27ce8bb78662bb2b701ac55ac1478b1be0662562dbebfae818b6c not found: ID does not exist" containerID="33b78b0cccd27ce8bb78662bb2b701ac55ac1478b1be0662562dbebfae818b6c" Apr 22 14:39:40.360474 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:40.360362 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b78b0cccd27ce8bb78662bb2b701ac55ac1478b1be0662562dbebfae818b6c"} err="failed to get container status \"33b78b0cccd27ce8bb78662bb2b701ac55ac1478b1be0662562dbebfae818b6c\": rpc error: code = NotFound desc = could not find container \"33b78b0cccd27ce8bb78662bb2b701ac55ac1478b1be0662562dbebfae818b6c\": container with ID starting with 33b78b0cccd27ce8bb78662bb2b701ac55ac1478b1be0662562dbebfae818b6c not found: ID does not exist" Apr 22 14:39:40.362000 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:40.361979 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h"] Apr 22 14:39:40.365698 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:40.365677 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-fxq6h"] Apr 22 14:39:42.292288 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:42.292258 2576 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dac6118-1bb6-4152-af05-b056b8a86cf8" path="/var/lib/kubelet/pods/0dac6118-1bb6-4152-af05-b056b8a86cf8/volumes" Apr 22 14:39:42.352801 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:42.352773 2576 generic.go:358] "Generic (PLEG): container finished" podID="9187c1b5-55e0-48c8-88e7-d1abe4694251" containerID="575609123d4ea612fde71e6bcdac8bd9d51c3cb31dbb3a78a0a2aeff48f987e1" exitCode=0 Apr 22 14:39:42.352916 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:42.352847 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" event={"ID":"9187c1b5-55e0-48c8-88e7-d1abe4694251","Type":"ContainerDied","Data":"575609123d4ea612fde71e6bcdac8bd9d51c3cb31dbb3a78a0a2aeff48f987e1"} Apr 22 14:39:43.357877 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:43.357842 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" event={"ID":"9187c1b5-55e0-48c8-88e7-d1abe4694251","Type":"ContainerStarted","Data":"634b1aa3e200c5ea541b0db55af8275b23f6d2b15aa0b14087d3fbf5809588fd"} Apr 22 14:39:43.358225 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:43.358177 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" Apr 22 14:39:43.359245 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:43.359214 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" podUID="9187c1b5-55e0-48c8-88e7-d1abe4694251" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 14:39:43.375587 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:43.375545 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" podStartSLOduration=6.375532112 podStartE2EDuration="6.375532112s" podCreationTimestamp="2026-04-22 14:39:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:39:43.37359563 +0000 UTC m=+1453.699330043" watchObservedRunningTime="2026-04-22 14:39:43.375532112 +0000 UTC m=+1453.701266597" Apr 22 14:39:44.360685 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:44.360645 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" podUID="9187c1b5-55e0-48c8-88e7-d1abe4694251" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 14:39:54.361504 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:39:54.361469 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" podUID="9187c1b5-55e0-48c8-88e7-d1abe4694251" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 14:40:04.361519 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:04.361481 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" podUID="9187c1b5-55e0-48c8-88e7-d1abe4694251" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 14:40:14.361395 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:14.361352 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" podUID="9187c1b5-55e0-48c8-88e7-d1abe4694251" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 14:40:24.361937 
ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:24.361907 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" Apr 22 14:40:28.775340 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:28.775308 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf"] Apr 22 14:40:28.775787 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:28.775561 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" podUID="9187c1b5-55e0-48c8-88e7-d1abe4694251" containerName="kserve-container" containerID="cri-o://634b1aa3e200c5ea541b0db55af8275b23f6d2b15aa0b14087d3fbf5809588fd" gracePeriod=30 Apr 22 14:40:28.870537 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:28.870509 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6"] Apr 22 14:40:28.870756 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:28.870745 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0dac6118-1bb6-4152-af05-b056b8a86cf8" containerName="kserve-container" Apr 22 14:40:28.870798 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:28.870757 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dac6118-1bb6-4152-af05-b056b8a86cf8" containerName="kserve-container" Apr 22 14:40:28.870798 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:28.870779 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0dac6118-1bb6-4152-af05-b056b8a86cf8" containerName="storage-initializer" Apr 22 14:40:28.870798 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:28.870784 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dac6118-1bb6-4152-af05-b056b8a86cf8" containerName="storage-initializer" Apr 22 14:40:28.870909 ip-10-0-142-195 kubenswrapper[2576]: 
I0422 14:40:28.870862 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0dac6118-1bb6-4152-af05-b056b8a86cf8" containerName="kserve-container" Apr 22 14:40:28.873949 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:28.873926 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" Apr 22 14:40:28.882971 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:28.882952 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6"] Apr 22 14:40:28.965444 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:28.965415 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a246c193-b83b-475b-b075-f65dd1af8872-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-wqxc6\" (UID: \"a246c193-b83b-475b-b075-f65dd1af8872\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" Apr 22 14:40:29.066370 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:29.066310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a246c193-b83b-475b-b075-f65dd1af8872-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-wqxc6\" (UID: \"a246c193-b83b-475b-b075-f65dd1af8872\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" Apr 22 14:40:29.066752 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:29.066734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a246c193-b83b-475b-b075-f65dd1af8872-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-wqxc6\" (UID: \"a246c193-b83b-475b-b075-f65dd1af8872\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" Apr 22 14:40:29.182850 ip-10-0-142-195 
kubenswrapper[2576]: I0422 14:40:29.182826 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" Apr 22 14:40:29.292498 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:29.292478 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6"] Apr 22 14:40:29.294670 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:40:29.294643 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda246c193_b83b_475b_b075_f65dd1af8872.slice/crio-4640d6dccb1ec4154193aa693b6ee407acc9bc760145822df0294d184b77a17d WatchSource:0}: Error finding container 4640d6dccb1ec4154193aa693b6ee407acc9bc760145822df0294d184b77a17d: Status 404 returned error can't find the container with id 4640d6dccb1ec4154193aa693b6ee407acc9bc760145822df0294d184b77a17d Apr 22 14:40:29.471518 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:29.471488 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" event={"ID":"a246c193-b83b-475b-b075-f65dd1af8872","Type":"ContainerStarted","Data":"40ca6e4704309189bf504bc2e546a9c1caef2ca891e6dc564daf1c36f827e12d"} Apr 22 14:40:29.471518 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:29.471521 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" event={"ID":"a246c193-b83b-475b-b075-f65dd1af8872","Type":"ContainerStarted","Data":"4640d6dccb1ec4154193aa693b6ee407acc9bc760145822df0294d184b77a17d"} Apr 22 14:40:30.985624 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:30.985601 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 14:40:30.986016 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:30.985625 2576 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 14:40:31.110344 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:31.110324 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" Apr 22 14:40:31.180646 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:31.180619 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9187c1b5-55e0-48c8-88e7-d1abe4694251-kserve-provision-location\") pod \"9187c1b5-55e0-48c8-88e7-d1abe4694251\" (UID: \"9187c1b5-55e0-48c8-88e7-d1abe4694251\") " Apr 22 14:40:31.189730 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:31.189703 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9187c1b5-55e0-48c8-88e7-d1abe4694251-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9187c1b5-55e0-48c8-88e7-d1abe4694251" (UID: "9187c1b5-55e0-48c8-88e7-d1abe4694251"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:40:31.281571 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:31.281544 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9187c1b5-55e0-48c8-88e7-d1abe4694251-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:40:31.480036 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:31.480008 2576 generic.go:358] "Generic (PLEG): container finished" podID="9187c1b5-55e0-48c8-88e7-d1abe4694251" containerID="634b1aa3e200c5ea541b0db55af8275b23f6d2b15aa0b14087d3fbf5809588fd" exitCode=0 Apr 22 14:40:31.480162 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:31.480080 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" Apr 22 14:40:31.480162 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:31.480096 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" event={"ID":"9187c1b5-55e0-48c8-88e7-d1abe4694251","Type":"ContainerDied","Data":"634b1aa3e200c5ea541b0db55af8275b23f6d2b15aa0b14087d3fbf5809588fd"} Apr 22 14:40:31.480162 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:31.480134 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf" event={"ID":"9187c1b5-55e0-48c8-88e7-d1abe4694251","Type":"ContainerDied","Data":"76008399f5e1fd7e064f1df0af57309c6f0bb8a5f02d5b4bd46f3adbe665df2e"} Apr 22 14:40:31.480162 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:31.480150 2576 scope.go:117] "RemoveContainer" containerID="634b1aa3e200c5ea541b0db55af8275b23f6d2b15aa0b14087d3fbf5809588fd" Apr 22 14:40:31.487433 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:31.487416 2576 scope.go:117] "RemoveContainer" 
containerID="575609123d4ea612fde71e6bcdac8bd9d51c3cb31dbb3a78a0a2aeff48f987e1" Apr 22 14:40:31.493894 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:31.493876 2576 scope.go:117] "RemoveContainer" containerID="634b1aa3e200c5ea541b0db55af8275b23f6d2b15aa0b14087d3fbf5809588fd" Apr 22 14:40:31.494159 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:40:31.494135 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"634b1aa3e200c5ea541b0db55af8275b23f6d2b15aa0b14087d3fbf5809588fd\": container with ID starting with 634b1aa3e200c5ea541b0db55af8275b23f6d2b15aa0b14087d3fbf5809588fd not found: ID does not exist" containerID="634b1aa3e200c5ea541b0db55af8275b23f6d2b15aa0b14087d3fbf5809588fd" Apr 22 14:40:31.494209 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:31.494168 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634b1aa3e200c5ea541b0db55af8275b23f6d2b15aa0b14087d3fbf5809588fd"} err="failed to get container status \"634b1aa3e200c5ea541b0db55af8275b23f6d2b15aa0b14087d3fbf5809588fd\": rpc error: code = NotFound desc = could not find container \"634b1aa3e200c5ea541b0db55af8275b23f6d2b15aa0b14087d3fbf5809588fd\": container with ID starting with 634b1aa3e200c5ea541b0db55af8275b23f6d2b15aa0b14087d3fbf5809588fd not found: ID does not exist" Apr 22 14:40:31.494209 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:31.494184 2576 scope.go:117] "RemoveContainer" containerID="575609123d4ea612fde71e6bcdac8bd9d51c3cb31dbb3a78a0a2aeff48f987e1" Apr 22 14:40:31.494423 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:40:31.494407 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"575609123d4ea612fde71e6bcdac8bd9d51c3cb31dbb3a78a0a2aeff48f987e1\": container with ID starting with 575609123d4ea612fde71e6bcdac8bd9d51c3cb31dbb3a78a0a2aeff48f987e1 not found: ID does not exist" 
containerID="575609123d4ea612fde71e6bcdac8bd9d51c3cb31dbb3a78a0a2aeff48f987e1"
Apr 22 14:40:31.494458 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:31.494431 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"575609123d4ea612fde71e6bcdac8bd9d51c3cb31dbb3a78a0a2aeff48f987e1"} err="failed to get container status \"575609123d4ea612fde71e6bcdac8bd9d51c3cb31dbb3a78a0a2aeff48f987e1\": rpc error: code = NotFound desc = could not find container \"575609123d4ea612fde71e6bcdac8bd9d51c3cb31dbb3a78a0a2aeff48f987e1\": container with ID starting with 575609123d4ea612fde71e6bcdac8bd9d51c3cb31dbb3a78a0a2aeff48f987e1 not found: ID does not exist"
Apr 22 14:40:31.500243 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:31.500224 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf"]
Apr 22 14:40:31.503094 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:31.503074 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-vmhkf"]
Apr 22 14:40:32.291500 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:32.291480 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9187c1b5-55e0-48c8-88e7-d1abe4694251" path="/var/lib/kubelet/pods/9187c1b5-55e0-48c8-88e7-d1abe4694251/volumes"
Apr 22 14:40:33.486920 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:33.486890 2576 generic.go:358] "Generic (PLEG): container finished" podID="a246c193-b83b-475b-b075-f65dd1af8872" containerID="40ca6e4704309189bf504bc2e546a9c1caef2ca891e6dc564daf1c36f827e12d" exitCode=0
Apr 22 14:40:33.487308 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:33.486956 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" event={"ID":"a246c193-b83b-475b-b075-f65dd1af8872","Type":"ContainerDied","Data":"40ca6e4704309189bf504bc2e546a9c1caef2ca891e6dc564daf1c36f827e12d"}
Apr 22 14:40:43.515818 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:43.515772 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" event={"ID":"a246c193-b83b-475b-b075-f65dd1af8872","Type":"ContainerStarted","Data":"337cfd4aa16e88d8f612c554fea564e92e0cee280bc73aa7ec7fb4300b254860"}
Apr 22 14:40:43.534624 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:43.534583 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" podStartSLOduration=6.424618083 podStartE2EDuration="15.534571705s" podCreationTimestamp="2026-04-22 14:40:28 +0000 UTC" firstStartedPulling="2026-04-22 14:40:33.487965955 +0000 UTC m=+1503.813700349" lastFinishedPulling="2026-04-22 14:40:42.597919576 +0000 UTC m=+1512.923653971" observedRunningTime="2026-04-22 14:40:43.53316762 +0000 UTC m=+1513.858902033" watchObservedRunningTime="2026-04-22 14:40:43.534571705 +0000 UTC m=+1513.860306119"
Apr 22 14:40:53.516119 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:53.516086 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6"
Apr 22 14:40:53.517530 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:53.517498 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" podUID="a246c193-b83b-475b-b075-f65dd1af8872" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 22 14:40:53.517753 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:40:53.517730 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" podUID="a246c193-b83b-475b-b075-f65dd1af8872" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 22 14:41:03.518826 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:41:03.518775 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" podUID="a246c193-b83b-475b-b075-f65dd1af8872" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 22 14:41:13.518749 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:41:13.518710 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" podUID="a246c193-b83b-475b-b075-f65dd1af8872" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 22 14:41:23.518193 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:41:23.518155 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" podUID="a246c193-b83b-475b-b075-f65dd1af8872" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 22 14:41:33.518482 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:41:33.518442 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" podUID="a246c193-b83b-475b-b075-f65dd1af8872" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 22 14:41:43.518866 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:41:43.518803 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" podUID="a246c193-b83b-475b-b075-f65dd1af8872" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 22 14:41:53.519003 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:41:53.518973 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6"
Apr 22 14:41:59.723540 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:41:59.723509 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6"]
Apr 22 14:41:59.723917 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:41:59.723824 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" podUID="a246c193-b83b-475b-b075-f65dd1af8872" containerName="kserve-container" containerID="cri-o://337cfd4aa16e88d8f612c554fea564e92e0cee280bc73aa7ec7fb4300b254860" gracePeriod=30
Apr 22 14:41:59.820803 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:41:59.820776 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr"]
Apr 22 14:41:59.821037 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:41:59.821025 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9187c1b5-55e0-48c8-88e7-d1abe4694251" containerName="storage-initializer"
Apr 22 14:41:59.821080 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:41:59.821040 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9187c1b5-55e0-48c8-88e7-d1abe4694251" containerName="storage-initializer"
Apr 22 14:41:59.821080 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:41:59.821055 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9187c1b5-55e0-48c8-88e7-d1abe4694251" containerName="kserve-container"
Apr 22 14:41:59.821080 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:41:59.821061 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9187c1b5-55e0-48c8-88e7-d1abe4694251" containerName="kserve-container"
Apr 22 14:41:59.821165 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:41:59.821120 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9187c1b5-55e0-48c8-88e7-d1abe4694251" containerName="kserve-container"
Apr 22 14:41:59.824042 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:41:59.824022 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr"
Apr 22 14:41:59.831148 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:41:59.831125 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr"]
Apr 22 14:41:59.981445 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:41:59.981379 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d13153cd-73f4-49f4-8801-052d4ccce495-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-dsqxr\" (UID: \"d13153cd-73f4-49f4-8801-052d4ccce495\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr"
Apr 22 14:42:00.082754 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:00.082725 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d13153cd-73f4-49f4-8801-052d4ccce495-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-dsqxr\" (UID: \"d13153cd-73f4-49f4-8801-052d4ccce495\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr"
Apr 22 14:42:00.083053 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:00.083036 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d13153cd-73f4-49f4-8801-052d4ccce495-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-dsqxr\" (UID: \"d13153cd-73f4-49f4-8801-052d4ccce495\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr"
Apr 22 14:42:00.134403 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:00.134380 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr"
Apr 22 14:42:00.249593 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:00.249571 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr"]
Apr 22 14:42:00.251769 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:42:00.251735 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd13153cd_73f4_49f4_8801_052d4ccce495.slice/crio-b2bcec674fd4d29f01d8ad57d994690aee8b5d96cc5da60eccdf7def51b3b393 WatchSource:0}: Error finding container b2bcec674fd4d29f01d8ad57d994690aee8b5d96cc5da60eccdf7def51b3b393: Status 404 returned error can't find the container with id b2bcec674fd4d29f01d8ad57d994690aee8b5d96cc5da60eccdf7def51b3b393
Apr 22 14:42:00.710111 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:00.710025 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" event={"ID":"d13153cd-73f4-49f4-8801-052d4ccce495","Type":"ContainerStarted","Data":"314d2d6efedbe9273e418f5e8cfb0d9c688d33997a297ede1fbc3ee0f738e375"}
Apr 22 14:42:00.710111 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:00.710064 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" event={"ID":"d13153cd-73f4-49f4-8801-052d4ccce495","Type":"ContainerStarted","Data":"b2bcec674fd4d29f01d8ad57d994690aee8b5d96cc5da60eccdf7def51b3b393"}
Apr 22 14:42:02.857948 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:02.857929 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6"
Apr 22 14:42:03.002356 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:03.002332 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a246c193-b83b-475b-b075-f65dd1af8872-kserve-provision-location\") pod \"a246c193-b83b-475b-b075-f65dd1af8872\" (UID: \"a246c193-b83b-475b-b075-f65dd1af8872\") "
Apr 22 14:42:03.002622 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:03.002601 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a246c193-b83b-475b-b075-f65dd1af8872-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a246c193-b83b-475b-b075-f65dd1af8872" (UID: "a246c193-b83b-475b-b075-f65dd1af8872"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:42:03.103437 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:03.103416 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a246c193-b83b-475b-b075-f65dd1af8872-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\""
Apr 22 14:42:03.719058 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:03.719031 2576 generic.go:358] "Generic (PLEG): container finished" podID="a246c193-b83b-475b-b075-f65dd1af8872" containerID="337cfd4aa16e88d8f612c554fea564e92e0cee280bc73aa7ec7fb4300b254860" exitCode=0
Apr 22 14:42:03.719161 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:03.719085 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6"
Apr 22 14:42:03.719161 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:03.719116 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" event={"ID":"a246c193-b83b-475b-b075-f65dd1af8872","Type":"ContainerDied","Data":"337cfd4aa16e88d8f612c554fea564e92e0cee280bc73aa7ec7fb4300b254860"}
Apr 22 14:42:03.719161 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:03.719150 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6" event={"ID":"a246c193-b83b-475b-b075-f65dd1af8872","Type":"ContainerDied","Data":"4640d6dccb1ec4154193aa693b6ee407acc9bc760145822df0294d184b77a17d"}
Apr 22 14:42:03.719277 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:03.719164 2576 scope.go:117] "RemoveContainer" containerID="337cfd4aa16e88d8f612c554fea564e92e0cee280bc73aa7ec7fb4300b254860"
Apr 22 14:42:03.726410 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:03.726389 2576 scope.go:117] "RemoveContainer" containerID="40ca6e4704309189bf504bc2e546a9c1caef2ca891e6dc564daf1c36f827e12d"
Apr 22 14:42:03.734938 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:03.734922 2576 scope.go:117] "RemoveContainer" containerID="337cfd4aa16e88d8f612c554fea564e92e0cee280bc73aa7ec7fb4300b254860"
Apr 22 14:42:03.735182 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:42:03.735164 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"337cfd4aa16e88d8f612c554fea564e92e0cee280bc73aa7ec7fb4300b254860\": container with ID starting with 337cfd4aa16e88d8f612c554fea564e92e0cee280bc73aa7ec7fb4300b254860 not found: ID does not exist" containerID="337cfd4aa16e88d8f612c554fea564e92e0cee280bc73aa7ec7fb4300b254860"
Apr 22 14:42:03.735260 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:03.735189 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"337cfd4aa16e88d8f612c554fea564e92e0cee280bc73aa7ec7fb4300b254860"} err="failed to get container status \"337cfd4aa16e88d8f612c554fea564e92e0cee280bc73aa7ec7fb4300b254860\": rpc error: code = NotFound desc = could not find container \"337cfd4aa16e88d8f612c554fea564e92e0cee280bc73aa7ec7fb4300b254860\": container with ID starting with 337cfd4aa16e88d8f612c554fea564e92e0cee280bc73aa7ec7fb4300b254860 not found: ID does not exist"
Apr 22 14:42:03.735260 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:03.735211 2576 scope.go:117] "RemoveContainer" containerID="40ca6e4704309189bf504bc2e546a9c1caef2ca891e6dc564daf1c36f827e12d"
Apr 22 14:42:03.735445 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:42:03.735431 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40ca6e4704309189bf504bc2e546a9c1caef2ca891e6dc564daf1c36f827e12d\": container with ID starting with 40ca6e4704309189bf504bc2e546a9c1caef2ca891e6dc564daf1c36f827e12d not found: ID does not exist" containerID="40ca6e4704309189bf504bc2e546a9c1caef2ca891e6dc564daf1c36f827e12d"
Apr 22 14:42:03.735489 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:03.735449 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40ca6e4704309189bf504bc2e546a9c1caef2ca891e6dc564daf1c36f827e12d"} err="failed to get container status \"40ca6e4704309189bf504bc2e546a9c1caef2ca891e6dc564daf1c36f827e12d\": rpc error: code = NotFound desc = could not find container \"40ca6e4704309189bf504bc2e546a9c1caef2ca891e6dc564daf1c36f827e12d\": container with ID starting with 40ca6e4704309189bf504bc2e546a9c1caef2ca891e6dc564daf1c36f827e12d not found: ID does not exist"
Apr 22 14:42:03.739455 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:03.739434 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6"]
Apr 22 14:42:03.742794 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:03.742776 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-wqxc6"]
Apr 22 14:42:04.291479 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:04.291453 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a246c193-b83b-475b-b075-f65dd1af8872" path="/var/lib/kubelet/pods/a246c193-b83b-475b-b075-f65dd1af8872/volumes"
Apr 22 14:42:04.723167 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:04.723139 2576 generic.go:358] "Generic (PLEG): container finished" podID="d13153cd-73f4-49f4-8801-052d4ccce495" containerID="314d2d6efedbe9273e418f5e8cfb0d9c688d33997a297ede1fbc3ee0f738e375" exitCode=0
Apr 22 14:42:04.723286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:04.723210 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" event={"ID":"d13153cd-73f4-49f4-8801-052d4ccce495","Type":"ContainerDied","Data":"314d2d6efedbe9273e418f5e8cfb0d9c688d33997a297ede1fbc3ee0f738e375"}
Apr 22 14:42:05.729286 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:05.729252 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" event={"ID":"d13153cd-73f4-49f4-8801-052d4ccce495","Type":"ContainerStarted","Data":"a5965fa0cadb2dde735105886c40838f791ed828e91ee2e686aad040d58c288f"}
Apr 22 14:42:05.729770 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:05.729626 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr"
Apr 22 14:42:05.730942 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:05.730914 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" podUID="d13153cd-73f4-49f4-8801-052d4ccce495" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 14:42:05.748536 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:05.748484 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" podStartSLOduration=6.748467799 podStartE2EDuration="6.748467799s" podCreationTimestamp="2026-04-22 14:41:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:42:05.747237035 +0000 UTC m=+1596.072971444" watchObservedRunningTime="2026-04-22 14:42:05.748467799 +0000 UTC m=+1596.074202213"
Apr 22 14:42:06.732117 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:06.732081 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" podUID="d13153cd-73f4-49f4-8801-052d4ccce495" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 14:42:16.732270 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:16.732174 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" podUID="d13153cd-73f4-49f4-8801-052d4ccce495" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 14:42:26.732553 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:26.732502 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" podUID="d13153cd-73f4-49f4-8801-052d4ccce495" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 14:42:36.732647 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:36.732603 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" podUID="d13153cd-73f4-49f4-8801-052d4ccce495" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 14:42:46.733100 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:46.733051 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" podUID="d13153cd-73f4-49f4-8801-052d4ccce495" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 14:42:56.732704 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:42:56.732640 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" podUID="d13153cd-73f4-49f4-8801-052d4ccce495" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 14:43:06.732404 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:06.732358 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" podUID="d13153cd-73f4-49f4-8801-052d4ccce495" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 14:43:12.288532 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:12.288483 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" podUID="d13153cd-73f4-49f4-8801-052d4ccce495" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 14:43:22.292345 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:22.292312 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr"
Apr 22 14:43:30.906206 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:30.906176 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr"]
Apr 22 14:43:30.906677 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:30.906479 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" podUID="d13153cd-73f4-49f4-8801-052d4ccce495" containerName="kserve-container" containerID="cri-o://a5965fa0cadb2dde735105886c40838f791ed828e91ee2e686aad040d58c288f" gracePeriod=30
Apr 22 14:43:31.011356 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:31.011333 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v"]
Apr 22 14:43:31.011623 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:31.011611 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a246c193-b83b-475b-b075-f65dd1af8872" containerName="storage-initializer"
Apr 22 14:43:31.011686 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:31.011625 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a246c193-b83b-475b-b075-f65dd1af8872" containerName="storage-initializer"
Apr 22 14:43:31.011686 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:31.011637 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a246c193-b83b-475b-b075-f65dd1af8872" containerName="kserve-container"
Apr 22 14:43:31.011686 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:31.011642 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a246c193-b83b-475b-b075-f65dd1af8872" containerName="kserve-container"
Apr 22 14:43:31.011686 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:31.011685 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a246c193-b83b-475b-b075-f65dd1af8872" containerName="kserve-container"
Apr 22 14:43:31.014556 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:31.014540 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v"
Apr 22 14:43:31.024234 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:31.024213 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v"]
Apr 22 14:43:31.097091 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:31.097069 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b0d36be-3c72-45d5-8e08-1c2070e60de3-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v\" (UID: \"6b0d36be-3c72-45d5-8e08-1c2070e60de3\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v"
Apr 22 14:43:31.198127 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:31.198074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b0d36be-3c72-45d5-8e08-1c2070e60de3-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v\" (UID: \"6b0d36be-3c72-45d5-8e08-1c2070e60de3\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v"
Apr 22 14:43:31.198403 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:31.198386 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b0d36be-3c72-45d5-8e08-1c2070e60de3-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v\" (UID: \"6b0d36be-3c72-45d5-8e08-1c2070e60de3\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v"
Apr 22 14:43:31.323544 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:31.323524 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v"
Apr 22 14:43:31.435966 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:31.435942 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v"]
Apr 22 14:43:31.439026 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:43:31.438998 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b0d36be_3c72_45d5_8e08_1c2070e60de3.slice/crio-9624b92aaebb576210088670336142ddf9f50ec0db1e4886b101582973c61ab0 WatchSource:0}: Error finding container 9624b92aaebb576210088670336142ddf9f50ec0db1e4886b101582973c61ab0: Status 404 returned error can't find the container with id 9624b92aaebb576210088670336142ddf9f50ec0db1e4886b101582973c61ab0
Apr 22 14:43:31.440835 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:31.440804 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:43:31.965864 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:31.965830 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" event={"ID":"6b0d36be-3c72-45d5-8e08-1c2070e60de3","Type":"ContainerStarted","Data":"c0b279a1938178da073388e2eb344c45267636dd48597d497ea9985a73f798d7"}
Apr 22 14:43:31.965864 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:31.965866 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" event={"ID":"6b0d36be-3c72-45d5-8e08-1c2070e60de3","Type":"ContainerStarted","Data":"9624b92aaebb576210088670336142ddf9f50ec0db1e4886b101582973c61ab0"}
Apr 22 14:43:32.288596 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:32.288555 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" podUID="d13153cd-73f4-49f4-8801-052d4ccce495" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 14:43:33.972898 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:33.972871 2576 generic.go:358] "Generic (PLEG): container finished" podID="d13153cd-73f4-49f4-8801-052d4ccce495" containerID="a5965fa0cadb2dde735105886c40838f791ed828e91ee2e686aad040d58c288f" exitCode=0
Apr 22 14:43:33.973179 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:33.972941 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" event={"ID":"d13153cd-73f4-49f4-8801-052d4ccce495","Type":"ContainerDied","Data":"a5965fa0cadb2dde735105886c40838f791ed828e91ee2e686aad040d58c288f"}
Apr 22 14:43:34.039555 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:34.039533 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr"
Apr 22 14:43:34.118615 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:34.118570 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d13153cd-73f4-49f4-8801-052d4ccce495-kserve-provision-location\") pod \"d13153cd-73f4-49f4-8801-052d4ccce495\" (UID: \"d13153cd-73f4-49f4-8801-052d4ccce495\") "
Apr 22 14:43:34.118853 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:34.118832 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d13153cd-73f4-49f4-8801-052d4ccce495-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d13153cd-73f4-49f4-8801-052d4ccce495" (UID: "d13153cd-73f4-49f4-8801-052d4ccce495"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:43:34.219859 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:34.219836 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d13153cd-73f4-49f4-8801-052d4ccce495-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\""
Apr 22 14:43:34.977220 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:34.977193 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr"
Apr 22 14:43:34.977220 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:34.977200 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr" event={"ID":"d13153cd-73f4-49f4-8801-052d4ccce495","Type":"ContainerDied","Data":"b2bcec674fd4d29f01d8ad57d994690aee8b5d96cc5da60eccdf7def51b3b393"}
Apr 22 14:43:34.977596 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:34.977251 2576 scope.go:117] "RemoveContainer" containerID="a5965fa0cadb2dde735105886c40838f791ed828e91ee2e686aad040d58c288f"
Apr 22 14:43:34.985538 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:34.985509 2576 scope.go:117] "RemoveContainer" containerID="314d2d6efedbe9273e418f5e8cfb0d9c688d33997a297ede1fbc3ee0f738e375"
Apr 22 14:43:34.993657 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:34.993635 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr"]
Apr 22 14:43:34.997547 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:34.997521 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dsqxr"]
Apr 22 14:43:35.982386 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:35.982351 2576 generic.go:358] "Generic (PLEG): container finished" podID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" containerID="c0b279a1938178da073388e2eb344c45267636dd48597d497ea9985a73f798d7" exitCode=0
Apr 22 14:43:35.982711 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:35.982428 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" event={"ID":"6b0d36be-3c72-45d5-8e08-1c2070e60de3","Type":"ContainerDied","Data":"c0b279a1938178da073388e2eb344c45267636dd48597d497ea9985a73f798d7"}
Apr 22 14:43:36.291815 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:36.291745 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13153cd-73f4-49f4-8801-052d4ccce495" path="/var/lib/kubelet/pods/d13153cd-73f4-49f4-8801-052d4ccce495/volumes"
Apr 22 14:43:36.986922 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:36.986893 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" event={"ID":"6b0d36be-3c72-45d5-8e08-1c2070e60de3","Type":"ContainerStarted","Data":"9d0e2fa48987fd7e4a6a058e98190a06945e0a6e47c5f3ae431c0cab2888a504"}
Apr 22 14:43:36.987304 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:36.987194 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v"
Apr 22 14:43:36.988701 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:36.988673 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" podUID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 14:43:37.004934 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:37.004891 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" podStartSLOduration=7.004876225 podStartE2EDuration="7.004876225s" podCreationTimestamp="2026-04-22 14:43:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:43:37.00328273 +0000 UTC m=+1687.329017143" watchObservedRunningTime="2026-04-22 14:43:37.004876225 +0000 UTC m=+1687.330610635"
Apr 22 14:43:37.989970 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:37.989931 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" podUID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 14:43:47.990519 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:47.990474 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" podUID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 14:43:57.990623 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:43:57.990577 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" podUID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 14:44:07.990244 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:44:07.990190 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" podUID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 14:44:17.990501 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:44:17.990458 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" podUID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 14:44:27.990336 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:44:27.990290 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" podUID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 14:44:37.990367 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:44:37.990317 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" podUID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 14:44:47.990302 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:44:47.990258 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" podUID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 14:44:52.291346 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:44:52.291323 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v"
Apr 22 14:45:01.989792 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:01.989758 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v"]
Apr 22 14:45:01.990358 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:01.990148 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" podUID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" containerName="kserve-container" containerID="cri-o://9d0e2fa48987fd7e4a6a058e98190a06945e0a6e47c5f3ae431c0cab2888a504" gracePeriod=30
Apr 22 14:45:02.084590 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:02.084561 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs"]
Apr 22 14:45:02.084859 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:02.084848 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d13153cd-73f4-49f4-8801-052d4ccce495" containerName="storage-initializer"
Apr 22 14:45:02.084907 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:02.084860 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13153cd-73f4-49f4-8801-052d4ccce495" containerName="storage-initializer"
Apr 22 14:45:02.084907 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:02.084880 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d13153cd-73f4-49f4-8801-052d4ccce495" containerName="kserve-container"
Apr 22 14:45:02.084907 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:02.084885 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13153cd-73f4-49f4-8801-052d4ccce495" containerName="kserve-container"
Apr 22 14:45:02.084994 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:02.084925 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d13153cd-73f4-49f4-8801-052d4ccce495" containerName="kserve-container"
Apr 22 14:45:02.087446 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:02.087431 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" Apr 22 14:45:02.094619 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:02.094594 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs"] Apr 22 14:45:02.095435 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:02.095417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e275e1ff-0f52-4bbf-86da-7e237eb1c675-kserve-provision-location\") pod \"isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs\" (UID: \"e275e1ff-0f52-4bbf-86da-7e237eb1c675\") " pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" Apr 22 14:45:02.196307 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:02.196280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e275e1ff-0f52-4bbf-86da-7e237eb1c675-kserve-provision-location\") pod \"isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs\" (UID: \"e275e1ff-0f52-4bbf-86da-7e237eb1c675\") " pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" Apr 22 14:45:02.196607 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:02.196589 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e275e1ff-0f52-4bbf-86da-7e237eb1c675-kserve-provision-location\") pod \"isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs\" (UID: \"e275e1ff-0f52-4bbf-86da-7e237eb1c675\") " pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" Apr 22 14:45:02.288909 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:02.288844 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" podUID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 22 14:45:02.397498 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:02.397480 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" Apr 22 14:45:02.511669 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:02.511645 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs"] Apr 22 14:45:02.513735 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:45:02.513708 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode275e1ff_0f52_4bbf_86da_7e237eb1c675.slice/crio-79705e7eb7e8fdf3a01e7db86fa967bed224dafcbb45a341e5cb03c04a6f778e WatchSource:0}: Error finding container 79705e7eb7e8fdf3a01e7db86fa967bed224dafcbb45a341e5cb03c04a6f778e: Status 404 returned error can't find the container with id 79705e7eb7e8fdf3a01e7db86fa967bed224dafcbb45a341e5cb03c04a6f778e Apr 22 14:45:03.233588 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:03.233554 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" event={"ID":"e275e1ff-0f52-4bbf-86da-7e237eb1c675","Type":"ContainerStarted","Data":"e1ad64bb02660b88aada8c237607e5aa8deb0c4e6e44a7b01fd8b58ce166b56a"} Apr 22 14:45:03.233588 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:03.233589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" event={"ID":"e275e1ff-0f52-4bbf-86da-7e237eb1c675","Type":"ContainerStarted","Data":"79705e7eb7e8fdf3a01e7db86fa967bed224dafcbb45a341e5cb03c04a6f778e"} Apr 22 14:45:04.922350 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:04.922332 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" Apr 22 14:45:05.012043 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:05.011987 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b0d36be-3c72-45d5-8e08-1c2070e60de3-kserve-provision-location\") pod \"6b0d36be-3c72-45d5-8e08-1c2070e60de3\" (UID: \"6b0d36be-3c72-45d5-8e08-1c2070e60de3\") " Apr 22 14:45:05.012302 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:05.012281 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b0d36be-3c72-45d5-8e08-1c2070e60de3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6b0d36be-3c72-45d5-8e08-1c2070e60de3" (UID: "6b0d36be-3c72-45d5-8e08-1c2070e60de3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:45:05.112541 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:05.112520 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b0d36be-3c72-45d5-8e08-1c2070e60de3-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:45:05.240162 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:05.240140 2576 generic.go:358] "Generic (PLEG): container finished" podID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" containerID="9d0e2fa48987fd7e4a6a058e98190a06945e0a6e47c5f3ae431c0cab2888a504" exitCode=0 Apr 22 14:45:05.240253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:05.240196 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" Apr 22 14:45:05.240253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:05.240217 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" event={"ID":"6b0d36be-3c72-45d5-8e08-1c2070e60de3","Type":"ContainerDied","Data":"9d0e2fa48987fd7e4a6a058e98190a06945e0a6e47c5f3ae431c0cab2888a504"} Apr 22 14:45:05.240253 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:05.240241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v" event={"ID":"6b0d36be-3c72-45d5-8e08-1c2070e60de3","Type":"ContainerDied","Data":"9624b92aaebb576210088670336142ddf9f50ec0db1e4886b101582973c61ab0"} Apr 22 14:45:05.240352 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:05.240257 2576 scope.go:117] "RemoveContainer" containerID="9d0e2fa48987fd7e4a6a058e98190a06945e0a6e47c5f3ae431c0cab2888a504" Apr 22 14:45:05.247515 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:05.247499 2576 scope.go:117] "RemoveContainer" containerID="c0b279a1938178da073388e2eb344c45267636dd48597d497ea9985a73f798d7" Apr 22 14:45:05.253735 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:05.253719 2576 scope.go:117] "RemoveContainer" containerID="9d0e2fa48987fd7e4a6a058e98190a06945e0a6e47c5f3ae431c0cab2888a504" Apr 22 14:45:05.253982 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:45:05.253964 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d0e2fa48987fd7e4a6a058e98190a06945e0a6e47c5f3ae431c0cab2888a504\": container with ID starting with 9d0e2fa48987fd7e4a6a058e98190a06945e0a6e47c5f3ae431c0cab2888a504 not found: ID does not exist" containerID="9d0e2fa48987fd7e4a6a058e98190a06945e0a6e47c5f3ae431c0cab2888a504" Apr 22 14:45:05.254046 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:05.253989 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d0e2fa48987fd7e4a6a058e98190a06945e0a6e47c5f3ae431c0cab2888a504"} err="failed to get container status \"9d0e2fa48987fd7e4a6a058e98190a06945e0a6e47c5f3ae431c0cab2888a504\": rpc error: code = NotFound desc = could not find container \"9d0e2fa48987fd7e4a6a058e98190a06945e0a6e47c5f3ae431c0cab2888a504\": container with ID starting with 9d0e2fa48987fd7e4a6a058e98190a06945e0a6e47c5f3ae431c0cab2888a504 not found: ID does not exist" Apr 22 14:45:05.254046 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:05.254005 2576 scope.go:117] "RemoveContainer" containerID="c0b279a1938178da073388e2eb344c45267636dd48597d497ea9985a73f798d7" Apr 22 14:45:05.254227 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:45:05.254208 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0b279a1938178da073388e2eb344c45267636dd48597d497ea9985a73f798d7\": container with ID starting with c0b279a1938178da073388e2eb344c45267636dd48597d497ea9985a73f798d7 not found: ID does not exist" containerID="c0b279a1938178da073388e2eb344c45267636dd48597d497ea9985a73f798d7" Apr 22 14:45:05.254269 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:05.254232 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0b279a1938178da073388e2eb344c45267636dd48597d497ea9985a73f798d7"} err="failed to get container status \"c0b279a1938178da073388e2eb344c45267636dd48597d497ea9985a73f798d7\": rpc error: code = NotFound desc = could not find container \"c0b279a1938178da073388e2eb344c45267636dd48597d497ea9985a73f798d7\": container with ID starting with c0b279a1938178da073388e2eb344c45267636dd48597d497ea9985a73f798d7 not found: ID does not exist" Apr 22 14:45:05.261248 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:05.261230 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v"] Apr 22 14:45:05.266922 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:05.266870 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-qgq2v"] Apr 22 14:45:06.291878 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:06.291853 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" path="/var/lib/kubelet/pods/6b0d36be-3c72-45d5-8e08-1c2070e60de3/volumes" Apr 22 14:45:07.247221 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:07.247195 2576 generic.go:358] "Generic (PLEG): container finished" podID="e275e1ff-0f52-4bbf-86da-7e237eb1c675" containerID="e1ad64bb02660b88aada8c237607e5aa8deb0c4e6e44a7b01fd8b58ce166b56a" exitCode=0 Apr 22 14:45:07.247328 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:07.247267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" event={"ID":"e275e1ff-0f52-4bbf-86da-7e237eb1c675","Type":"ContainerDied","Data":"e1ad64bb02660b88aada8c237607e5aa8deb0c4e6e44a7b01fd8b58ce166b56a"} Apr 22 14:45:08.251856 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:08.251803 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" event={"ID":"e275e1ff-0f52-4bbf-86da-7e237eb1c675","Type":"ContainerStarted","Data":"9633b97b8a67500312884e5d00704b4c34dddcec7f91d86d069f793f97cd296b"} Apr 22 14:45:08.252288 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:08.252152 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" Apr 22 14:45:08.253270 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:08.253241 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" 
podUID="e275e1ff-0f52-4bbf-86da-7e237eb1c675" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 14:45:08.267799 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:08.267757 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" podStartSLOduration=6.267740387 podStartE2EDuration="6.267740387s" podCreationTimestamp="2026-04-22 14:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:45:08.267436588 +0000 UTC m=+1778.593171003" watchObservedRunningTime="2026-04-22 14:45:08.267740387 +0000 UTC m=+1778.593474797" Apr 22 14:45:09.255309 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:09.255273 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" podUID="e275e1ff-0f52-4bbf-86da-7e237eb1c675" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 14:45:19.255553 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:19.255517 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" podUID="e275e1ff-0f52-4bbf-86da-7e237eb1c675" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 14:45:29.256065 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:29.256023 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" podUID="e275e1ff-0f52-4bbf-86da-7e237eb1c675" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 14:45:31.004014 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:31.003984 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 14:45:31.004605 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:31.004580 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 14:45:39.256109 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:39.256067 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" podUID="e275e1ff-0f52-4bbf-86da-7e237eb1c675" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 14:45:49.255248 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:49.255210 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" podUID="e275e1ff-0f52-4bbf-86da-7e237eb1c675" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 14:45:59.255367 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:45:59.255329 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" podUID="e275e1ff-0f52-4bbf-86da-7e237eb1c675" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 14:46:09.257047 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:09.257019 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" Apr 22 14:46:12.224188 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.224150 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp"] Apr 22 
14:46:12.224530 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.224405 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" containerName="storage-initializer" Apr 22 14:46:12.224530 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.224416 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" containerName="storage-initializer" Apr 22 14:46:12.224530 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.224425 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" containerName="kserve-container" Apr 22 14:46:12.224530 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.224431 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" containerName="kserve-container" Apr 22 14:46:12.224530 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.224495 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b0d36be-3c72-45d5-8e08-1c2070e60de3" containerName="kserve-container" Apr 22 14:46:12.227330 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.227315 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp" Apr 22 14:46:12.230573 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.230550 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-4bd053\"" Apr 22 14:46:12.230703 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.230607 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-4bd053-dockercfg-s75xn\"" Apr 22 14:46:12.231735 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.231721 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 22 14:46:12.235797 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.235776 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp"] Apr 22 14:46:12.340673 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.340652 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74716d8c-c334-44dd-9a7a-45935d64b1c3-kserve-provision-location\") pod \"isvc-secondary-4bd053-predictor-6645559687-vhthp\" (UID: \"74716d8c-c334-44dd-9a7a-45935d64b1c3\") " pod="kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp" Apr 22 14:46:12.340759 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.340677 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/74716d8c-c334-44dd-9a7a-45935d64b1c3-cabundle-cert\") pod \"isvc-secondary-4bd053-predictor-6645559687-vhthp\" (UID: \"74716d8c-c334-44dd-9a7a-45935d64b1c3\") " pod="kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp" Apr 22 14:46:12.441264 ip-10-0-142-195 kubenswrapper[2576]: 
I0422 14:46:12.441237 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74716d8c-c334-44dd-9a7a-45935d64b1c3-kserve-provision-location\") pod \"isvc-secondary-4bd053-predictor-6645559687-vhthp\" (UID: \"74716d8c-c334-44dd-9a7a-45935d64b1c3\") " pod="kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp" Apr 22 14:46:12.441359 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.441272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/74716d8c-c334-44dd-9a7a-45935d64b1c3-cabundle-cert\") pod \"isvc-secondary-4bd053-predictor-6645559687-vhthp\" (UID: \"74716d8c-c334-44dd-9a7a-45935d64b1c3\") " pod="kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp" Apr 22 14:46:12.441582 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.441563 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74716d8c-c334-44dd-9a7a-45935d64b1c3-kserve-provision-location\") pod \"isvc-secondary-4bd053-predictor-6645559687-vhthp\" (UID: \"74716d8c-c334-44dd-9a7a-45935d64b1c3\") " pod="kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp" Apr 22 14:46:12.441869 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.441850 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/74716d8c-c334-44dd-9a7a-45935d64b1c3-cabundle-cert\") pod \"isvc-secondary-4bd053-predictor-6645559687-vhthp\" (UID: \"74716d8c-c334-44dd-9a7a-45935d64b1c3\") " pod="kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp" Apr 22 14:46:12.538381 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.538328 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp" Apr 22 14:46:12.652187 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:12.652164 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp"] Apr 22 14:46:12.654567 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:46:12.654536 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74716d8c_c334_44dd_9a7a_45935d64b1c3.slice/crio-0e7e73930fcd3e464fbb1405cb0ace9fac3e72fda54c190f8e90ee7ccaecd6d6 WatchSource:0}: Error finding container 0e7e73930fcd3e464fbb1405cb0ace9fac3e72fda54c190f8e90ee7ccaecd6d6: Status 404 returned error can't find the container with id 0e7e73930fcd3e464fbb1405cb0ace9fac3e72fda54c190f8e90ee7ccaecd6d6 Apr 22 14:46:13.423973 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:13.423937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp" event={"ID":"74716d8c-c334-44dd-9a7a-45935d64b1c3","Type":"ContainerStarted","Data":"6ae3ddf817419af356e8d8d10d491ec05735e407673079e91bee945fc8c3b69c"} Apr 22 14:46:13.423973 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:13.423974 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp" event={"ID":"74716d8c-c334-44dd-9a7a-45935d64b1c3","Type":"ContainerStarted","Data":"0e7e73930fcd3e464fbb1405cb0ace9fac3e72fda54c190f8e90ee7ccaecd6d6"} Apr 22 14:46:16.431255 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:16.431232 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4bd053-predictor-6645559687-vhthp_74716d8c-c334-44dd-9a7a-45935d64b1c3/storage-initializer/0.log" Apr 22 14:46:16.431573 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:16.431267 2576 generic.go:358] "Generic (PLEG): 
container finished" podID="74716d8c-c334-44dd-9a7a-45935d64b1c3" containerID="6ae3ddf817419af356e8d8d10d491ec05735e407673079e91bee945fc8c3b69c" exitCode=1 Apr 22 14:46:16.431573 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:16.431325 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp" event={"ID":"74716d8c-c334-44dd-9a7a-45935d64b1c3","Type":"ContainerDied","Data":"6ae3ddf817419af356e8d8d10d491ec05735e407673079e91bee945fc8c3b69c"} Apr 22 14:46:17.434919 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:17.434890 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4bd053-predictor-6645559687-vhthp_74716d8c-c334-44dd-9a7a-45935d64b1c3/storage-initializer/0.log" Apr 22 14:46:17.435364 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:17.434963 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp" event={"ID":"74716d8c-c334-44dd-9a7a-45935d64b1c3","Type":"ContainerStarted","Data":"223bf9060493c1cf257b6acd502168744c20f5d539e24b96b1bda059a78b089b"} Apr 22 14:46:20.443035 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:20.443013 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4bd053-predictor-6645559687-vhthp_74716d8c-c334-44dd-9a7a-45935d64b1c3/storage-initializer/1.log" Apr 22 14:46:20.443419 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:20.443388 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4bd053-predictor-6645559687-vhthp_74716d8c-c334-44dd-9a7a-45935d64b1c3/storage-initializer/0.log" Apr 22 14:46:20.443482 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:20.443423 2576 generic.go:358] "Generic (PLEG): container finished" podID="74716d8c-c334-44dd-9a7a-45935d64b1c3" containerID="223bf9060493c1cf257b6acd502168744c20f5d539e24b96b1bda059a78b089b" exitCode=1 Apr 22 
14:46:20.443482 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:20.443470 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp" event={"ID":"74716d8c-c334-44dd-9a7a-45935d64b1c3","Type":"ContainerDied","Data":"223bf9060493c1cf257b6acd502168744c20f5d539e24b96b1bda059a78b089b"} Apr 22 14:46:20.443591 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:20.443503 2576 scope.go:117] "RemoveContainer" containerID="6ae3ddf817419af356e8d8d10d491ec05735e407673079e91bee945fc8c3b69c" Apr 22 14:46:20.443788 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:20.443770 2576 scope.go:117] "RemoveContainer" containerID="6ae3ddf817419af356e8d8d10d491ec05735e407673079e91bee945fc8c3b69c" Apr 22 14:46:20.452883 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:46:20.452803 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-4bd053-predictor-6645559687-vhthp_kserve-ci-e2e-test_74716d8c-c334-44dd-9a7a-45935d64b1c3_0 in pod sandbox 0e7e73930fcd3e464fbb1405cb0ace9fac3e72fda54c190f8e90ee7ccaecd6d6 from index: no such id: '6ae3ddf817419af356e8d8d10d491ec05735e407673079e91bee945fc8c3b69c'" containerID="6ae3ddf817419af356e8d8d10d491ec05735e407673079e91bee945fc8c3b69c" Apr 22 14:46:20.452961 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:46:20.452907 2576 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-4bd053-predictor-6645559687-vhthp_kserve-ci-e2e-test_74716d8c-c334-44dd-9a7a-45935d64b1c3_0 in pod sandbox 0e7e73930fcd3e464fbb1405cb0ace9fac3e72fda54c190f8e90ee7ccaecd6d6 from index: no such id: '6ae3ddf817419af356e8d8d10d491ec05735e407673079e91bee945fc8c3b69c'; Skipping pod 
\"isvc-secondary-4bd053-predictor-6645559687-vhthp_kserve-ci-e2e-test(74716d8c-c334-44dd-9a7a-45935d64b1c3)\"" logger="UnhandledError" Apr 22 14:46:20.454172 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:46:20.454151 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-4bd053-predictor-6645559687-vhthp_kserve-ci-e2e-test(74716d8c-c334-44dd-9a7a-45935d64b1c3)\"" pod="kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp" podUID="74716d8c-c334-44dd-9a7a-45935d64b1c3" Apr 22 14:46:21.447399 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:21.447381 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4bd053-predictor-6645559687-vhthp_74716d8c-c334-44dd-9a7a-45935d64b1c3/storage-initializer/1.log" Apr 22 14:46:28.292423 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.292393 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp"] Apr 22 14:46:28.340190 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.340161 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs"] Apr 22 14:46:28.340521 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.340481 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" podUID="e275e1ff-0f52-4bbf-86da-7e237eb1c675" containerName="kserve-container" containerID="cri-o://9633b97b8a67500312884e5d00704b4c34dddcec7f91d86d069f793f97cd296b" gracePeriod=30 Apr 22 14:46:28.403461 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.403438 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52"] Apr 22 14:46:28.406677 
ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.406659 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52" Apr 22 14:46:28.409242 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.409224 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-0bbdb5\"" Apr 22 14:46:28.409319 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.409297 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-0bbdb5-dockercfg-wghch\"" Apr 22 14:46:28.412720 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.412701 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4bd053-predictor-6645559687-vhthp_74716d8c-c334-44dd-9a7a-45935d64b1c3/storage-initializer/1.log" Apr 22 14:46:28.412796 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.412769 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp" Apr 22 14:46:28.415431 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.415413 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52"] Apr 22 14:46:28.466023 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.466001 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4bd053-predictor-6645559687-vhthp_74716d8c-c334-44dd-9a7a-45935d64b1c3/storage-initializer/1.log" Apr 22 14:46:28.466130 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.466067 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp" event={"ID":"74716d8c-c334-44dd-9a7a-45935d64b1c3","Type":"ContainerDied","Data":"0e7e73930fcd3e464fbb1405cb0ace9fac3e72fda54c190f8e90ee7ccaecd6d6"} Apr 22 14:46:28.466130 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.466091 2576 scope.go:117] "RemoveContainer" containerID="223bf9060493c1cf257b6acd502168744c20f5d539e24b96b1bda059a78b089b" Apr 22 14:46:28.466130 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.466097 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp" Apr 22 14:46:28.544394 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.544340 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/74716d8c-c334-44dd-9a7a-45935d64b1c3-cabundle-cert\") pod \"74716d8c-c334-44dd-9a7a-45935d64b1c3\" (UID: \"74716d8c-c334-44dd-9a7a-45935d64b1c3\") " Apr 22 14:46:28.544394 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.544376 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74716d8c-c334-44dd-9a7a-45935d64b1c3-kserve-provision-location\") pod \"74716d8c-c334-44dd-9a7a-45935d64b1c3\" (UID: \"74716d8c-c334-44dd-9a7a-45935d64b1c3\") " Apr 22 14:46:28.544539 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.544521 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2b2a0322-8c62-4b12-bc7e-bad85de64c96-cabundle-cert\") pod \"isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52\" (UID: \"2b2a0322-8c62-4b12-bc7e-bad85de64c96\") " pod="kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52" Apr 22 14:46:28.544592 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.544565 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b2a0322-8c62-4b12-bc7e-bad85de64c96-kserve-provision-location\") pod \"isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52\" (UID: \"2b2a0322-8c62-4b12-bc7e-bad85de64c96\") " pod="kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52" Apr 22 14:46:28.544651 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.544633 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/74716d8c-c334-44dd-9a7a-45935d64b1c3-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "74716d8c-c334-44dd-9a7a-45935d64b1c3" (UID: "74716d8c-c334-44dd-9a7a-45935d64b1c3"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:46:28.544696 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.544659 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74716d8c-c334-44dd-9a7a-45935d64b1c3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "74716d8c-c334-44dd-9a7a-45935d64b1c3" (UID: "74716d8c-c334-44dd-9a7a-45935d64b1c3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:46:28.645255 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.645228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2b2a0322-8c62-4b12-bc7e-bad85de64c96-cabundle-cert\") pod \"isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52\" (UID: \"2b2a0322-8c62-4b12-bc7e-bad85de64c96\") " pod="kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52" Apr 22 14:46:28.645373 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.645262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b2a0322-8c62-4b12-bc7e-bad85de64c96-kserve-provision-location\") pod \"isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52\" (UID: \"2b2a0322-8c62-4b12-bc7e-bad85de64c96\") " pod="kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52" Apr 22 14:46:28.645373 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.645328 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74716d8c-c334-44dd-9a7a-45935d64b1c3-kserve-provision-location\") 
on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:46:28.645373 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.645338 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/74716d8c-c334-44dd-9a7a-45935d64b1c3-cabundle-cert\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:46:28.645652 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.645637 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b2a0322-8c62-4b12-bc7e-bad85de64c96-kserve-provision-location\") pod \"isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52\" (UID: \"2b2a0322-8c62-4b12-bc7e-bad85de64c96\") " pod="kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52" Apr 22 14:46:28.645765 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.645748 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2b2a0322-8c62-4b12-bc7e-bad85de64c96-cabundle-cert\") pod \"isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52\" (UID: \"2b2a0322-8c62-4b12-bc7e-bad85de64c96\") " pod="kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52" Apr 22 14:46:28.722261 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.722241 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52" Apr 22 14:46:28.811675 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.811590 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp"] Apr 22 14:46:28.814165 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.814143 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4bd053-predictor-6645559687-vhthp"] Apr 22 14:46:28.835180 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:28.835160 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52"] Apr 22 14:46:28.837904 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:46:28.837873 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b2a0322_8c62_4b12_bc7e_bad85de64c96.slice/crio-57395513c00a63a4b2b3c615e2eada1de14e00338c640cec7f26a2b6068f8a8e WatchSource:0}: Error finding container 57395513c00a63a4b2b3c615e2eada1de14e00338c640cec7f26a2b6068f8a8e: Status 404 returned error can't find the container with id 57395513c00a63a4b2b3c615e2eada1de14e00338c640cec7f26a2b6068f8a8e Apr 22 14:46:29.256062 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:29.256019 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" podUID="e275e1ff-0f52-4bbf-86da-7e237eb1c675" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 14:46:29.471439 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:29.471407 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52" 
event={"ID":"2b2a0322-8c62-4b12-bc7e-bad85de64c96","Type":"ContainerStarted","Data":"abb0c745f2af6cae5a624a18833ebc44c304ca8c7b696ddaa81e35c533677b29"} Apr 22 14:46:29.471439 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:29.471441 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52" event={"ID":"2b2a0322-8c62-4b12-bc7e-bad85de64c96","Type":"ContainerStarted","Data":"57395513c00a63a4b2b3c615e2eada1de14e00338c640cec7f26a2b6068f8a8e"} Apr 22 14:46:30.293014 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:30.292916 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74716d8c-c334-44dd-9a7a-45935d64b1c3" path="/var/lib/kubelet/pods/74716d8c-c334-44dd-9a7a-45935d64b1c3/volumes" Apr 22 14:46:31.770387 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:31.770367 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" Apr 22 14:46:31.866757 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:31.866735 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e275e1ff-0f52-4bbf-86da-7e237eb1c675-kserve-provision-location\") pod \"e275e1ff-0f52-4bbf-86da-7e237eb1c675\" (UID: \"e275e1ff-0f52-4bbf-86da-7e237eb1c675\") " Apr 22 14:46:31.866982 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:31.866962 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e275e1ff-0f52-4bbf-86da-7e237eb1c675-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e275e1ff-0f52-4bbf-86da-7e237eb1c675" (UID: "e275e1ff-0f52-4bbf-86da-7e237eb1c675"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:46:31.967929 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:31.967861 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e275e1ff-0f52-4bbf-86da-7e237eb1c675-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:46:32.479468 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:32.479445 2576 generic.go:358] "Generic (PLEG): container finished" podID="e275e1ff-0f52-4bbf-86da-7e237eb1c675" containerID="9633b97b8a67500312884e5d00704b4c34dddcec7f91d86d069f793f97cd296b" exitCode=0 Apr 22 14:46:32.479554 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:32.479483 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" event={"ID":"e275e1ff-0f52-4bbf-86da-7e237eb1c675","Type":"ContainerDied","Data":"9633b97b8a67500312884e5d00704b4c34dddcec7f91d86d069f793f97cd296b"} Apr 22 14:46:32.479554 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:32.479506 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" Apr 22 14:46:32.479554 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:32.479515 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs" event={"ID":"e275e1ff-0f52-4bbf-86da-7e237eb1c675","Type":"ContainerDied","Data":"79705e7eb7e8fdf3a01e7db86fa967bed224dafcbb45a341e5cb03c04a6f778e"} Apr 22 14:46:32.479554 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:32.479533 2576 scope.go:117] "RemoveContainer" containerID="9633b97b8a67500312884e5d00704b4c34dddcec7f91d86d069f793f97cd296b" Apr 22 14:46:32.486637 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:32.486623 2576 scope.go:117] "RemoveContainer" containerID="e1ad64bb02660b88aada8c237607e5aa8deb0c4e6e44a7b01fd8b58ce166b56a" Apr 22 14:46:32.492923 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:32.492907 2576 scope.go:117] "RemoveContainer" containerID="9633b97b8a67500312884e5d00704b4c34dddcec7f91d86d069f793f97cd296b" Apr 22 14:46:32.493145 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:46:32.493128 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9633b97b8a67500312884e5d00704b4c34dddcec7f91d86d069f793f97cd296b\": container with ID starting with 9633b97b8a67500312884e5d00704b4c34dddcec7f91d86d069f793f97cd296b not found: ID does not exist" containerID="9633b97b8a67500312884e5d00704b4c34dddcec7f91d86d069f793f97cd296b" Apr 22 14:46:32.493188 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:32.493154 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9633b97b8a67500312884e5d00704b4c34dddcec7f91d86d069f793f97cd296b"} err="failed to get container status \"9633b97b8a67500312884e5d00704b4c34dddcec7f91d86d069f793f97cd296b\": rpc error: code = NotFound desc = could not find container 
\"9633b97b8a67500312884e5d00704b4c34dddcec7f91d86d069f793f97cd296b\": container with ID starting with 9633b97b8a67500312884e5d00704b4c34dddcec7f91d86d069f793f97cd296b not found: ID does not exist" Apr 22 14:46:32.493188 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:32.493170 2576 scope.go:117] "RemoveContainer" containerID="e1ad64bb02660b88aada8c237607e5aa8deb0c4e6e44a7b01fd8b58ce166b56a" Apr 22 14:46:32.493401 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:46:32.493386 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1ad64bb02660b88aada8c237607e5aa8deb0c4e6e44a7b01fd8b58ce166b56a\": container with ID starting with e1ad64bb02660b88aada8c237607e5aa8deb0c4e6e44a7b01fd8b58ce166b56a not found: ID does not exist" containerID="e1ad64bb02660b88aada8c237607e5aa8deb0c4e6e44a7b01fd8b58ce166b56a" Apr 22 14:46:32.493450 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:32.493404 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1ad64bb02660b88aada8c237607e5aa8deb0c4e6e44a7b01fd8b58ce166b56a"} err="failed to get container status \"e1ad64bb02660b88aada8c237607e5aa8deb0c4e6e44a7b01fd8b58ce166b56a\": rpc error: code = NotFound desc = could not find container \"e1ad64bb02660b88aada8c237607e5aa8deb0c4e6e44a7b01fd8b58ce166b56a\": container with ID starting with e1ad64bb02660b88aada8c237607e5aa8deb0c4e6e44a7b01fd8b58ce166b56a not found: ID does not exist" Apr 22 14:46:32.496675 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:32.496653 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs"] Apr 22 14:46:32.500371 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:32.500352 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4bd053-predictor-5f5dcb8686-2q4qs"] Apr 22 14:46:34.292177 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:34.292148 2576 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e275e1ff-0f52-4bbf-86da-7e237eb1c675" path="/var/lib/kubelet/pods/e275e1ff-0f52-4bbf-86da-7e237eb1c675/volumes" Apr 22 14:46:34.486270 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:34.486243 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52_2b2a0322-8c62-4b12-bc7e-bad85de64c96/storage-initializer/0.log" Apr 22 14:46:34.486384 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:34.486281 2576 generic.go:358] "Generic (PLEG): container finished" podID="2b2a0322-8c62-4b12-bc7e-bad85de64c96" containerID="abb0c745f2af6cae5a624a18833ebc44c304ca8c7b696ddaa81e35c533677b29" exitCode=1 Apr 22 14:46:34.486384 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:34.486350 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52" event={"ID":"2b2a0322-8c62-4b12-bc7e-bad85de64c96","Type":"ContainerDied","Data":"abb0c745f2af6cae5a624a18833ebc44c304ca8c7b696ddaa81e35c533677b29"} Apr 22 14:46:35.489996 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:35.489970 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52_2b2a0322-8c62-4b12-bc7e-bad85de64c96/storage-initializer/0.log" Apr 22 14:46:35.490338 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:35.490058 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52" event={"ID":"2b2a0322-8c62-4b12-bc7e-bad85de64c96","Type":"ContainerStarted","Data":"0488898353fdea7a64ff661d1ec15d15f6a0c7f1c83e8d96ad9a01b9e3c62491"} Apr 22 14:46:38.439619 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.439541 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52"] Apr 22 14:46:38.440046 ip-10-0-142-195 
kubenswrapper[2576]: I0422 14:46:38.439835 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52" podUID="2b2a0322-8c62-4b12-bc7e-bad85de64c96" containerName="storage-initializer" containerID="cri-o://0488898353fdea7a64ff661d1ec15d15f6a0c7f1c83e8d96ad9a01b9e3c62491" gracePeriod=30 Apr 22 14:46:38.544907 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.544881 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz"] Apr 22 14:46:38.545193 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.545177 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74716d8c-c334-44dd-9a7a-45935d64b1c3" containerName="storage-initializer" Apr 22 14:46:38.545268 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.545196 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="74716d8c-c334-44dd-9a7a-45935d64b1c3" containerName="storage-initializer" Apr 22 14:46:38.545268 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.545207 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e275e1ff-0f52-4bbf-86da-7e237eb1c675" containerName="kserve-container" Apr 22 14:46:38.545268 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.545215 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e275e1ff-0f52-4bbf-86da-7e237eb1c675" containerName="kserve-container" Apr 22 14:46:38.545268 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.545224 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74716d8c-c334-44dd-9a7a-45935d64b1c3" containerName="storage-initializer" Apr 22 14:46:38.545268 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.545233 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="74716d8c-c334-44dd-9a7a-45935d64b1c3" containerName="storage-initializer" Apr 22 14:46:38.545498 ip-10-0-142-195 
kubenswrapper[2576]: I0422 14:46:38.545271 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e275e1ff-0f52-4bbf-86da-7e237eb1c675" containerName="storage-initializer" Apr 22 14:46:38.545498 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.545279 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e275e1ff-0f52-4bbf-86da-7e237eb1c675" containerName="storage-initializer" Apr 22 14:46:38.545498 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.545343 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e275e1ff-0f52-4bbf-86da-7e237eb1c675" containerName="kserve-container" Apr 22 14:46:38.545498 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.545355 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="74716d8c-c334-44dd-9a7a-45935d64b1c3" containerName="storage-initializer" Apr 22 14:46:38.545498 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.545364 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="74716d8c-c334-44dd-9a7a-45935d64b1c3" containerName="storage-initializer" Apr 22 14:46:38.548776 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.548758 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" Apr 22 14:46:38.551212 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.551194 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-zp85x\"" Apr 22 14:46:38.563734 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.563708 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz"] Apr 22 14:46:38.613485 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.613454 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d707-e2d2-4b5f-a313-38c5700e6374-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-c2cpz\" (UID: \"e2f1d707-e2d2-4b5f-a313-38c5700e6374\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" Apr 22 14:46:38.714095 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.714045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d707-e2d2-4b5f-a313-38c5700e6374-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-c2cpz\" (UID: \"e2f1d707-e2d2-4b5f-a313-38c5700e6374\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" Apr 22 14:46:38.714322 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.714308 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d707-e2d2-4b5f-a313-38c5700e6374-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-c2cpz\" (UID: \"e2f1d707-e2d2-4b5f-a313-38c5700e6374\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" Apr 22 
14:46:38.859171 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.859148 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" Apr 22 14:46:38.968659 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:38.968638 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz"] Apr 22 14:46:38.970790 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:46:38.970762 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2f1d707_e2d2_4b5f_a313_38c5700e6374.slice/crio-c41dfee6b2fef33867747c3c72da78a77019b9cad0b227d17e214974b47c6b6e WatchSource:0}: Error finding container c41dfee6b2fef33867747c3c72da78a77019b9cad0b227d17e214974b47c6b6e: Status 404 returned error can't find the container with id c41dfee6b2fef33867747c3c72da78a77019b9cad0b227d17e214974b47c6b6e Apr 22 14:46:39.503671 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:39.503644 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" event={"ID":"e2f1d707-e2d2-4b5f-a313-38c5700e6374","Type":"ContainerStarted","Data":"981c3bebcd0c729d153a66de70a74812f74bb8119a9e4f7f563dda74e8776884"} Apr 22 14:46:39.503989 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:39.503679 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" event={"ID":"e2f1d707-e2d2-4b5f-a313-38c5700e6374","Type":"ContainerStarted","Data":"c41dfee6b2fef33867747c3c72da78a77019b9cad0b227d17e214974b47c6b6e"} Apr 22 14:46:39.665432 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:39.665413 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52_2b2a0322-8c62-4b12-bc7e-bad85de64c96/storage-initializer/1.log" Apr 22 14:46:39.665755 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:39.665737 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52_2b2a0322-8c62-4b12-bc7e-bad85de64c96/storage-initializer/0.log" Apr 22 14:46:39.665861 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:39.665797 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52" Apr 22 14:46:39.821478 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:39.821449 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b2a0322-8c62-4b12-bc7e-bad85de64c96-kserve-provision-location\") pod \"2b2a0322-8c62-4b12-bc7e-bad85de64c96\" (UID: \"2b2a0322-8c62-4b12-bc7e-bad85de64c96\") " Apr 22 14:46:39.821568 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:39.821493 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2b2a0322-8c62-4b12-bc7e-bad85de64c96-cabundle-cert\") pod \"2b2a0322-8c62-4b12-bc7e-bad85de64c96\" (UID: \"2b2a0322-8c62-4b12-bc7e-bad85de64c96\") " Apr 22 14:46:39.821734 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:39.821717 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b2a0322-8c62-4b12-bc7e-bad85de64c96-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2b2a0322-8c62-4b12-bc7e-bad85de64c96" (UID: "2b2a0322-8c62-4b12-bc7e-bad85de64c96"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:46:39.821864 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:39.821844 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b2a0322-8c62-4b12-bc7e-bad85de64c96-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "2b2a0322-8c62-4b12-bc7e-bad85de64c96" (UID: "2b2a0322-8c62-4b12-bc7e-bad85de64c96"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:46:39.922823 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:39.922784 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b2a0322-8c62-4b12-bc7e-bad85de64c96-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:46:39.922823 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:39.922803 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2b2a0322-8c62-4b12-bc7e-bad85de64c96-cabundle-cert\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:46:40.507217 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:40.507197 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52_2b2a0322-8c62-4b12-bc7e-bad85de64c96/storage-initializer/1.log" Apr 22 14:46:40.507561 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:40.507546 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52_2b2a0322-8c62-4b12-bc7e-bad85de64c96/storage-initializer/0.log" Apr 22 14:46:40.507628 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:40.507585 2576 generic.go:358] "Generic (PLEG): container finished" podID="2b2a0322-8c62-4b12-bc7e-bad85de64c96" containerID="0488898353fdea7a64ff661d1ec15d15f6a0c7f1c83e8d96ad9a01b9e3c62491" exitCode=1 Apr 22 
14:46:40.507678 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:40.507644 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52" Apr 22 14:46:40.507725 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:40.507669 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52" event={"ID":"2b2a0322-8c62-4b12-bc7e-bad85de64c96","Type":"ContainerDied","Data":"0488898353fdea7a64ff661d1ec15d15f6a0c7f1c83e8d96ad9a01b9e3c62491"} Apr 22 14:46:40.507725 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:40.507698 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52" event={"ID":"2b2a0322-8c62-4b12-bc7e-bad85de64c96","Type":"ContainerDied","Data":"57395513c00a63a4b2b3c615e2eada1de14e00338c640cec7f26a2b6068f8a8e"} Apr 22 14:46:40.507725 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:40.507715 2576 scope.go:117] "RemoveContainer" containerID="0488898353fdea7a64ff661d1ec15d15f6a0c7f1c83e8d96ad9a01b9e3c62491" Apr 22 14:46:40.515241 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:40.515217 2576 scope.go:117] "RemoveContainer" containerID="abb0c745f2af6cae5a624a18833ebc44c304ca8c7b696ddaa81e35c533677b29" Apr 22 14:46:40.522589 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:40.522573 2576 scope.go:117] "RemoveContainer" containerID="0488898353fdea7a64ff661d1ec15d15f6a0c7f1c83e8d96ad9a01b9e3c62491" Apr 22 14:46:40.522849 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:46:40.522831 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0488898353fdea7a64ff661d1ec15d15f6a0c7f1c83e8d96ad9a01b9e3c62491\": container with ID starting with 0488898353fdea7a64ff661d1ec15d15f6a0c7f1c83e8d96ad9a01b9e3c62491 not found: ID does not exist" 
containerID="0488898353fdea7a64ff661d1ec15d15f6a0c7f1c83e8d96ad9a01b9e3c62491" Apr 22 14:46:40.522935 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:40.522853 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0488898353fdea7a64ff661d1ec15d15f6a0c7f1c83e8d96ad9a01b9e3c62491"} err="failed to get container status \"0488898353fdea7a64ff661d1ec15d15f6a0c7f1c83e8d96ad9a01b9e3c62491\": rpc error: code = NotFound desc = could not find container \"0488898353fdea7a64ff661d1ec15d15f6a0c7f1c83e8d96ad9a01b9e3c62491\": container with ID starting with 0488898353fdea7a64ff661d1ec15d15f6a0c7f1c83e8d96ad9a01b9e3c62491 not found: ID does not exist" Apr 22 14:46:40.522935 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:40.522868 2576 scope.go:117] "RemoveContainer" containerID="abb0c745f2af6cae5a624a18833ebc44c304ca8c7b696ddaa81e35c533677b29" Apr 22 14:46:40.523096 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:46:40.523078 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb0c745f2af6cae5a624a18833ebc44c304ca8c7b696ddaa81e35c533677b29\": container with ID starting with abb0c745f2af6cae5a624a18833ebc44c304ca8c7b696ddaa81e35c533677b29 not found: ID does not exist" containerID="abb0c745f2af6cae5a624a18833ebc44c304ca8c7b696ddaa81e35c533677b29" Apr 22 14:46:40.523153 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:40.523103 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb0c745f2af6cae5a624a18833ebc44c304ca8c7b696ddaa81e35c533677b29"} err="failed to get container status \"abb0c745f2af6cae5a624a18833ebc44c304ca8c7b696ddaa81e35c533677b29\": rpc error: code = NotFound desc = could not find container \"abb0c745f2af6cae5a624a18833ebc44c304ca8c7b696ddaa81e35c533677b29\": container with ID starting with abb0c745f2af6cae5a624a18833ebc44c304ca8c7b696ddaa81e35c533677b29 not found: ID does not exist" Apr 22 
14:46:40.537902 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:40.537880 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52"] Apr 22 14:46:40.540166 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:40.540146 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-0bbdb5-predictor-9f644d9f9-qlv52"] Apr 22 14:46:42.291630 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:42.291596 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b2a0322-8c62-4b12-bc7e-bad85de64c96" path="/var/lib/kubelet/pods/2b2a0322-8c62-4b12-bc7e-bad85de64c96/volumes" Apr 22 14:46:43.517289 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:43.517261 2576 generic.go:358] "Generic (PLEG): container finished" podID="e2f1d707-e2d2-4b5f-a313-38c5700e6374" containerID="981c3bebcd0c729d153a66de70a74812f74bb8119a9e4f7f563dda74e8776884" exitCode=0 Apr 22 14:46:43.517627 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:46:43.517321 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" event={"ID":"e2f1d707-e2d2-4b5f-a313-38c5700e6374","Type":"ContainerDied","Data":"981c3bebcd0c729d153a66de70a74812f74bb8119a9e4f7f563dda74e8776884"} Apr 22 14:47:03.581238 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:47:03.581180 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" event={"ID":"e2f1d707-e2d2-4b5f-a313-38c5700e6374","Type":"ContainerStarted","Data":"39bd7053dca5f1ee9f3f68567802786d2c0de4ca06461b53a7dfb6cb667e0af6"} Apr 22 14:47:03.581556 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:47:03.581448 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" Apr 22 14:47:03.582582 ip-10-0-142-195 kubenswrapper[2576]: I0422 
14:47:03.582556 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" podUID="e2f1d707-e2d2-4b5f-a313-38c5700e6374" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 22 14:47:03.597568 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:47:03.597525 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" podStartSLOduration=5.789333961 podStartE2EDuration="25.597513918s" podCreationTimestamp="2026-04-22 14:46:38 +0000 UTC" firstStartedPulling="2026-04-22 14:46:43.518383719 +0000 UTC m=+1873.844118113" lastFinishedPulling="2026-04-22 14:47:03.326563676 +0000 UTC m=+1893.652298070" observedRunningTime="2026-04-22 14:47:03.596561886 +0000 UTC m=+1893.922296301" watchObservedRunningTime="2026-04-22 14:47:03.597513918 +0000 UTC m=+1893.923248331" Apr 22 14:47:04.587321 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:47:04.587287 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" podUID="e2f1d707-e2d2-4b5f-a313-38c5700e6374" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 22 14:47:14.588098 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:47:14.588053 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" podUID="e2f1d707-e2d2-4b5f-a313-38c5700e6374" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 22 14:47:24.587442 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:47:24.587385 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" 
podUID="e2f1d707-e2d2-4b5f-a313-38c5700e6374" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 22 14:47:34.587603 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:47:34.587562 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" podUID="e2f1d707-e2d2-4b5f-a313-38c5700e6374" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 22 14:47:44.587902 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:47:44.587860 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" podUID="e2f1d707-e2d2-4b5f-a313-38c5700e6374" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 22 14:47:54.587657 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:47:54.587614 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" podUID="e2f1d707-e2d2-4b5f-a313-38c5700e6374" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 22 14:48:04.587224 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:04.587185 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" podUID="e2f1d707-e2d2-4b5f-a313-38c5700e6374" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 22 14:48:14.588014 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:14.587945 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" Apr 22 14:48:18.742114 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:18.742078 
2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz"] Apr 22 14:48:18.742544 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:18.742392 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" podUID="e2f1d707-e2d2-4b5f-a313-38c5700e6374" containerName="kserve-container" containerID="cri-o://39bd7053dca5f1ee9f3f68567802786d2c0de4ca06461b53a7dfb6cb667e0af6" gracePeriod=30 Apr 22 14:48:18.817498 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:18.817473 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w"] Apr 22 14:48:18.817732 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:18.817721 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b2a0322-8c62-4b12-bc7e-bad85de64c96" containerName="storage-initializer" Apr 22 14:48:18.817775 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:18.817734 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b2a0322-8c62-4b12-bc7e-bad85de64c96" containerName="storage-initializer" Apr 22 14:48:18.817831 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:18.817789 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b2a0322-8c62-4b12-bc7e-bad85de64c96" containerName="storage-initializer" Apr 22 14:48:18.817831 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:18.817797 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b2a0322-8c62-4b12-bc7e-bad85de64c96" containerName="storage-initializer" Apr 22 14:48:18.817894 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:18.817871 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b2a0322-8c62-4b12-bc7e-bad85de64c96" containerName="storage-initializer" Apr 22 14:48:18.817894 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:18.817878 2576 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2b2a0322-8c62-4b12-bc7e-bad85de64c96" containerName="storage-initializer" Apr 22 14:48:18.819594 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:18.819579 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" Apr 22 14:48:18.827989 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:18.827964 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w"] Apr 22 14:48:18.906451 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:18.906426 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2859e7c7-115a-47b0-89f2-5c5447400284-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-wk94w\" (UID: \"2859e7c7-115a-47b0-89f2-5c5447400284\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" Apr 22 14:48:19.007577 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:19.007527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2859e7c7-115a-47b0-89f2-5c5447400284-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-wk94w\" (UID: \"2859e7c7-115a-47b0-89f2-5c5447400284\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" Apr 22 14:48:19.007834 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:19.007803 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2859e7c7-115a-47b0-89f2-5c5447400284-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-wk94w\" (UID: \"2859e7c7-115a-47b0-89f2-5c5447400284\") " 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" Apr 22 14:48:19.129201 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:19.129179 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" Apr 22 14:48:19.243142 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:19.243083 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w"] Apr 22 14:48:19.245344 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:48:19.245307 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2859e7c7_115a_47b0_89f2_5c5447400284.slice/crio-e9777da97149b202d8bbb42c8b034efb08933b939f244bf111e5b42127123dc8 WatchSource:0}: Error finding container e9777da97149b202d8bbb42c8b034efb08933b939f244bf111e5b42127123dc8: Status 404 returned error can't find the container with id e9777da97149b202d8bbb42c8b034efb08933b939f244bf111e5b42127123dc8 Apr 22 14:48:19.787934 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:19.787894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" event={"ID":"2859e7c7-115a-47b0-89f2-5c5447400284","Type":"ContainerStarted","Data":"6baab5c53e02780981459ba0bf5fdd84664ceca49ba2e50835ef194b2b20bf6d"} Apr 22 14:48:19.787934 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:19.787935 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" event={"ID":"2859e7c7-115a-47b0-89f2-5c5447400284","Type":"ContainerStarted","Data":"e9777da97149b202d8bbb42c8b034efb08933b939f244bf111e5b42127123dc8"} Apr 22 14:48:22.474881 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:22.474861 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" Apr 22 14:48:22.531464 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:22.531435 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d707-e2d2-4b5f-a313-38c5700e6374-kserve-provision-location\") pod \"e2f1d707-e2d2-4b5f-a313-38c5700e6374\" (UID: \"e2f1d707-e2d2-4b5f-a313-38c5700e6374\") " Apr 22 14:48:22.531734 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:22.531710 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2f1d707-e2d2-4b5f-a313-38c5700e6374-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e2f1d707-e2d2-4b5f-a313-38c5700e6374" (UID: "e2f1d707-e2d2-4b5f-a313-38c5700e6374"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:48:22.632103 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:22.632053 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d707-e2d2-4b5f-a313-38c5700e6374-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:48:22.796419 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:22.796397 2576 generic.go:358] "Generic (PLEG): container finished" podID="e2f1d707-e2d2-4b5f-a313-38c5700e6374" containerID="39bd7053dca5f1ee9f3f68567802786d2c0de4ca06461b53a7dfb6cb667e0af6" exitCode=0 Apr 22 14:48:22.796505 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:22.796456 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" Apr 22 14:48:22.796505 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:22.796474 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" event={"ID":"e2f1d707-e2d2-4b5f-a313-38c5700e6374","Type":"ContainerDied","Data":"39bd7053dca5f1ee9f3f68567802786d2c0de4ca06461b53a7dfb6cb667e0af6"} Apr 22 14:48:22.796579 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:22.796510 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz" event={"ID":"e2f1d707-e2d2-4b5f-a313-38c5700e6374","Type":"ContainerDied","Data":"c41dfee6b2fef33867747c3c72da78a77019b9cad0b227d17e214974b47c6b6e"} Apr 22 14:48:22.796579 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:22.796528 2576 scope.go:117] "RemoveContainer" containerID="39bd7053dca5f1ee9f3f68567802786d2c0de4ca06461b53a7dfb6cb667e0af6" Apr 22 14:48:22.803854 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:22.803839 2576 scope.go:117] "RemoveContainer" containerID="981c3bebcd0c729d153a66de70a74812f74bb8119a9e4f7f563dda74e8776884" Apr 22 14:48:22.810178 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:22.810160 2576 scope.go:117] "RemoveContainer" containerID="39bd7053dca5f1ee9f3f68567802786d2c0de4ca06461b53a7dfb6cb667e0af6" Apr 22 14:48:22.810408 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:48:22.810391 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39bd7053dca5f1ee9f3f68567802786d2c0de4ca06461b53a7dfb6cb667e0af6\": container with ID starting with 39bd7053dca5f1ee9f3f68567802786d2c0de4ca06461b53a7dfb6cb667e0af6 not found: ID does not exist" containerID="39bd7053dca5f1ee9f3f68567802786d2c0de4ca06461b53a7dfb6cb667e0af6" Apr 22 14:48:22.810477 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:22.810421 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39bd7053dca5f1ee9f3f68567802786d2c0de4ca06461b53a7dfb6cb667e0af6"} err="failed to get container status \"39bd7053dca5f1ee9f3f68567802786d2c0de4ca06461b53a7dfb6cb667e0af6\": rpc error: code = NotFound desc = could not find container \"39bd7053dca5f1ee9f3f68567802786d2c0de4ca06461b53a7dfb6cb667e0af6\": container with ID starting with 39bd7053dca5f1ee9f3f68567802786d2c0de4ca06461b53a7dfb6cb667e0af6 not found: ID does not exist" Apr 22 14:48:22.810477 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:22.810445 2576 scope.go:117] "RemoveContainer" containerID="981c3bebcd0c729d153a66de70a74812f74bb8119a9e4f7f563dda74e8776884" Apr 22 14:48:22.810677 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:48:22.810661 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"981c3bebcd0c729d153a66de70a74812f74bb8119a9e4f7f563dda74e8776884\": container with ID starting with 981c3bebcd0c729d153a66de70a74812f74bb8119a9e4f7f563dda74e8776884 not found: ID does not exist" containerID="981c3bebcd0c729d153a66de70a74812f74bb8119a9e4f7f563dda74e8776884" Apr 22 14:48:22.810714 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:22.810683 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"981c3bebcd0c729d153a66de70a74812f74bb8119a9e4f7f563dda74e8776884"} err="failed to get container status \"981c3bebcd0c729d153a66de70a74812f74bb8119a9e4f7f563dda74e8776884\": rpc error: code = NotFound desc = could not find container \"981c3bebcd0c729d153a66de70a74812f74bb8119a9e4f7f563dda74e8776884\": container with ID starting with 981c3bebcd0c729d153a66de70a74812f74bb8119a9e4f7f563dda74e8776884 not found: ID does not exist" Apr 22 14:48:22.816696 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:22.816675 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz"] Apr 22 14:48:22.820434 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:22.820416 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-c2cpz"] Apr 22 14:48:23.801417 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:23.801395 2576 generic.go:358] "Generic (PLEG): container finished" podID="2859e7c7-115a-47b0-89f2-5c5447400284" containerID="6baab5c53e02780981459ba0bf5fdd84664ceca49ba2e50835ef194b2b20bf6d" exitCode=0 Apr 22 14:48:23.801710 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:23.801458 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" event={"ID":"2859e7c7-115a-47b0-89f2-5c5447400284","Type":"ContainerDied","Data":"6baab5c53e02780981459ba0bf5fdd84664ceca49ba2e50835ef194b2b20bf6d"} Apr 22 14:48:24.291995 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:24.291968 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2f1d707-e2d2-4b5f-a313-38c5700e6374" path="/var/lib/kubelet/pods/e2f1d707-e2d2-4b5f-a313-38c5700e6374/volumes" Apr 22 14:48:24.804886 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:24.804862 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" event={"ID":"2859e7c7-115a-47b0-89f2-5c5447400284","Type":"ContainerStarted","Data":"2b4ecea9509ff43fe5ff3e4b82ea15a04d7b7baf5c149aa2e440d4a2f6ad1acc"} Apr 22 14:48:24.805221 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:24.805208 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" Apr 22 14:48:24.806207 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:24.806183 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" podUID="2859e7c7-115a-47b0-89f2-5c5447400284" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 22 14:48:24.833048 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:24.833005 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" podStartSLOduration=6.83299219 podStartE2EDuration="6.83299219s" podCreationTimestamp="2026-04-22 14:48:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:48:24.832872888 +0000 UTC m=+1975.158607303" watchObservedRunningTime="2026-04-22 14:48:24.83299219 +0000 UTC m=+1975.158726605" Apr 22 14:48:25.807235 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:25.807197 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" podUID="2859e7c7-115a-47b0-89f2-5c5447400284" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 22 14:48:35.808084 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:35.808046 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" podUID="2859e7c7-115a-47b0-89f2-5c5447400284" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 22 14:48:45.808177 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:45.808138 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" podUID="2859e7c7-115a-47b0-89f2-5c5447400284" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" 
Apr 22 14:48:55.807257 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:48:55.807219 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" podUID="2859e7c7-115a-47b0-89f2-5c5447400284" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 22 14:49:05.807935 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:05.807895 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" podUID="2859e7c7-115a-47b0-89f2-5c5447400284" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 22 14:49:15.807694 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:15.807650 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" podUID="2859e7c7-115a-47b0-89f2-5c5447400284" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 22 14:49:25.807870 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:25.807801 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" podUID="2859e7c7-115a-47b0-89f2-5c5447400284" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 22 14:49:35.807997 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:35.807971 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" Apr 22 14:49:38.989832 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:38.989731 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w"] Apr 22 14:49:38.990238 
ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:38.990039 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" podUID="2859e7c7-115a-47b0-89f2-5c5447400284" containerName="kserve-container" containerID="cri-o://2b4ecea9509ff43fe5ff3e4b82ea15a04d7b7baf5c149aa2e440d4a2f6ad1acc" gracePeriod=30 Apr 22 14:49:39.029642 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:39.029614 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s"] Apr 22 14:49:39.029946 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:39.029932 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2f1d707-e2d2-4b5f-a313-38c5700e6374" containerName="storage-initializer" Apr 22 14:49:39.030031 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:39.029948 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f1d707-e2d2-4b5f-a313-38c5700e6374" containerName="storage-initializer" Apr 22 14:49:39.030031 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:39.029966 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2f1d707-e2d2-4b5f-a313-38c5700e6374" containerName="kserve-container" Apr 22 14:49:39.030031 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:39.029975 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f1d707-e2d2-4b5f-a313-38c5700e6374" containerName="kserve-container" Apr 22 14:49:39.030180 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:39.030048 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e2f1d707-e2d2-4b5f-a313-38c5700e6374" containerName="kserve-container" Apr 22 14:49:39.032962 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:39.032944 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" Apr 22 14:49:39.041905 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:39.041882 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s"] Apr 22 14:49:39.136063 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:39.136040 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57d681d3-64ab-4840-bb8c-da74e6b0fa67-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-l5n8s\" (UID: \"57d681d3-64ab-4840-bb8c-da74e6b0fa67\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" Apr 22 14:49:39.237116 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:39.237088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57d681d3-64ab-4840-bb8c-da74e6b0fa67-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-l5n8s\" (UID: \"57d681d3-64ab-4840-bb8c-da74e6b0fa67\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" Apr 22 14:49:39.237424 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:39.237406 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57d681d3-64ab-4840-bb8c-da74e6b0fa67-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-l5n8s\" (UID: \"57d681d3-64ab-4840-bb8c-da74e6b0fa67\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" Apr 22 14:49:39.342824 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:39.342741 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" Apr 22 14:49:39.455991 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:39.455948 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s"] Apr 22 14:49:39.458091 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:49:39.458063 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57d681d3_64ab_4840_bb8c_da74e6b0fa67.slice/crio-f6da7d9068a488cf9311b5a07bec7b91ee3092731644be395b4d045ba31a2f58 WatchSource:0}: Error finding container f6da7d9068a488cf9311b5a07bec7b91ee3092731644be395b4d045ba31a2f58: Status 404 returned error can't find the container with id f6da7d9068a488cf9311b5a07bec7b91ee3092731644be395b4d045ba31a2f58 Apr 22 14:49:39.459988 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:39.459969 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:49:40.001817 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:40.001772 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" event={"ID":"57d681d3-64ab-4840-bb8c-da74e6b0fa67","Type":"ContainerStarted","Data":"f9953716ada808c640e856a14f46992392df595783fcdcc87657a46ce93d0266"} Apr 22 14:49:40.001817 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:40.001824 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" event={"ID":"57d681d3-64ab-4840-bb8c-da74e6b0fa67","Type":"ContainerStarted","Data":"f6da7d9068a488cf9311b5a07bec7b91ee3092731644be395b4d045ba31a2f58"} Apr 22 14:49:42.927319 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:42.927298 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" Apr 22 14:49:42.959774 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:42.959742 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2859e7c7-115a-47b0-89f2-5c5447400284-kserve-provision-location\") pod \"2859e7c7-115a-47b0-89f2-5c5447400284\" (UID: \"2859e7c7-115a-47b0-89f2-5c5447400284\") " Apr 22 14:49:42.960051 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:42.960028 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2859e7c7-115a-47b0-89f2-5c5447400284-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2859e7c7-115a-47b0-89f2-5c5447400284" (UID: "2859e7c7-115a-47b0-89f2-5c5447400284"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:49:43.011242 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:43.011169 2576 generic.go:358] "Generic (PLEG): container finished" podID="2859e7c7-115a-47b0-89f2-5c5447400284" containerID="2b4ecea9509ff43fe5ff3e4b82ea15a04d7b7baf5c149aa2e440d4a2f6ad1acc" exitCode=0 Apr 22 14:49:43.011242 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:43.011219 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" event={"ID":"2859e7c7-115a-47b0-89f2-5c5447400284","Type":"ContainerDied","Data":"2b4ecea9509ff43fe5ff3e4b82ea15a04d7b7baf5c149aa2e440d4a2f6ad1acc"} Apr 22 14:49:43.011383 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:43.011251 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" event={"ID":"2859e7c7-115a-47b0-89f2-5c5447400284","Type":"ContainerDied","Data":"e9777da97149b202d8bbb42c8b034efb08933b939f244bf111e5b42127123dc8"} Apr 22 
14:49:43.011383 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:43.011251 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w" Apr 22 14:49:43.011383 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:43.011267 2576 scope.go:117] "RemoveContainer" containerID="2b4ecea9509ff43fe5ff3e4b82ea15a04d7b7baf5c149aa2e440d4a2f6ad1acc" Apr 22 14:49:43.018695 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:43.018460 2576 scope.go:117] "RemoveContainer" containerID="6baab5c53e02780981459ba0bf5fdd84664ceca49ba2e50835ef194b2b20bf6d" Apr 22 14:49:43.025045 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:43.025027 2576 scope.go:117] "RemoveContainer" containerID="2b4ecea9509ff43fe5ff3e4b82ea15a04d7b7baf5c149aa2e440d4a2f6ad1acc" Apr 22 14:49:43.025294 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:49:43.025276 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b4ecea9509ff43fe5ff3e4b82ea15a04d7b7baf5c149aa2e440d4a2f6ad1acc\": container with ID starting with 2b4ecea9509ff43fe5ff3e4b82ea15a04d7b7baf5c149aa2e440d4a2f6ad1acc not found: ID does not exist" containerID="2b4ecea9509ff43fe5ff3e4b82ea15a04d7b7baf5c149aa2e440d4a2f6ad1acc" Apr 22 14:49:43.025341 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:43.025302 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b4ecea9509ff43fe5ff3e4b82ea15a04d7b7baf5c149aa2e440d4a2f6ad1acc"} err="failed to get container status \"2b4ecea9509ff43fe5ff3e4b82ea15a04d7b7baf5c149aa2e440d4a2f6ad1acc\": rpc error: code = NotFound desc = could not find container \"2b4ecea9509ff43fe5ff3e4b82ea15a04d7b7baf5c149aa2e440d4a2f6ad1acc\": container with ID starting with 2b4ecea9509ff43fe5ff3e4b82ea15a04d7b7baf5c149aa2e440d4a2f6ad1acc not found: ID does not exist" Apr 22 14:49:43.025341 ip-10-0-142-195 kubenswrapper[2576]: I0422 
14:49:43.025318 2576 scope.go:117] "RemoveContainer" containerID="6baab5c53e02780981459ba0bf5fdd84664ceca49ba2e50835ef194b2b20bf6d" Apr 22 14:49:43.025511 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:49:43.025495 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6baab5c53e02780981459ba0bf5fdd84664ceca49ba2e50835ef194b2b20bf6d\": container with ID starting with 6baab5c53e02780981459ba0bf5fdd84664ceca49ba2e50835ef194b2b20bf6d not found: ID does not exist" containerID="6baab5c53e02780981459ba0bf5fdd84664ceca49ba2e50835ef194b2b20bf6d" Apr 22 14:49:43.025562 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:43.025521 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6baab5c53e02780981459ba0bf5fdd84664ceca49ba2e50835ef194b2b20bf6d"} err="failed to get container status \"6baab5c53e02780981459ba0bf5fdd84664ceca49ba2e50835ef194b2b20bf6d\": rpc error: code = NotFound desc = could not find container \"6baab5c53e02780981459ba0bf5fdd84664ceca49ba2e50835ef194b2b20bf6d\": container with ID starting with 6baab5c53e02780981459ba0bf5fdd84664ceca49ba2e50835ef194b2b20bf6d not found: ID does not exist" Apr 22 14:49:43.031966 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:43.031945 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w"] Apr 22 14:49:43.035403 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:43.035385 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wk94w"] Apr 22 14:49:43.060992 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:43.060971 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2859e7c7-115a-47b0-89f2-5c5447400284-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 
14:49:44.014671 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:44.014640 2576 generic.go:358] "Generic (PLEG): container finished" podID="57d681d3-64ab-4840-bb8c-da74e6b0fa67" containerID="f9953716ada808c640e856a14f46992392df595783fcdcc87657a46ce93d0266" exitCode=0 Apr 22 14:49:44.015105 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:44.014713 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" event={"ID":"57d681d3-64ab-4840-bb8c-da74e6b0fa67","Type":"ContainerDied","Data":"f9953716ada808c640e856a14f46992392df595783fcdcc87657a46ce93d0266"} Apr 22 14:49:44.291996 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:44.291926 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2859e7c7-115a-47b0-89f2-5c5447400284" path="/var/lib/kubelet/pods/2859e7c7-115a-47b0-89f2-5c5447400284/volumes" Apr 22 14:49:45.019700 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:45.019671 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" event={"ID":"57d681d3-64ab-4840-bb8c-da74e6b0fa67","Type":"ContainerStarted","Data":"a734c5cf67822aea1c6980d3f4a2e81f8f21f6f0016f71e818933e79c51dc60f"} Apr 22 14:49:45.020081 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:45.019954 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" Apr 22 14:49:45.020932 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:45.020908 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" podUID="57d681d3-64ab-4840-bb8c-da74e6b0fa67" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 22 14:49:45.036124 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:45.036071 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" podStartSLOduration=6.036055001 podStartE2EDuration="6.036055001s" podCreationTimestamp="2026-04-22 14:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:49:45.035145806 +0000 UTC m=+2055.360880221" watchObservedRunningTime="2026-04-22 14:49:45.036055001 +0000 UTC m=+2055.361789412" Apr 22 14:49:46.022932 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:46.022897 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" podUID="57d681d3-64ab-4840-bb8c-da74e6b0fa67" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 22 14:49:56.023428 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:49:56.023390 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" podUID="57d681d3-64ab-4840-bb8c-da74e6b0fa67" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 22 14:50:06.023661 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:06.023615 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" podUID="57d681d3-64ab-4840-bb8c-da74e6b0fa67" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 22 14:50:16.023079 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:16.023032 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" podUID="57d681d3-64ab-4840-bb8c-da74e6b0fa67" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 22 14:50:26.023335 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:26.023293 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" podUID="57d681d3-64ab-4840-bb8c-da74e6b0fa67" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 22 14:50:31.024921 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:31.024887 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 14:50:31.026436 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:31.026413 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 14:50:36.023309 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:36.023266 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" podUID="57d681d3-64ab-4840-bb8c-da74e6b0fa67" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 22 14:50:46.023928 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:46.023881 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" podUID="57d681d3-64ab-4840-bb8c-da74e6b0fa67" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 22 14:50:56.024032 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:56.023993 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" Apr 22 14:50:59.170464 ip-10-0-142-195 kubenswrapper[2576]: 
I0422 14:50:59.170431 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s"] Apr 22 14:50:59.171272 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:59.171218 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" podUID="57d681d3-64ab-4840-bb8c-da74e6b0fa67" containerName="kserve-container" containerID="cri-o://a734c5cf67822aea1c6980d3f4a2e81f8f21f6f0016f71e818933e79c51dc60f" gracePeriod=30 Apr 22 14:50:59.245461 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:59.245433 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb"] Apr 22 14:50:59.245733 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:59.245718 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2859e7c7-115a-47b0-89f2-5c5447400284" containerName="storage-initializer" Apr 22 14:50:59.245803 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:59.245735 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2859e7c7-115a-47b0-89f2-5c5447400284" containerName="storage-initializer" Apr 22 14:50:59.245803 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:59.245757 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2859e7c7-115a-47b0-89f2-5c5447400284" containerName="kserve-container" Apr 22 14:50:59.245803 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:59.245765 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2859e7c7-115a-47b0-89f2-5c5447400284" containerName="kserve-container" Apr 22 14:50:59.245974 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:59.245849 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2859e7c7-115a-47b0-89f2-5c5447400284" containerName="kserve-container" Apr 22 14:50:59.248648 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:59.248630 
2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" Apr 22 14:50:59.258143 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:59.258124 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb"] Apr 22 14:50:59.387412 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:59.387388 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36474843-5617-4689-a815-9232394e3a21-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb\" (UID: \"36474843-5617-4689-a815-9232394e3a21\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" Apr 22 14:50:59.487998 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:59.487972 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36474843-5617-4689-a815-9232394e3a21-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb\" (UID: \"36474843-5617-4689-a815-9232394e3a21\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" Apr 22 14:50:59.488282 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:59.488264 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36474843-5617-4689-a815-9232394e3a21-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb\" (UID: \"36474843-5617-4689-a815-9232394e3a21\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" Apr 22 14:50:59.558540 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:59.558521 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" Apr 22 14:50:59.670570 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:50:59.670547 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb"] Apr 22 14:50:59.672148 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:50:59.672122 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36474843_5617_4689_a815_9232394e3a21.slice/crio-d3af10a834dfde8968ecc4a1a5488cc3f23d54af4531723410c364b80f42b005 WatchSource:0}: Error finding container d3af10a834dfde8968ecc4a1a5488cc3f23d54af4531723410c364b80f42b005: Status 404 returned error can't find the container with id d3af10a834dfde8968ecc4a1a5488cc3f23d54af4531723410c364b80f42b005 Apr 22 14:51:00.234004 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:00.233974 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" event={"ID":"36474843-5617-4689-a815-9232394e3a21","Type":"ContainerStarted","Data":"d3fa797f3a64ea0b5965e39c3558334d58c0a764571c625c1f09aa9697e75726"} Apr 22 14:51:00.234004 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:00.234008 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" event={"ID":"36474843-5617-4689-a815-9232394e3a21","Type":"ContainerStarted","Data":"d3af10a834dfde8968ecc4a1a5488cc3f23d54af4531723410c364b80f42b005"} Apr 22 14:51:03.244828 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:03.244789 2576 generic.go:358] "Generic (PLEG): container finished" podID="57d681d3-64ab-4840-bb8c-da74e6b0fa67" containerID="a734c5cf67822aea1c6980d3f4a2e81f8f21f6f0016f71e818933e79c51dc60f" exitCode=0 Apr 22 14:51:03.245213 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:03.244842 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" event={"ID":"57d681d3-64ab-4840-bb8c-da74e6b0fa67","Type":"ContainerDied","Data":"a734c5cf67822aea1c6980d3f4a2e81f8f21f6f0016f71e818933e79c51dc60f"} Apr 22 14:51:03.299731 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:03.299716 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" Apr 22 14:51:03.413504 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:03.413440 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57d681d3-64ab-4840-bb8c-da74e6b0fa67-kserve-provision-location\") pod \"57d681d3-64ab-4840-bb8c-da74e6b0fa67\" (UID: \"57d681d3-64ab-4840-bb8c-da74e6b0fa67\") " Apr 22 14:51:03.413734 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:03.413714 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57d681d3-64ab-4840-bb8c-da74e6b0fa67-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "57d681d3-64ab-4840-bb8c-da74e6b0fa67" (UID: "57d681d3-64ab-4840-bb8c-da74e6b0fa67"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:51:03.514847 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:03.514827 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57d681d3-64ab-4840-bb8c-da74e6b0fa67-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:51:04.251661 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:04.251627 2576 generic.go:358] "Generic (PLEG): container finished" podID="36474843-5617-4689-a815-9232394e3a21" containerID="d3fa797f3a64ea0b5965e39c3558334d58c0a764571c625c1f09aa9697e75726" exitCode=0 Apr 22 14:51:04.252057 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:04.251702 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" event={"ID":"36474843-5617-4689-a815-9232394e3a21","Type":"ContainerDied","Data":"d3fa797f3a64ea0b5965e39c3558334d58c0a764571c625c1f09aa9697e75726"} Apr 22 14:51:04.253395 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:04.253364 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" event={"ID":"57d681d3-64ab-4840-bb8c-da74e6b0fa67","Type":"ContainerDied","Data":"f6da7d9068a488cf9311b5a07bec7b91ee3092731644be395b4d045ba31a2f58"} Apr 22 14:51:04.253490 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:04.253448 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s" Apr 22 14:51:04.253543 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:04.253454 2576 scope.go:117] "RemoveContainer" containerID="a734c5cf67822aea1c6980d3f4a2e81f8f21f6f0016f71e818933e79c51dc60f" Apr 22 14:51:04.261997 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:04.261970 2576 scope.go:117] "RemoveContainer" containerID="f9953716ada808c640e856a14f46992392df595783fcdcc87657a46ce93d0266" Apr 22 14:51:04.281105 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:04.281080 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s"] Apr 22 14:51:04.283391 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:04.283368 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-l5n8s"] Apr 22 14:51:04.291331 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:04.291310 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57d681d3-64ab-4840-bb8c-da74e6b0fa67" path="/var/lib/kubelet/pods/57d681d3-64ab-4840-bb8c-da74e6b0fa67/volumes" Apr 22 14:51:05.258484 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:05.258452 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" event={"ID":"36474843-5617-4689-a815-9232394e3a21","Type":"ContainerStarted","Data":"a480a2a29bb5b7cdeeabfb2eefcceb8ed95d176fbc5f6f540db604ffeb844d67"} Apr 22 14:51:05.258897 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:05.258677 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" Apr 22 14:51:05.277206 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:05.277148 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" podStartSLOduration=6.277135628 podStartE2EDuration="6.277135628s" podCreationTimestamp="2026-04-22 14:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:51:05.275024729 +0000 UTC m=+2135.600759142" watchObservedRunningTime="2026-04-22 14:51:05.277135628 +0000 UTC m=+2135.602870043" Apr 22 14:51:36.262900 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:36.262861 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" podUID="36474843-5617-4689-a815-9232394e3a21" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.39:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.39:8080: connect: connection refused" Apr 22 14:51:46.261525 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:46.261488 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" podUID="36474843-5617-4689-a815-9232394e3a21" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.39:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.39:8080: connect: connection refused" Apr 22 14:51:56.261593 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:51:56.261556 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" podUID="36474843-5617-4689-a815-9232394e3a21" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.39:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.39:8080: connect: connection refused" Apr 22 14:52:06.262003 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:06.261964 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" podUID="36474843-5617-4689-a815-9232394e3a21" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.39:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.39:8080: connect: connection refused" Apr 22 14:52:16.265204 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:16.265178 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" Apr 22 14:52:19.358593 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:19.358564 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb"] Apr 22 14:52:19.358959 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:19.358765 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" podUID="36474843-5617-4689-a815-9232394e3a21" containerName="kserve-container" containerID="cri-o://a480a2a29bb5b7cdeeabfb2eefcceb8ed95d176fbc5f6f540db604ffeb844d67" gracePeriod=30 Apr 22 14:52:19.448103 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:19.448071 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9"] Apr 22 14:52:19.448326 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:19.448314 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57d681d3-64ab-4840-bb8c-da74e6b0fa67" containerName="storage-initializer" Apr 22 14:52:19.448372 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:19.448327 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d681d3-64ab-4840-bb8c-da74e6b0fa67" containerName="storage-initializer" Apr 22 14:52:19.448372 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:19.448344 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="57d681d3-64ab-4840-bb8c-da74e6b0fa67" containerName="kserve-container" Apr 22 14:52:19.448372 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:19.448349 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d681d3-64ab-4840-bb8c-da74e6b0fa67" containerName="kserve-container" Apr 22 14:52:19.448459 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:19.448392 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="57d681d3-64ab-4840-bb8c-da74e6b0fa67" containerName="kserve-container" Apr 22 14:52:19.454322 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:19.454295 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" Apr 22 14:52:19.460486 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:19.460460 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9"] Apr 22 14:52:19.496863 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:19.496838 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/974a4886-82cd-4950-b2be-75310d61fee4-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9\" (UID: \"974a4886-82cd-4950-b2be-75310d61fee4\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" Apr 22 14:52:19.597334 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:19.597307 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/974a4886-82cd-4950-b2be-75310d61fee4-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9\" (UID: \"974a4886-82cd-4950-b2be-75310d61fee4\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" Apr 22 
14:52:19.597628 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:19.597612 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/974a4886-82cd-4950-b2be-75310d61fee4-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9\" (UID: \"974a4886-82cd-4950-b2be-75310d61fee4\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" Apr 22 14:52:19.764930 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:19.764908 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" Apr 22 14:52:19.880281 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:19.880260 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9"] Apr 22 14:52:19.882038 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:52:19.882012 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod974a4886_82cd_4950_b2be_75310d61fee4.slice/crio-b857c580487f235540c869997feca9bf0db2a46fa094505c6eacec3d495e2e48 WatchSource:0}: Error finding container b857c580487f235540c869997feca9bf0db2a46fa094505c6eacec3d495e2e48: Status 404 returned error can't find the container with id b857c580487f235540c869997feca9bf0db2a46fa094505c6eacec3d495e2e48 Apr 22 14:52:20.445648 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:20.445617 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" event={"ID":"974a4886-82cd-4950-b2be-75310d61fee4","Type":"ContainerStarted","Data":"c60b1a4c34c445cbbc8bccb814b6439284334b974e53e91e5d57e92ba19a151b"} Apr 22 14:52:20.445648 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:20.445646 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" event={"ID":"974a4886-82cd-4950-b2be-75310d61fee4","Type":"ContainerStarted","Data":"b857c580487f235540c869997feca9bf0db2a46fa094505c6eacec3d495e2e48"} Apr 22 14:52:23.087151 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:23.087131 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" Apr 22 14:52:23.121160 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:23.121140 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36474843-5617-4689-a815-9232394e3a21-kserve-provision-location\") pod \"36474843-5617-4689-a815-9232394e3a21\" (UID: \"36474843-5617-4689-a815-9232394e3a21\") " Apr 22 14:52:23.121435 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:23.121417 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36474843-5617-4689-a815-9232394e3a21-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "36474843-5617-4689-a815-9232394e3a21" (UID: "36474843-5617-4689-a815-9232394e3a21"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:52:23.221908 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:23.221890 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36474843-5617-4689-a815-9232394e3a21-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:52:23.455003 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:23.454978 2576 generic.go:358] "Generic (PLEG): container finished" podID="36474843-5617-4689-a815-9232394e3a21" containerID="a480a2a29bb5b7cdeeabfb2eefcceb8ed95d176fbc5f6f540db604ffeb844d67" exitCode=0 Apr 22 14:52:23.455083 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:23.455036 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" Apr 22 14:52:23.455083 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:23.455053 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" event={"ID":"36474843-5617-4689-a815-9232394e3a21","Type":"ContainerDied","Data":"a480a2a29bb5b7cdeeabfb2eefcceb8ed95d176fbc5f6f540db604ffeb844d67"} Apr 22 14:52:23.455083 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:23.455075 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb" event={"ID":"36474843-5617-4689-a815-9232394e3a21","Type":"ContainerDied","Data":"d3af10a834dfde8968ecc4a1a5488cc3f23d54af4531723410c364b80f42b005"} Apr 22 14:52:23.455197 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:23.455089 2576 scope.go:117] "RemoveContainer" containerID="a480a2a29bb5b7cdeeabfb2eefcceb8ed95d176fbc5f6f540db604ffeb844d67" Apr 22 14:52:23.462248 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:23.462219 2576 scope.go:117] "RemoveContainer" 
containerID="d3fa797f3a64ea0b5965e39c3558334d58c0a764571c625c1f09aa9697e75726"
Apr 22 14:52:23.468512 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:23.468497 2576 scope.go:117] "RemoveContainer" containerID="a480a2a29bb5b7cdeeabfb2eefcceb8ed95d176fbc5f6f540db604ffeb844d67"
Apr 22 14:52:23.468760 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:52:23.468743 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a480a2a29bb5b7cdeeabfb2eefcceb8ed95d176fbc5f6f540db604ffeb844d67\": container with ID starting with a480a2a29bb5b7cdeeabfb2eefcceb8ed95d176fbc5f6f540db604ffeb844d67 not found: ID does not exist" containerID="a480a2a29bb5b7cdeeabfb2eefcceb8ed95d176fbc5f6f540db604ffeb844d67"
Apr 22 14:52:23.468832 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:23.468767 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a480a2a29bb5b7cdeeabfb2eefcceb8ed95d176fbc5f6f540db604ffeb844d67"} err="failed to get container status \"a480a2a29bb5b7cdeeabfb2eefcceb8ed95d176fbc5f6f540db604ffeb844d67\": rpc error: code = NotFound desc = could not find container \"a480a2a29bb5b7cdeeabfb2eefcceb8ed95d176fbc5f6f540db604ffeb844d67\": container with ID starting with a480a2a29bb5b7cdeeabfb2eefcceb8ed95d176fbc5f6f540db604ffeb844d67 not found: ID does not exist"
Apr 22 14:52:23.468832 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:23.468784 2576 scope.go:117] "RemoveContainer" containerID="d3fa797f3a64ea0b5965e39c3558334d58c0a764571c625c1f09aa9697e75726"
Apr 22 14:52:23.469096 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:52:23.469079 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3fa797f3a64ea0b5965e39c3558334d58c0a764571c625c1f09aa9697e75726\": container with ID starting with d3fa797f3a64ea0b5965e39c3558334d58c0a764571c625c1f09aa9697e75726 not found: ID does not exist" containerID="d3fa797f3a64ea0b5965e39c3558334d58c0a764571c625c1f09aa9697e75726"
Apr 22 14:52:23.469146 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:23.469102 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3fa797f3a64ea0b5965e39c3558334d58c0a764571c625c1f09aa9697e75726"} err="failed to get container status \"d3fa797f3a64ea0b5965e39c3558334d58c0a764571c625c1f09aa9697e75726\": rpc error: code = NotFound desc = could not find container \"d3fa797f3a64ea0b5965e39c3558334d58c0a764571c625c1f09aa9697e75726\": container with ID starting with d3fa797f3a64ea0b5965e39c3558334d58c0a764571c625c1f09aa9697e75726 not found: ID does not exist"
Apr 22 14:52:23.475105 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:23.475062 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb"]
Apr 22 14:52:23.480939 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:23.480918 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-j4mdb"]
Apr 22 14:52:24.292165 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:24.292131 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36474843-5617-4689-a815-9232394e3a21" path="/var/lib/kubelet/pods/36474843-5617-4689-a815-9232394e3a21/volumes"
Apr 22 14:52:24.458938 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:24.458915 2576 generic.go:358] "Generic (PLEG): container finished" podID="974a4886-82cd-4950-b2be-75310d61fee4" containerID="c60b1a4c34c445cbbc8bccb814b6439284334b974e53e91e5d57e92ba19a151b" exitCode=0
Apr 22 14:52:24.459049 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:24.458995 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" event={"ID":"974a4886-82cd-4950-b2be-75310d61fee4","Type":"ContainerDied","Data":"c60b1a4c34c445cbbc8bccb814b6439284334b974e53e91e5d57e92ba19a151b"}
Apr 22 14:52:25.463187 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:25.463157 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" event={"ID":"974a4886-82cd-4950-b2be-75310d61fee4","Type":"ContainerStarted","Data":"42dd0bc499b378d3681e5d0d475262b572ebf20884a72baa6cacbf017fa6be17"}
Apr 22 14:52:25.463540 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:25.463365 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9"
Apr 22 14:52:25.480928 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:25.480888 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" podStartSLOduration=6.480878205 podStartE2EDuration="6.480878205s" podCreationTimestamp="2026-04-22 14:52:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:52:25.479932406 +0000 UTC m=+2215.805666823" watchObservedRunningTime="2026-04-22 14:52:25.480878205 +0000 UTC m=+2215.806612618"
Apr 22 14:52:56.467260 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:52:56.467183 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" podUID="974a4886-82cd-4950-b2be-75310d61fee4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.40:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 14:53:06.466191 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:06.466153 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" podUID="974a4886-82cd-4950-b2be-75310d61fee4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.40:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 14:53:16.466304 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:16.466269 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" podUID="974a4886-82cd-4950-b2be-75310d61fee4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.40:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 14:53:26.466217 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:26.466175 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" podUID="974a4886-82cd-4950-b2be-75310d61fee4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.40:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 14:53:32.292434 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:32.292400 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9"
Apr 22 14:53:39.617222 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:39.617189 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k"]
Apr 22 14:53:39.617676 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:39.617429 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36474843-5617-4689-a815-9232394e3a21" containerName="storage-initializer"
Apr 22 14:53:39.617676 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:39.617439 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="36474843-5617-4689-a815-9232394e3a21" containerName="storage-initializer"
Apr 22 14:53:39.617676 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:39.617454 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36474843-5617-4689-a815-9232394e3a21" containerName="kserve-container"
Apr 22 14:53:39.617676 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:39.617459 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="36474843-5617-4689-a815-9232394e3a21" containerName="kserve-container"
Apr 22 14:53:39.617676 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:39.617506 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="36474843-5617-4689-a815-9232394e3a21" containerName="kserve-container"
Apr 22 14:53:39.620339 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:39.620322 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k"
Apr 22 14:53:39.630850 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:39.630824 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k"]
Apr 22 14:53:39.709175 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:39.709139 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9"]
Apr 22 14:53:39.709402 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:39.709369 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" podUID="974a4886-82cd-4950-b2be-75310d61fee4" containerName="kserve-container" containerID="cri-o://42dd0bc499b378d3681e5d0d475262b572ebf20884a72baa6cacbf017fa6be17" gracePeriod=30
Apr 22 14:53:39.715707 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:39.715689 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b033790f-c58b-4df7-9ced-85366cd3d105-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k\" (UID: \"b033790f-c58b-4df7-9ced-85366cd3d105\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k"
Apr 22 14:53:39.816484 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:39.816460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b033790f-c58b-4df7-9ced-85366cd3d105-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k\" (UID: \"b033790f-c58b-4df7-9ced-85366cd3d105\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k"
Apr 22 14:53:39.816763 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:39.816748 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b033790f-c58b-4df7-9ced-85366cd3d105-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k\" (UID: \"b033790f-c58b-4df7-9ced-85366cd3d105\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k"
Apr 22 14:53:39.929427 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:39.929372 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k"
Apr 22 14:53:40.038598 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:40.038576 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k"]
Apr 22 14:53:40.041145 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:53:40.041117 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb033790f_c58b_4df7_9ced_85366cd3d105.slice/crio-d374da4f4fc41567554a20e85a69bdbeb563670515109ae5547b28baeb751cd6 WatchSource:0}: Error finding container d374da4f4fc41567554a20e85a69bdbeb563670515109ae5547b28baeb751cd6: Status 404 returned error can't find the container with id d374da4f4fc41567554a20e85a69bdbeb563670515109ae5547b28baeb751cd6
Apr 22 14:53:40.653317 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:40.653282 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k" event={"ID":"b033790f-c58b-4df7-9ced-85366cd3d105","Type":"ContainerStarted","Data":"8a260aa4af9bf18f2f4e25d9b195add855b6ce4434be40cce4e86b582343bab9"}
Apr 22 14:53:40.653317 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:40.653313 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k" event={"ID":"b033790f-c58b-4df7-9ced-85366cd3d105","Type":"ContainerStarted","Data":"d374da4f4fc41567554a20e85a69bdbeb563670515109ae5547b28baeb751cd6"}
Apr 22 14:53:42.289260 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:42.289215 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" podUID="974a4886-82cd-4950-b2be-75310d61fee4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.40:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 14:53:43.640939 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:43.640920 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9"
Apr 22 14:53:43.661609 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:43.661585 2576 generic.go:358] "Generic (PLEG): container finished" podID="974a4886-82cd-4950-b2be-75310d61fee4" containerID="42dd0bc499b378d3681e5d0d475262b572ebf20884a72baa6cacbf017fa6be17" exitCode=0
Apr 22 14:53:43.661725 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:43.661647 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9"
Apr 22 14:53:43.661725 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:43.661660 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" event={"ID":"974a4886-82cd-4950-b2be-75310d61fee4","Type":"ContainerDied","Data":"42dd0bc499b378d3681e5d0d475262b572ebf20884a72baa6cacbf017fa6be17"}
Apr 22 14:53:43.661725 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:43.661695 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9" event={"ID":"974a4886-82cd-4950-b2be-75310d61fee4","Type":"ContainerDied","Data":"b857c580487f235540c869997feca9bf0db2a46fa094505c6eacec3d495e2e48"}
Apr 22 14:53:43.661725 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:43.661723 2576 scope.go:117] "RemoveContainer" containerID="42dd0bc499b378d3681e5d0d475262b572ebf20884a72baa6cacbf017fa6be17"
Apr 22 14:53:43.669475 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:43.669460 2576 scope.go:117] "RemoveContainer" containerID="c60b1a4c34c445cbbc8bccb814b6439284334b974e53e91e5d57e92ba19a151b"
Apr 22 14:53:43.675625 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:43.675606 2576 scope.go:117] "RemoveContainer" containerID="42dd0bc499b378d3681e5d0d475262b572ebf20884a72baa6cacbf017fa6be17"
Apr 22 14:53:43.675863 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:53:43.675846 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42dd0bc499b378d3681e5d0d475262b572ebf20884a72baa6cacbf017fa6be17\": container with ID starting with 42dd0bc499b378d3681e5d0d475262b572ebf20884a72baa6cacbf017fa6be17 not found: ID does not exist" containerID="42dd0bc499b378d3681e5d0d475262b572ebf20884a72baa6cacbf017fa6be17"
Apr 22 14:53:43.675931 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:43.675872 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42dd0bc499b378d3681e5d0d475262b572ebf20884a72baa6cacbf017fa6be17"} err="failed to get container status \"42dd0bc499b378d3681e5d0d475262b572ebf20884a72baa6cacbf017fa6be17\": rpc error: code = NotFound desc = could not find container \"42dd0bc499b378d3681e5d0d475262b572ebf20884a72baa6cacbf017fa6be17\": container with ID starting with 42dd0bc499b378d3681e5d0d475262b572ebf20884a72baa6cacbf017fa6be17 not found: ID does not exist"
Apr 22 14:53:43.675931 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:43.675894 2576 scope.go:117] "RemoveContainer" containerID="c60b1a4c34c445cbbc8bccb814b6439284334b974e53e91e5d57e92ba19a151b"
Apr 22 14:53:43.676113 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:53:43.676097 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c60b1a4c34c445cbbc8bccb814b6439284334b974e53e91e5d57e92ba19a151b\": container with ID starting with c60b1a4c34c445cbbc8bccb814b6439284334b974e53e91e5d57e92ba19a151b not found: ID does not exist" containerID="c60b1a4c34c445cbbc8bccb814b6439284334b974e53e91e5d57e92ba19a151b"
Apr 22 14:53:43.676171 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:43.676134 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60b1a4c34c445cbbc8bccb814b6439284334b974e53e91e5d57e92ba19a151b"} err="failed to get container status \"c60b1a4c34c445cbbc8bccb814b6439284334b974e53e91e5d57e92ba19a151b\": rpc error: code = NotFound desc = could not find container \"c60b1a4c34c445cbbc8bccb814b6439284334b974e53e91e5d57e92ba19a151b\": container with ID starting with c60b1a4c34c445cbbc8bccb814b6439284334b974e53e91e5d57e92ba19a151b not found: ID does not exist"
Apr 22 14:53:43.744120 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:43.744101 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/974a4886-82cd-4950-b2be-75310d61fee4-kserve-provision-location\") pod \"974a4886-82cd-4950-b2be-75310d61fee4\" (UID: \"974a4886-82cd-4950-b2be-75310d61fee4\") "
Apr 22 14:53:43.744370 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:43.744352 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/974a4886-82cd-4950-b2be-75310d61fee4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "974a4886-82cd-4950-b2be-75310d61fee4" (UID: "974a4886-82cd-4950-b2be-75310d61fee4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:53:43.845132 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:43.845101 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/974a4886-82cd-4950-b2be-75310d61fee4-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\""
Apr 22 14:53:43.989758 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:43.989733 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9"]
Apr 22 14:53:43.995586 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:43.992362 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-p5dq9"]
Apr 22 14:53:44.293281 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:44.293217 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="974a4886-82cd-4950-b2be-75310d61fee4" path="/var/lib/kubelet/pods/974a4886-82cd-4950-b2be-75310d61fee4/volumes"
Apr 22 14:53:44.667418 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:44.667356 2576 generic.go:358] "Generic (PLEG): container finished" podID="b033790f-c58b-4df7-9ced-85366cd3d105" containerID="8a260aa4af9bf18f2f4e25d9b195add855b6ce4434be40cce4e86b582343bab9" exitCode=0
Apr 22 14:53:44.667708 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:44.667431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k" event={"ID":"b033790f-c58b-4df7-9ced-85366cd3d105","Type":"ContainerDied","Data":"8a260aa4af9bf18f2f4e25d9b195add855b6ce4434be40cce4e86b582343bab9"}
Apr 22 14:53:45.671745 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:45.671710 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k" event={"ID":"b033790f-c58b-4df7-9ced-85366cd3d105","Type":"ContainerStarted","Data":"df4964f381ab51308fbf7a6199f16b5b03b66f9da685bcf210fa859634f54e4f"}
Apr 22 14:53:45.672206 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:45.671926 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k"
Apr 22 14:53:45.687928 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:53:45.687858 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k" podStartSLOduration=6.687844259 podStartE2EDuration="6.687844259s" podCreationTimestamp="2026-04-22 14:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:53:45.686961306 +0000 UTC m=+2296.012695723" watchObservedRunningTime="2026-04-22 14:53:45.687844259 +0000 UTC m=+2296.013578699"
Apr 22 14:54:16.676246 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:54:16.676142 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k" podUID="b033790f-c58b-4df7-9ced-85366cd3d105" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 22 14:54:26.675576 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:54:26.675532 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k" podUID="b033790f-c58b-4df7-9ced-85366cd3d105" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 22 14:54:36.675061 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:54:36.675013 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k" podUID="b033790f-c58b-4df7-9ced-85366cd3d105" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 22 14:54:46.675531 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:54:46.675483 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k" podUID="b033790f-c58b-4df7-9ced-85366cd3d105" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 22 14:54:56.679382 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:54:56.679341 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k"
Apr 22 14:54:59.850729 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:54:59.850699 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k"]
Apr 22 14:54:59.851114 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:54:59.850957 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k" podUID="b033790f-c58b-4df7-9ced-85366cd3d105" containerName="kserve-container" containerID="cri-o://df4964f381ab51308fbf7a6199f16b5b03b66f9da685bcf210fa859634f54e4f" gracePeriod=30
Apr 22 14:55:02.005183 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:02.005146 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z"]
Apr 22 14:55:02.005618 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:02.005528 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="974a4886-82cd-4950-b2be-75310d61fee4" containerName="storage-initializer"
Apr 22 14:55:02.005618 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:02.005544 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="974a4886-82cd-4950-b2be-75310d61fee4" containerName="storage-initializer"
Apr 22 14:55:02.005618 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:02.005558 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="974a4886-82cd-4950-b2be-75310d61fee4" containerName="kserve-container"
Apr 22 14:55:02.005618 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:02.005565 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="974a4886-82cd-4950-b2be-75310d61fee4" containerName="kserve-container"
Apr 22 14:55:02.005836 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:02.005635 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="974a4886-82cd-4950-b2be-75310d61fee4" containerName="kserve-container"
Apr 22 14:55:02.008886 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:02.008867 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z"
Apr 22 14:55:02.023275 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:02.023255 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z"]
Apr 22 14:55:02.078868 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:02.078844 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4-kserve-provision-location\") pod \"isvc-sklearn-predictor-5b8ffc6f57-zlg2z\" (UID: \"c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z"
Apr 22 14:55:02.179397 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:02.179372 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4-kserve-provision-location\") pod \"isvc-sklearn-predictor-5b8ffc6f57-zlg2z\" (UID: \"c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z"
Apr 22 14:55:02.179666 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:02.179643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4-kserve-provision-location\") pod \"isvc-sklearn-predictor-5b8ffc6f57-zlg2z\" (UID: \"c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z"
Apr 22 14:55:02.318367 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:02.318308 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z"
Apr 22 14:55:02.438436 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:02.438413 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z"]
Apr 22 14:55:02.441225 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:55:02.441195 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc34a7c7d_2c75_4a0a_82d2_6fbbde6b51e4.slice/crio-9e015703f5859a6fc346abfb1fe3b95b13190c3ea02107ea5b8518de7d2ff417 WatchSource:0}: Error finding container 9e015703f5859a6fc346abfb1fe3b95b13190c3ea02107ea5b8518de7d2ff417: Status 404 returned error can't find the container with id 9e015703f5859a6fc346abfb1fe3b95b13190c3ea02107ea5b8518de7d2ff417
Apr 22 14:55:02.443072 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:02.443055 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:55:02.878318 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:02.878280 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z" event={"ID":"c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4","Type":"ContainerStarted","Data":"ff3bb9ac13aea2a426c9d724245dc4397aeb7ecc07ce556f899bb9008ccd77d5"}
Apr 22 14:55:02.878501 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:02.878327 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z" event={"ID":"c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4","Type":"ContainerStarted","Data":"9e015703f5859a6fc346abfb1fe3b95b13190c3ea02107ea5b8518de7d2ff417"}
Apr 22 14:55:03.882925 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:03.882892 2576 generic.go:358] "Generic (PLEG): container finished" podID="b033790f-c58b-4df7-9ced-85366cd3d105" containerID="df4964f381ab51308fbf7a6199f16b5b03b66f9da685bcf210fa859634f54e4f" exitCode=0
Apr 22 14:55:03.883268 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:03.882964 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k" event={"ID":"b033790f-c58b-4df7-9ced-85366cd3d105","Type":"ContainerDied","Data":"df4964f381ab51308fbf7a6199f16b5b03b66f9da685bcf210fa859634f54e4f"}
Apr 22 14:55:04.091309 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:04.091284 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k"
Apr 22 14:55:04.193754 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:04.193694 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b033790f-c58b-4df7-9ced-85366cd3d105-kserve-provision-location\") pod \"b033790f-c58b-4df7-9ced-85366cd3d105\" (UID: \"b033790f-c58b-4df7-9ced-85366cd3d105\") "
Apr 22 14:55:04.194044 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:04.194024 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b033790f-c58b-4df7-9ced-85366cd3d105-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b033790f-c58b-4df7-9ced-85366cd3d105" (UID: "b033790f-c58b-4df7-9ced-85366cd3d105"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:55:04.294690 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:04.294656 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b033790f-c58b-4df7-9ced-85366cd3d105-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\""
Apr 22 14:55:04.888279 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:04.888255 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k"
Apr 22 14:55:04.888631 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:04.888248 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k" event={"ID":"b033790f-c58b-4df7-9ced-85366cd3d105","Type":"ContainerDied","Data":"d374da4f4fc41567554a20e85a69bdbeb563670515109ae5547b28baeb751cd6"}
Apr 22 14:55:04.888631 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:04.888372 2576 scope.go:117] "RemoveContainer" containerID="df4964f381ab51308fbf7a6199f16b5b03b66f9da685bcf210fa859634f54e4f"
Apr 22 14:55:04.896416 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:04.896389 2576 scope.go:117] "RemoveContainer" containerID="8a260aa4af9bf18f2f4e25d9b195add855b6ce4434be40cce4e86b582343bab9"
Apr 22 14:55:04.913655 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:04.913630 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k"]
Apr 22 14:55:04.918133 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:04.918108 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-fx74k"]
Apr 22 14:55:06.292434 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:06.292401 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b033790f-c58b-4df7-9ced-85366cd3d105" path="/var/lib/kubelet/pods/b033790f-c58b-4df7-9ced-85366cd3d105/volumes"
Apr 22 14:55:06.896295 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:06.896262 2576 generic.go:358] "Generic (PLEG): container finished" podID="c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4" containerID="ff3bb9ac13aea2a426c9d724245dc4397aeb7ecc07ce556f899bb9008ccd77d5" exitCode=0
Apr 22 14:55:06.896410 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:06.896343 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z" event={"ID":"c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4","Type":"ContainerDied","Data":"ff3bb9ac13aea2a426c9d724245dc4397aeb7ecc07ce556f899bb9008ccd77d5"}
Apr 22 14:55:07.901281 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:07.901244 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z" event={"ID":"c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4","Type":"ContainerStarted","Data":"ab9a2c865a72f8ba2cf9af431a87abc32bbbc82ba03cb10184a3dd078bcc4e3c"}
Apr 22 14:55:07.901722 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:07.901582 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z"
Apr 22 14:55:07.903050 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:07.903020 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z" podUID="c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 22 14:55:08.904916 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:08.904879 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z" podUID="c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 22 14:55:18.905745 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:18.905699 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z" podUID="c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 22 14:55:28.905519 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:28.905471 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z" podUID="c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 22 14:55:31.044640 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:31.044612 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log"
Apr 22 14:55:31.048439 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:31.048418 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log"
Apr 22 14:55:38.905342 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:38.905250 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z" podUID="c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 22 14:55:48.905149 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:48.905102 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z" podUID="c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 22 14:55:58.905126 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:55:58.905079 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z" podUID="c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 22 14:56:08.906005 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:08.905963 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z"
Apr 22 14:56:08.925439 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:08.925383 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z" podStartSLOduration=67.925368018 podStartE2EDuration="1m7.925368018s" podCreationTimestamp="2026-04-22 14:55:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:55:07.928851247 +0000 UTC m=+2378.254585658" watchObservedRunningTime="2026-04-22 14:56:08.925368018 +0000 UTC m=+2439.251102431"
Apr 22 14:56:12.082071 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:12.079100 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z"]
Apr 22 14:56:12.082071 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:12.079486 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z" podUID="c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4" containerName="kserve-container" containerID="cri-o://ab9a2c865a72f8ba2cf9af431a87abc32bbbc82ba03cb10184a3dd078bcc4e3c" gracePeriod=30
Apr 22 14:56:12.135095 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:12.135066 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm"]
Apr 22 14:56:12.135349 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:12.135338 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b033790f-c58b-4df7-9ced-85366cd3d105" containerName="storage-initializer"
Apr 22 14:56:12.135392 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:12.135351 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b033790f-c58b-4df7-9ced-85366cd3d105" containerName="storage-initializer"
Apr 22 14:56:12.135392 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:12.135359 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b033790f-c58b-4df7-9ced-85366cd3d105" containerName="kserve-container"
Apr 22 14:56:12.135392 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:12.135365 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b033790f-c58b-4df7-9ced-85366cd3d105" containerName="kserve-container"
Apr 22 14:56:12.135482 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:12.135413 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b033790f-c58b-4df7-9ced-85366cd3d105" containerName="kserve-container"
Apr 22 14:56:12.138217 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:12.138200 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" Apr 22 14:56:12.147846 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:12.147789 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm"] Apr 22 14:56:12.245379 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:12.245336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e243a0b5-3a95-4107-91ca-b32c82d0ba35-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-k55nm\" (UID: \"e243a0b5-3a95-4107-91ca-b32c82d0ba35\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" Apr 22 14:56:12.345913 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:12.345837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e243a0b5-3a95-4107-91ca-b32c82d0ba35-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-k55nm\" (UID: \"e243a0b5-3a95-4107-91ca-b32c82d0ba35\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" Apr 22 14:56:12.346165 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:12.346148 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e243a0b5-3a95-4107-91ca-b32c82d0ba35-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-k55nm\" (UID: \"e243a0b5-3a95-4107-91ca-b32c82d0ba35\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" Apr 22 14:56:12.447648 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:12.447626 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" Apr 22 14:56:12.564097 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:12.564076 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm"] Apr 22 14:56:12.565936 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:56:12.565909 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode243a0b5_3a95_4107_91ca_b32c82d0ba35.slice/crio-e1556f1a0f81bd9e56bef0b6bef2c56eb46aee10f8460b2329739c5ff8eaf0d5 WatchSource:0}: Error finding container e1556f1a0f81bd9e56bef0b6bef2c56eb46aee10f8460b2329739c5ff8eaf0d5: Status 404 returned error can't find the container with id e1556f1a0f81bd9e56bef0b6bef2c56eb46aee10f8460b2329739c5ff8eaf0d5 Apr 22 14:56:13.090738 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:13.090702 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" event={"ID":"e243a0b5-3a95-4107-91ca-b32c82d0ba35","Type":"ContainerStarted","Data":"2538f95c4b05c187761e0ec79cea7bfc84f2d2b8c7e1bc9aded8ad9fe55de8b4"} Apr 22 14:56:13.091109 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:13.090745 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" event={"ID":"e243a0b5-3a95-4107-91ca-b32c82d0ba35","Type":"ContainerStarted","Data":"e1556f1a0f81bd9e56bef0b6bef2c56eb46aee10f8460b2329739c5ff8eaf0d5"} Apr 22 14:56:16.007099 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:16.007080 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z" Apr 22 14:56:16.101575 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:16.101512 2576 generic.go:358] "Generic (PLEG): container finished" podID="c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4" containerID="ab9a2c865a72f8ba2cf9af431a87abc32bbbc82ba03cb10184a3dd078bcc4e3c" exitCode=0 Apr 22 14:56:16.101575 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:16.101552 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z" event={"ID":"c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4","Type":"ContainerDied","Data":"ab9a2c865a72f8ba2cf9af431a87abc32bbbc82ba03cb10184a3dd078bcc4e3c"} Apr 22 14:56:16.101575 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:16.101572 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z" event={"ID":"c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4","Type":"ContainerDied","Data":"9e015703f5859a6fc346abfb1fe3b95b13190c3ea02107ea5b8518de7d2ff417"} Apr 22 14:56:16.101776 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:16.101586 2576 scope.go:117] "RemoveContainer" containerID="ab9a2c865a72f8ba2cf9af431a87abc32bbbc82ba03cb10184a3dd078bcc4e3c" Apr 22 14:56:16.101776 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:16.101584 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z" Apr 22 14:56:16.108483 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:16.108470 2576 scope.go:117] "RemoveContainer" containerID="ff3bb9ac13aea2a426c9d724245dc4397aeb7ecc07ce556f899bb9008ccd77d5" Apr 22 14:56:16.114490 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:16.114474 2576 scope.go:117] "RemoveContainer" containerID="ab9a2c865a72f8ba2cf9af431a87abc32bbbc82ba03cb10184a3dd078bcc4e3c" Apr 22 14:56:16.114728 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:56:16.114712 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab9a2c865a72f8ba2cf9af431a87abc32bbbc82ba03cb10184a3dd078bcc4e3c\": container with ID starting with ab9a2c865a72f8ba2cf9af431a87abc32bbbc82ba03cb10184a3dd078bcc4e3c not found: ID does not exist" containerID="ab9a2c865a72f8ba2cf9af431a87abc32bbbc82ba03cb10184a3dd078bcc4e3c" Apr 22 14:56:16.114770 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:16.114744 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab9a2c865a72f8ba2cf9af431a87abc32bbbc82ba03cb10184a3dd078bcc4e3c"} err="failed to get container status \"ab9a2c865a72f8ba2cf9af431a87abc32bbbc82ba03cb10184a3dd078bcc4e3c\": rpc error: code = NotFound desc = could not find container \"ab9a2c865a72f8ba2cf9af431a87abc32bbbc82ba03cb10184a3dd078bcc4e3c\": container with ID starting with ab9a2c865a72f8ba2cf9af431a87abc32bbbc82ba03cb10184a3dd078bcc4e3c not found: ID does not exist" Apr 22 14:56:16.114770 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:16.114760 2576 scope.go:117] "RemoveContainer" containerID="ff3bb9ac13aea2a426c9d724245dc4397aeb7ecc07ce556f899bb9008ccd77d5" Apr 22 14:56:16.114987 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:56:16.114973 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ff3bb9ac13aea2a426c9d724245dc4397aeb7ecc07ce556f899bb9008ccd77d5\": container with ID starting with ff3bb9ac13aea2a426c9d724245dc4397aeb7ecc07ce556f899bb9008ccd77d5 not found: ID does not exist" containerID="ff3bb9ac13aea2a426c9d724245dc4397aeb7ecc07ce556f899bb9008ccd77d5" Apr 22 14:56:16.115025 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:16.114991 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff3bb9ac13aea2a426c9d724245dc4397aeb7ecc07ce556f899bb9008ccd77d5"} err="failed to get container status \"ff3bb9ac13aea2a426c9d724245dc4397aeb7ecc07ce556f899bb9008ccd77d5\": rpc error: code = NotFound desc = could not find container \"ff3bb9ac13aea2a426c9d724245dc4397aeb7ecc07ce556f899bb9008ccd77d5\": container with ID starting with ff3bb9ac13aea2a426c9d724245dc4397aeb7ecc07ce556f899bb9008ccd77d5 not found: ID does not exist" Apr 22 14:56:16.170263 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:16.170245 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4-kserve-provision-location\") pod \"c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4\" (UID: \"c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4\") " Apr 22 14:56:16.170519 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:16.170499 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4" (UID: "c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:56:16.271026 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:16.271007 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:56:16.421873 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:16.419686 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z"] Apr 22 14:56:16.423708 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:16.423686 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-zlg2z"] Apr 22 14:56:17.105744 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:17.105716 2576 generic.go:358] "Generic (PLEG): container finished" podID="e243a0b5-3a95-4107-91ca-b32c82d0ba35" containerID="2538f95c4b05c187761e0ec79cea7bfc84f2d2b8c7e1bc9aded8ad9fe55de8b4" exitCode=0 Apr 22 14:56:17.106135 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:17.105793 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" event={"ID":"e243a0b5-3a95-4107-91ca-b32c82d0ba35","Type":"ContainerDied","Data":"2538f95c4b05c187761e0ec79cea7bfc84f2d2b8c7e1bc9aded8ad9fe55de8b4"} Apr 22 14:56:18.111336 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:18.111297 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" event={"ID":"e243a0b5-3a95-4107-91ca-b32c82d0ba35","Type":"ContainerStarted","Data":"a734e26bd25352cd684676e213268d367e8c1fc6acf0461a9e7c6fa446e8e119"} Apr 22 14:56:18.111721 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:18.111534 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" Apr 22 14:56:18.130643 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:18.130597 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" podStartSLOduration=6.130583013 podStartE2EDuration="6.130583013s" podCreationTimestamp="2026-04-22 14:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:56:18.129575189 +0000 UTC m=+2448.455309603" watchObservedRunningTime="2026-04-22 14:56:18.130583013 +0000 UTC m=+2448.456317426" Apr 22 14:56:18.292648 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:18.292621 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4" path="/var/lib/kubelet/pods/c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4/volumes" Apr 22 14:56:49.133570 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:49.133517 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" podUID="e243a0b5-3a95-4107-91ca-b32c82d0ba35" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 22 14:56:59.116582 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:56:59.116544 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" Apr 22 14:57:02.245354 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:02.245319 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm"] Apr 22 14:57:02.245777 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:02.245626 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" 
podUID="e243a0b5-3a95-4107-91ca-b32c82d0ba35" containerName="kserve-container" containerID="cri-o://a734e26bd25352cd684676e213268d367e8c1fc6acf0461a9e7c6fa446e8e119" gracePeriod=30 Apr 22 14:57:02.316440 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:02.316412 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw"] Apr 22 14:57:02.316655 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:02.316645 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4" containerName="storage-initializer" Apr 22 14:57:02.316705 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:02.316657 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4" containerName="storage-initializer" Apr 22 14:57:02.316705 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:02.316670 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4" containerName="kserve-container" Apr 22 14:57:02.316705 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:02.316675 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4" containerName="kserve-container" Apr 22 14:57:02.316829 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:02.316723 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c34a7c7d-2c75-4a0a-82d2-6fbbde6b51e4" containerName="kserve-container" Apr 22 14:57:02.319593 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:02.319579 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" Apr 22 14:57:02.327047 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:02.327025 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw"] Apr 22 14:57:02.382086 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:02.382060 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6251bfc-5404-40e7-94c8-13a53db22ad5-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-59c87754d7-885gw\" (UID: \"c6251bfc-5404-40e7-94c8-13a53db22ad5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" Apr 22 14:57:02.483258 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:02.483223 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6251bfc-5404-40e7-94c8-13a53db22ad5-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-59c87754d7-885gw\" (UID: \"c6251bfc-5404-40e7-94c8-13a53db22ad5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" Apr 22 14:57:02.483542 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:02.483525 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6251bfc-5404-40e7-94c8-13a53db22ad5-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-59c87754d7-885gw\" (UID: \"c6251bfc-5404-40e7-94c8-13a53db22ad5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" Apr 22 14:57:02.629739 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:02.629682 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" Apr 22 14:57:02.740686 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:02.740662 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw"] Apr 22 14:57:02.742617 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:57:02.742589 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6251bfc_5404_40e7_94c8_13a53db22ad5.slice/crio-5f3fa9bd55fac03af27bca47d2d421ca4f9d990651c80c9c2247ede8c4d8f3e3 WatchSource:0}: Error finding container 5f3fa9bd55fac03af27bca47d2d421ca4f9d990651c80c9c2247ede8c4d8f3e3: Status 404 returned error can't find the container with id 5f3fa9bd55fac03af27bca47d2d421ca4f9d990651c80c9c2247ede8c4d8f3e3 Apr 22 14:57:03.234671 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:03.234639 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" event={"ID":"c6251bfc-5404-40e7-94c8-13a53db22ad5","Type":"ContainerStarted","Data":"4245aecfb883f62ba80dbfa7f7c33a8ffc01d4dcc044e1defa28d2a909414776"} Apr 22 14:57:03.234671 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:03.234678 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" event={"ID":"c6251bfc-5404-40e7-94c8-13a53db22ad5","Type":"ContainerStarted","Data":"5f3fa9bd55fac03af27bca47d2d421ca4f9d990651c80c9c2247ede8c4d8f3e3"} Apr 22 14:57:08.251082 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:08.251053 2576 generic.go:358] "Generic (PLEG): container finished" podID="e243a0b5-3a95-4107-91ca-b32c82d0ba35" containerID="a734e26bd25352cd684676e213268d367e8c1fc6acf0461a9e7c6fa446e8e119" exitCode=0 Apr 22 14:57:08.251467 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:08.251125 2576 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" event={"ID":"e243a0b5-3a95-4107-91ca-b32c82d0ba35","Type":"ContainerDied","Data":"a734e26bd25352cd684676e213268d367e8c1fc6acf0461a9e7c6fa446e8e119"} Apr 22 14:57:08.252378 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:08.252358 2576 generic.go:358] "Generic (PLEG): container finished" podID="c6251bfc-5404-40e7-94c8-13a53db22ad5" containerID="4245aecfb883f62ba80dbfa7f7c33a8ffc01d4dcc044e1defa28d2a909414776" exitCode=0 Apr 22 14:57:08.252481 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:08.252387 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" event={"ID":"c6251bfc-5404-40e7-94c8-13a53db22ad5","Type":"ContainerDied","Data":"4245aecfb883f62ba80dbfa7f7c33a8ffc01d4dcc044e1defa28d2a909414776"} Apr 22 14:57:08.295096 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:08.295080 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" Apr 22 14:57:08.327770 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:08.327740 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e243a0b5-3a95-4107-91ca-b32c82d0ba35-kserve-provision-location\") pod \"e243a0b5-3a95-4107-91ca-b32c82d0ba35\" (UID: \"e243a0b5-3a95-4107-91ca-b32c82d0ba35\") " Apr 22 14:57:08.328048 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:08.328013 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e243a0b5-3a95-4107-91ca-b32c82d0ba35-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e243a0b5-3a95-4107-91ca-b32c82d0ba35" (UID: "e243a0b5-3a95-4107-91ca-b32c82d0ba35"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:57:08.429329 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:08.429239 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e243a0b5-3a95-4107-91ca-b32c82d0ba35-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:57:09.256980 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:09.256946 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" event={"ID":"e243a0b5-3a95-4107-91ca-b32c82d0ba35","Type":"ContainerDied","Data":"e1556f1a0f81bd9e56bef0b6bef2c56eb46aee10f8460b2329739c5ff8eaf0d5"} Apr 22 14:57:09.256980 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:09.256986 2576 scope.go:117] "RemoveContainer" containerID="a734e26bd25352cd684676e213268d367e8c1fc6acf0461a9e7c6fa446e8e119" Apr 22 14:57:09.257428 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:09.256992 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm" Apr 22 14:57:09.258566 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:09.258545 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" event={"ID":"c6251bfc-5404-40e7-94c8-13a53db22ad5","Type":"ContainerStarted","Data":"9d74ddd28bd0a49123a8cb32cc2e98e085c679b1967c1a82e9f3afbe82de547a"} Apr 22 14:57:09.258841 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:09.258821 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" Apr 22 14:57:09.260022 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:09.259996 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" podUID="c6251bfc-5404-40e7-94c8-13a53db22ad5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 22 14:57:09.264167 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:09.264147 2576 scope.go:117] "RemoveContainer" containerID="2538f95c4b05c187761e0ec79cea7bfc84f2d2b8c7e1bc9aded8ad9fe55de8b4" Apr 22 14:57:09.302761 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:09.302722 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" podStartSLOduration=7.302711098 podStartE2EDuration="7.302711098s" podCreationTimestamp="2026-04-22 14:57:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:57:09.285340976 +0000 UTC m=+2499.611075401" watchObservedRunningTime="2026-04-22 14:57:09.302711098 +0000 UTC m=+2499.628445512" Apr 22 14:57:09.303130 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:09.303115 2576 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm"] Apr 22 14:57:09.308423 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:09.308402 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-k55nm"] Apr 22 14:57:10.262802 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:10.262772 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" podUID="c6251bfc-5404-40e7-94c8-13a53db22ad5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 22 14:57:10.291898 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:10.291853 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e243a0b5-3a95-4107-91ca-b32c82d0ba35" path="/var/lib/kubelet/pods/e243a0b5-3a95-4107-91ca-b32c82d0ba35/volumes" Apr 22 14:57:20.263279 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:20.263230 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" podUID="c6251bfc-5404-40e7-94c8-13a53db22ad5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 22 14:57:30.264010 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:30.263971 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" Apr 22 14:57:39.313102 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:39.313065 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-59c87754d7-885gw_c6251bfc-5404-40e7-94c8-13a53db22ad5/kserve-container/0.log" Apr 22 14:57:39.477837 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:39.477796 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw"] Apr 22 14:57:39.478092 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:39.478058 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" podUID="c6251bfc-5404-40e7-94c8-13a53db22ad5" containerName="kserve-container" containerID="cri-o://9d74ddd28bd0a49123a8cb32cc2e98e085c679b1967c1a82e9f3afbe82de547a" gracePeriod=30 Apr 22 14:57:39.547890 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:39.547864 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w"] Apr 22 14:57:39.548134 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:39.548122 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e243a0b5-3a95-4107-91ca-b32c82d0ba35" containerName="kserve-container" Apr 22 14:57:39.548178 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:39.548136 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e243a0b5-3a95-4107-91ca-b32c82d0ba35" containerName="kserve-container" Apr 22 14:57:39.548178 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:39.548145 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e243a0b5-3a95-4107-91ca-b32c82d0ba35" containerName="storage-initializer" Apr 22 14:57:39.548178 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:39.548150 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e243a0b5-3a95-4107-91ca-b32c82d0ba35" containerName="storage-initializer" Apr 22 14:57:39.548274 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:39.548197 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e243a0b5-3a95-4107-91ca-b32c82d0ba35" containerName="kserve-container" Apr 22 14:57:39.550776 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:39.550751 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" Apr 22 14:57:39.561863 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:39.561841 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w"] Apr 22 14:57:39.650785 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:39.650727 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fc1e8f9-1582-4f5e-a100-5109cc6e93ad-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w\" (UID: \"8fc1e8f9-1582-4f5e-a100-5109cc6e93ad\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" Apr 22 14:57:39.751998 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:39.751973 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fc1e8f9-1582-4f5e-a100-5109cc6e93ad-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w\" (UID: \"8fc1e8f9-1582-4f5e-a100-5109cc6e93ad\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" Apr 22 14:57:39.752296 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:39.752278 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fc1e8f9-1582-4f5e-a100-5109cc6e93ad-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w\" (UID: \"8fc1e8f9-1582-4f5e-a100-5109cc6e93ad\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" Apr 22 14:57:39.860754 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:39.860731 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" Apr 22 14:57:39.974327 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:39.974289 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w"] Apr 22 14:57:39.976984 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:57:39.976957 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fc1e8f9_1582_4f5e_a100_5109cc6e93ad.slice/crio-9fc0b985cb2ce172b0371bdb8dffadae1cbfc7f8f7fe02bb2685a881d2dc50b0 WatchSource:0}: Error finding container 9fc0b985cb2ce172b0371bdb8dffadae1cbfc7f8f7fe02bb2685a881d2dc50b0: Status 404 returned error can't find the container with id 9fc0b985cb2ce172b0371bdb8dffadae1cbfc7f8f7fe02bb2685a881d2dc50b0 Apr 22 14:57:40.263625 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:40.263579 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" podUID="c6251bfc-5404-40e7-94c8-13a53db22ad5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 22 14:57:40.352987 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:40.352955 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" event={"ID":"8fc1e8f9-1582-4f5e-a100-5109cc6e93ad","Type":"ContainerStarted","Data":"18cb498cd33eeebd444708324a0442181c63a2b2b55a657a2396c9321839386a"} Apr 22 14:57:40.352987 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:40.352989 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" event={"ID":"8fc1e8f9-1582-4f5e-a100-5109cc6e93ad","Type":"ContainerStarted","Data":"9fc0b985cb2ce172b0371bdb8dffadae1cbfc7f8f7fe02bb2685a881d2dc50b0"} Apr 22 
14:57:40.503475 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:40.503444 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" Apr 22 14:57:40.660029 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:40.659967 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6251bfc-5404-40e7-94c8-13a53db22ad5-kserve-provision-location\") pod \"c6251bfc-5404-40e7-94c8-13a53db22ad5\" (UID: \"c6251bfc-5404-40e7-94c8-13a53db22ad5\") " Apr 22 14:57:40.682525 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:40.682501 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6251bfc-5404-40e7-94c8-13a53db22ad5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c6251bfc-5404-40e7-94c8-13a53db22ad5" (UID: "c6251bfc-5404-40e7-94c8-13a53db22ad5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:57:40.761131 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:40.761109 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6251bfc-5404-40e7-94c8-13a53db22ad5-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:57:41.356800 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:41.356770 2576 generic.go:358] "Generic (PLEG): container finished" podID="c6251bfc-5404-40e7-94c8-13a53db22ad5" containerID="9d74ddd28bd0a49123a8cb32cc2e98e085c679b1967c1a82e9f3afbe82de547a" exitCode=0 Apr 22 14:57:41.357128 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:41.356867 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" Apr 22 14:57:41.357128 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:41.356862 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" event={"ID":"c6251bfc-5404-40e7-94c8-13a53db22ad5","Type":"ContainerDied","Data":"9d74ddd28bd0a49123a8cb32cc2e98e085c679b1967c1a82e9f3afbe82de547a"} Apr 22 14:57:41.357128 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:41.356966 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw" event={"ID":"c6251bfc-5404-40e7-94c8-13a53db22ad5","Type":"ContainerDied","Data":"5f3fa9bd55fac03af27bca47d2d421ca4f9d990651c80c9c2247ede8c4d8f3e3"} Apr 22 14:57:41.357128 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:41.356988 2576 scope.go:117] "RemoveContainer" containerID="9d74ddd28bd0a49123a8cb32cc2e98e085c679b1967c1a82e9f3afbe82de547a" Apr 22 14:57:41.365278 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:41.365252 2576 scope.go:117] "RemoveContainer" containerID="4245aecfb883f62ba80dbfa7f7c33a8ffc01d4dcc044e1defa28d2a909414776" Apr 22 14:57:41.372572 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:41.372554 2576 scope.go:117] "RemoveContainer" containerID="9d74ddd28bd0a49123a8cb32cc2e98e085c679b1967c1a82e9f3afbe82de547a" Apr 22 14:57:41.372878 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:57:41.372856 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d74ddd28bd0a49123a8cb32cc2e98e085c679b1967c1a82e9f3afbe82de547a\": container with ID starting with 9d74ddd28bd0a49123a8cb32cc2e98e085c679b1967c1a82e9f3afbe82de547a not found: ID does not exist" containerID="9d74ddd28bd0a49123a8cb32cc2e98e085c679b1967c1a82e9f3afbe82de547a" Apr 22 14:57:41.372939 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:41.372886 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d74ddd28bd0a49123a8cb32cc2e98e085c679b1967c1a82e9f3afbe82de547a"} err="failed to get container status \"9d74ddd28bd0a49123a8cb32cc2e98e085c679b1967c1a82e9f3afbe82de547a\": rpc error: code = NotFound desc = could not find container \"9d74ddd28bd0a49123a8cb32cc2e98e085c679b1967c1a82e9f3afbe82de547a\": container with ID starting with 9d74ddd28bd0a49123a8cb32cc2e98e085c679b1967c1a82e9f3afbe82de547a not found: ID does not exist" Apr 22 14:57:41.372939 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:41.372905 2576 scope.go:117] "RemoveContainer" containerID="4245aecfb883f62ba80dbfa7f7c33a8ffc01d4dcc044e1defa28d2a909414776" Apr 22 14:57:41.373163 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:57:41.373144 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4245aecfb883f62ba80dbfa7f7c33a8ffc01d4dcc044e1defa28d2a909414776\": container with ID starting with 4245aecfb883f62ba80dbfa7f7c33a8ffc01d4dcc044e1defa28d2a909414776 not found: ID does not exist" containerID="4245aecfb883f62ba80dbfa7f7c33a8ffc01d4dcc044e1defa28d2a909414776" Apr 22 14:57:41.373200 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:41.373171 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4245aecfb883f62ba80dbfa7f7c33a8ffc01d4dcc044e1defa28d2a909414776"} err="failed to get container status \"4245aecfb883f62ba80dbfa7f7c33a8ffc01d4dcc044e1defa28d2a909414776\": rpc error: code = NotFound desc = could not find container \"4245aecfb883f62ba80dbfa7f7c33a8ffc01d4dcc044e1defa28d2a909414776\": container with ID starting with 4245aecfb883f62ba80dbfa7f7c33a8ffc01d4dcc044e1defa28d2a909414776 not found: ID does not exist" Apr 22 14:57:41.379706 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:41.379685 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw"] Apr 22 14:57:41.383445 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:41.383420 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-885gw"] Apr 22 14:57:42.292044 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:42.292013 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6251bfc-5404-40e7-94c8-13a53db22ad5" path="/var/lib/kubelet/pods/c6251bfc-5404-40e7-94c8-13a53db22ad5/volumes" Apr 22 14:57:44.366980 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:44.366949 2576 generic.go:358] "Generic (PLEG): container finished" podID="8fc1e8f9-1582-4f5e-a100-5109cc6e93ad" containerID="18cb498cd33eeebd444708324a0442181c63a2b2b55a657a2396c9321839386a" exitCode=0 Apr 22 14:57:44.367298 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:44.367007 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" event={"ID":"8fc1e8f9-1582-4f5e-a100-5109cc6e93ad","Type":"ContainerDied","Data":"18cb498cd33eeebd444708324a0442181c63a2b2b55a657a2396c9321839386a"} Apr 22 14:57:45.371717 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:45.371684 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" event={"ID":"8fc1e8f9-1582-4f5e-a100-5109cc6e93ad","Type":"ContainerStarted","Data":"2740089c3ee1bdf211bbd35063d6aa70f0886e564c7117efb004d10098fe9368"} Apr 22 14:57:45.372140 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:45.371922 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" Apr 22 14:57:45.392745 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:57:45.392696 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" podStartSLOduration=6.392682342 podStartE2EDuration="6.392682342s" podCreationTimestamp="2026-04-22 14:57:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:57:45.39083921 +0000 UTC m=+2535.716573623" watchObservedRunningTime="2026-04-22 14:57:45.392682342 +0000 UTC m=+2535.718416756" Apr 22 14:58:16.432608 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:16.432554 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" podUID="8fc1e8f9-1582-4f5e-a100-5109cc6e93ad" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 22 14:58:26.376339 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:26.376309 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" Apr 22 14:58:29.680055 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:29.680031 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w"] Apr 22 14:58:29.680364 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:29.680230 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" podUID="8fc1e8f9-1582-4f5e-a100-5109cc6e93ad" containerName="kserve-container" containerID="cri-o://2740089c3ee1bdf211bbd35063d6aa70f0886e564c7117efb004d10098fe9368" gracePeriod=30 Apr 22 14:58:29.788636 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:29.788611 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d"] Apr 22 14:58:29.788936 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:29.788921 2576 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="c6251bfc-5404-40e7-94c8-13a53db22ad5" containerName="storage-initializer" Apr 22 14:58:29.789080 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:29.788938 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6251bfc-5404-40e7-94c8-13a53db22ad5" containerName="storage-initializer" Apr 22 14:58:29.789080 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:29.788967 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6251bfc-5404-40e7-94c8-13a53db22ad5" containerName="kserve-container" Apr 22 14:58:29.789080 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:29.788975 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6251bfc-5404-40e7-94c8-13a53db22ad5" containerName="kserve-container" Apr 22 14:58:29.789080 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:29.789046 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6251bfc-5404-40e7-94c8-13a53db22ad5" containerName="kserve-container" Apr 22 14:58:29.791046 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:29.791029 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" Apr 22 14:58:29.801981 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:29.801955 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d"] Apr 22 14:58:29.876522 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:29.876472 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-5bc96b6857-trp5d\" (UID: \"b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" Apr 22 14:58:29.977789 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:29.977769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-5bc96b6857-trp5d\" (UID: \"b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" Apr 22 14:58:29.978123 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:29.978102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-5bc96b6857-trp5d\" (UID: \"b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" Apr 22 14:58:30.100365 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:30.100341 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" Apr 22 14:58:30.210759 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:30.210737 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d"] Apr 22 14:58:30.494906 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:30.494876 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" event={"ID":"b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e","Type":"ContainerStarted","Data":"dd494bdbe532b5943aa1301bd938989a568fa11659ecc3f86f40e59d833b3621"} Apr 22 14:58:30.494906 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:30.494915 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" event={"ID":"b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e","Type":"ContainerStarted","Data":"a2c5b24b52820073278b4c3447bc77f83c457ee66c1051ec63984bca223e0387"} Apr 22 14:58:34.507967 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:34.507931 2576 generic.go:358] "Generic (PLEG): container finished" podID="b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e" containerID="dd494bdbe532b5943aa1301bd938989a568fa11659ecc3f86f40e59d833b3621" exitCode=0 Apr 22 14:58:34.508308 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:34.508001 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" event={"ID":"b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e","Type":"ContainerDied","Data":"dd494bdbe532b5943aa1301bd938989a568fa11659ecc3f86f40e59d833b3621"} Apr 22 14:58:35.512089 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:35.512057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" event={"ID":"b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e","Type":"ContainerStarted","Data":"8fe6a406319639ff64ee0873ae56a7ea69545a0bfc6d37b79a10e504a1a26414"} 
Apr 22 14:58:35.512491 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:35.512411 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" Apr 22 14:58:35.513719 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:35.513696 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" podUID="b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 22 14:58:35.533677 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:35.533632 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" podStartSLOduration=6.533616537 podStartE2EDuration="6.533616537s" podCreationTimestamp="2026-04-22 14:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:58:35.532445936 +0000 UTC m=+2585.858180349" watchObservedRunningTime="2026-04-22 14:58:35.533616537 +0000 UTC m=+2585.859350949" Apr 22 14:58:35.799304 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:35.799287 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" Apr 22 14:58:35.919118 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:35.919098 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fc1e8f9-1582-4f5e-a100-5109cc6e93ad-kserve-provision-location\") pod \"8fc1e8f9-1582-4f5e-a100-5109cc6e93ad\" (UID: \"8fc1e8f9-1582-4f5e-a100-5109cc6e93ad\") " Apr 22 14:58:35.919380 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:35.919360 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fc1e8f9-1582-4f5e-a100-5109cc6e93ad-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8fc1e8f9-1582-4f5e-a100-5109cc6e93ad" (UID: "8fc1e8f9-1582-4f5e-a100-5109cc6e93ad"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:58:36.020472 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:36.020434 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fc1e8f9-1582-4f5e-a100-5109cc6e93ad-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:58:36.515355 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:36.515327 2576 generic.go:358] "Generic (PLEG): container finished" podID="8fc1e8f9-1582-4f5e-a100-5109cc6e93ad" containerID="2740089c3ee1bdf211bbd35063d6aa70f0886e564c7117efb004d10098fe9368" exitCode=0 Apr 22 14:58:36.515715 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:36.515406 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" Apr 22 14:58:36.515715 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:36.515423 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" event={"ID":"8fc1e8f9-1582-4f5e-a100-5109cc6e93ad","Type":"ContainerDied","Data":"2740089c3ee1bdf211bbd35063d6aa70f0886e564c7117efb004d10098fe9368"} Apr 22 14:58:36.515715 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:36.515468 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w" event={"ID":"8fc1e8f9-1582-4f5e-a100-5109cc6e93ad","Type":"ContainerDied","Data":"9fc0b985cb2ce172b0371bdb8dffadae1cbfc7f8f7fe02bb2685a881d2dc50b0"} Apr 22 14:58:36.515715 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:36.515489 2576 scope.go:117] "RemoveContainer" containerID="2740089c3ee1bdf211bbd35063d6aa70f0886e564c7117efb004d10098fe9368" Apr 22 14:58:36.516047 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:36.516021 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" podUID="b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 22 14:58:36.522820 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:36.522789 2576 scope.go:117] "RemoveContainer" containerID="18cb498cd33eeebd444708324a0442181c63a2b2b55a657a2396c9321839386a" Apr 22 14:58:36.529385 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:36.529368 2576 scope.go:117] "RemoveContainer" containerID="2740089c3ee1bdf211bbd35063d6aa70f0886e564c7117efb004d10098fe9368" Apr 22 14:58:36.529620 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:58:36.529600 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2740089c3ee1bdf211bbd35063d6aa70f0886e564c7117efb004d10098fe9368\": container with ID starting with 2740089c3ee1bdf211bbd35063d6aa70f0886e564c7117efb004d10098fe9368 not found: ID does not exist" containerID="2740089c3ee1bdf211bbd35063d6aa70f0886e564c7117efb004d10098fe9368" Apr 22 14:58:36.529675 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:36.529627 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2740089c3ee1bdf211bbd35063d6aa70f0886e564c7117efb004d10098fe9368"} err="failed to get container status \"2740089c3ee1bdf211bbd35063d6aa70f0886e564c7117efb004d10098fe9368\": rpc error: code = NotFound desc = could not find container \"2740089c3ee1bdf211bbd35063d6aa70f0886e564c7117efb004d10098fe9368\": container with ID starting with 2740089c3ee1bdf211bbd35063d6aa70f0886e564c7117efb004d10098fe9368 not found: ID does not exist" Apr 22 14:58:36.529675 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:36.529644 2576 scope.go:117] "RemoveContainer" containerID="18cb498cd33eeebd444708324a0442181c63a2b2b55a657a2396c9321839386a" Apr 22 14:58:36.529886 ip-10-0-142-195 kubenswrapper[2576]: E0422 14:58:36.529869 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18cb498cd33eeebd444708324a0442181c63a2b2b55a657a2396c9321839386a\": container with ID starting with 18cb498cd33eeebd444708324a0442181c63a2b2b55a657a2396c9321839386a not found: ID does not exist" containerID="18cb498cd33eeebd444708324a0442181c63a2b2b55a657a2396c9321839386a" Apr 22 14:58:36.529938 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:36.529891 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18cb498cd33eeebd444708324a0442181c63a2b2b55a657a2396c9321839386a"} err="failed to get container status \"18cb498cd33eeebd444708324a0442181c63a2b2b55a657a2396c9321839386a\": rpc error: code = NotFound desc = could not find container 
\"18cb498cd33eeebd444708324a0442181c63a2b2b55a657a2396c9321839386a\": container with ID starting with 18cb498cd33eeebd444708324a0442181c63a2b2b55a657a2396c9321839386a not found: ID does not exist" Apr 22 14:58:36.534196 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:36.534175 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w"] Apr 22 14:58:36.538546 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:36.538526 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-ksd5w"] Apr 22 14:58:38.291395 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:38.291362 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fc1e8f9-1582-4f5e-a100-5109cc6e93ad" path="/var/lib/kubelet/pods/8fc1e8f9-1582-4f5e-a100-5109cc6e93ad/volumes" Apr 22 14:58:46.516259 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:46.516221 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" podUID="b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 22 14:58:56.516803 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:58:56.516768 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" podUID="b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 22 14:59:06.516791 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:06.516751 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" podUID="b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: 
connection refused" Apr 22 14:59:16.516687 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:16.516645 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" podUID="b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 22 14:59:26.516436 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:26.516396 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" podUID="b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 22 14:59:36.516992 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:36.516953 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" Apr 22 14:59:39.910956 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:39.910927 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d"] Apr 22 14:59:39.911460 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:39.911218 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" podUID="b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e" containerName="kserve-container" containerID="cri-o://8fe6a406319639ff64ee0873ae56a7ea69545a0bfc6d37b79a10e504a1a26414" gracePeriod=30 Apr 22 14:59:39.966069 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:39.966047 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2"] Apr 22 14:59:39.966305 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:39.966294 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="8fc1e8f9-1582-4f5e-a100-5109cc6e93ad" containerName="kserve-container" Apr 22 14:59:39.966346 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:39.966307 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc1e8f9-1582-4f5e-a100-5109cc6e93ad" containerName="kserve-container" Apr 22 14:59:39.966346 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:39.966325 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8fc1e8f9-1582-4f5e-a100-5109cc6e93ad" containerName="storage-initializer" Apr 22 14:59:39.966346 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:39.966332 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc1e8f9-1582-4f5e-a100-5109cc6e93ad" containerName="storage-initializer" Apr 22 14:59:39.966472 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:39.966372 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8fc1e8f9-1582-4f5e-a100-5109cc6e93ad" containerName="kserve-container" Apr 22 14:59:39.969390 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:39.969365 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" Apr 22 14:59:39.978327 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:39.978306 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2"] Apr 22 14:59:40.048339 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:40.048307 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e843008-7dc2-415b-b3ce-56dcaf9f8cba-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2\" (UID: \"5e843008-7dc2-415b-b3ce-56dcaf9f8cba\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" Apr 22 14:59:40.149538 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:40.149509 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e843008-7dc2-415b-b3ce-56dcaf9f8cba-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2\" (UID: \"5e843008-7dc2-415b-b3ce-56dcaf9f8cba\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" Apr 22 14:59:40.149827 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:40.149794 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e843008-7dc2-415b-b3ce-56dcaf9f8cba-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2\" (UID: \"5e843008-7dc2-415b-b3ce-56dcaf9f8cba\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" Apr 22 14:59:40.279410 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:40.279393 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" Apr 22 14:59:40.390347 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:40.390326 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2"] Apr 22 14:59:40.392603 ip-10-0-142-195 kubenswrapper[2576]: W0422 14:59:40.392578 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e843008_7dc2_415b_b3ce_56dcaf9f8cba.slice/crio-426277f18ba17a91765245f3e9d56ff42b26b6d3a400e092773f761eaf157bd5 WatchSource:0}: Error finding container 426277f18ba17a91765245f3e9d56ff42b26b6d3a400e092773f761eaf157bd5: Status 404 returned error can't find the container with id 426277f18ba17a91765245f3e9d56ff42b26b6d3a400e092773f761eaf157bd5 Apr 22 14:59:40.695132 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:40.695053 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" event={"ID":"5e843008-7dc2-415b-b3ce-56dcaf9f8cba","Type":"ContainerStarted","Data":"b9e084eee794b1074b8ea1e58c9c81a217fb12a139835af62182a9a450ffa117"} Apr 22 14:59:40.695132 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:40.695085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" event={"ID":"5e843008-7dc2-415b-b3ce-56dcaf9f8cba","Type":"ContainerStarted","Data":"426277f18ba17a91765245f3e9d56ff42b26b6d3a400e092773f761eaf157bd5"} Apr 22 14:59:43.705042 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:43.705015 2576 generic.go:358] "Generic (PLEG): container finished" podID="b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e" containerID="8fe6a406319639ff64ee0873ae56a7ea69545a0bfc6d37b79a10e504a1a26414" exitCode=0 Apr 22 14:59:43.705365 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:43.705090 2576 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" event={"ID":"b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e","Type":"ContainerDied","Data":"8fe6a406319639ff64ee0873ae56a7ea69545a0bfc6d37b79a10e504a1a26414"} Apr 22 14:59:43.748518 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:43.748502 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" Apr 22 14:59:43.872745 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:43.872687 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e-kserve-provision-location\") pod \"b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e\" (UID: \"b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e\") " Apr 22 14:59:43.873057 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:43.873032 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e" (UID: "b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:59:43.973471 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:43.973448 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 14:59:44.708994 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:44.708926 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" event={"ID":"b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e","Type":"ContainerDied","Data":"a2c5b24b52820073278b4c3447bc77f83c457ee66c1051ec63984bca223e0387"} Apr 22 14:59:44.708994 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:44.708950 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d" Apr 22 14:59:44.708994 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:44.708976 2576 scope.go:117] "RemoveContainer" containerID="8fe6a406319639ff64ee0873ae56a7ea69545a0bfc6d37b79a10e504a1a26414" Apr 22 14:59:44.710632 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:44.710606 2576 generic.go:358] "Generic (PLEG): container finished" podID="5e843008-7dc2-415b-b3ce-56dcaf9f8cba" containerID="b9e084eee794b1074b8ea1e58c9c81a217fb12a139835af62182a9a450ffa117" exitCode=0 Apr 22 14:59:44.710752 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:44.710662 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" event={"ID":"5e843008-7dc2-415b-b3ce-56dcaf9f8cba","Type":"ContainerDied","Data":"b9e084eee794b1074b8ea1e58c9c81a217fb12a139835af62182a9a450ffa117"} Apr 22 14:59:44.717363 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:44.717326 2576 scope.go:117] "RemoveContainer" 
containerID="dd494bdbe532b5943aa1301bd938989a568fa11659ecc3f86f40e59d833b3621" Apr 22 14:59:44.747012 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:44.746987 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d"] Apr 22 14:59:44.752732 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:44.752711 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-trp5d"] Apr 22 14:59:45.718650 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:45.718608 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" event={"ID":"5e843008-7dc2-415b-b3ce-56dcaf9f8cba","Type":"ContainerStarted","Data":"ed98f2ee35118315a5d23d269015cb7c875e4f279394b8e9b0a668ca01658c55"} Apr 22 14:59:45.719059 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:45.718927 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" Apr 22 14:59:45.720321 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:45.720286 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" podUID="5e843008-7dc2-415b-b3ce-56dcaf9f8cba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 22 14:59:45.737210 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:45.737168 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" podStartSLOduration=6.73715596 podStartE2EDuration="6.73715596s" podCreationTimestamp="2026-04-22 14:59:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:59:45.73583474 +0000 UTC m=+2656.061569150" 
watchObservedRunningTime="2026-04-22 14:59:45.73715596 +0000 UTC m=+2656.062890373" Apr 22 14:59:46.292493 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:46.292468 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e" path="/var/lib/kubelet/pods/b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e/volumes" Apr 22 14:59:46.722380 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:46.722345 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" podUID="5e843008-7dc2-415b-b3ce-56dcaf9f8cba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 22 14:59:56.722269 ip-10-0-142-195 kubenswrapper[2576]: I0422 14:59:56.722229 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" podUID="5e843008-7dc2-415b-b3ce-56dcaf9f8cba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 22 15:00:06.723126 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:06.723076 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" podUID="5e843008-7dc2-415b-b3ce-56dcaf9f8cba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 22 15:00:16.722584 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:16.722498 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" podUID="5e843008-7dc2-415b-b3ce-56dcaf9f8cba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 22 15:00:26.723229 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:26.723179 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" podUID="5e843008-7dc2-415b-b3ce-56dcaf9f8cba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 22 15:00:31.062562 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:31.062528 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 15:00:31.067064 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:31.067045 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 15:00:36.722325 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:36.722273 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" podUID="5e843008-7dc2-415b-b3ce-56dcaf9f8cba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 22 15:00:46.724031 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:46.723990 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" Apr 22 15:00:50.249297 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.249266 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr"] Apr 22 15:00:50.249917 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.249636 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e" containerName="kserve-container" Apr 22 15:00:50.249917 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.249653 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e" containerName="kserve-container" 
Apr 22 15:00:50.249917 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.249670 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e" containerName="storage-initializer" Apr 22 15:00:50.249917 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.249679 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e" containerName="storage-initializer" Apr 22 15:00:50.249917 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.249757 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b548c27a-a4a1-4ae2-9a98-2bf2cff1f18e" containerName="kserve-container" Apr 22 15:00:50.253961 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.253944 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" Apr 22 15:00:50.266309 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.266288 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr"] Apr 22 15:00:50.311416 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.311394 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2"] Apr 22 15:00:50.311640 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.311620 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" podUID="5e843008-7dc2-415b-b3ce-56dcaf9f8cba" containerName="kserve-container" containerID="cri-o://ed98f2ee35118315a5d23d269015cb7c875e4f279394b8e9b0a668ca01658c55" gracePeriod=30 Apr 22 15:00:50.411566 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.411542 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/51b7f220-1d0d-41c4-bd3c-4184101fe659-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-7vqfr\" (UID: \"51b7f220-1d0d-41c4-bd3c-4184101fe659\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" Apr 22 15:00:50.512534 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.512477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51b7f220-1d0d-41c4-bd3c-4184101fe659-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-7vqfr\" (UID: \"51b7f220-1d0d-41c4-bd3c-4184101fe659\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" Apr 22 15:00:50.512787 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.512772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51b7f220-1d0d-41c4-bd3c-4184101fe659-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-7vqfr\" (UID: \"51b7f220-1d0d-41c4-bd3c-4184101fe659\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" Apr 22 15:00:50.562753 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.562730 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" Apr 22 15:00:50.674136 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.674076 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr"] Apr 22 15:00:50.676227 ip-10-0-142-195 kubenswrapper[2576]: W0422 15:00:50.676198 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51b7f220_1d0d_41c4_bd3c_4184101fe659.slice/crio-7a4be2c664c494cb9f589e16d02ce7778a26e1ff21af8ec5287a6b044891be03 WatchSource:0}: Error finding container 7a4be2c664c494cb9f589e16d02ce7778a26e1ff21af8ec5287a6b044891be03: Status 404 returned error can't find the container with id 7a4be2c664c494cb9f589e16d02ce7778a26e1ff21af8ec5287a6b044891be03 Apr 22 15:00:50.678065 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.678051 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:00:50.889899 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.889797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" event={"ID":"51b7f220-1d0d-41c4-bd3c-4184101fe659","Type":"ContainerStarted","Data":"a2c4558bbb228334721aa63c82d8788741e2e168f0822fb9f0a578bbfefbafb7"} Apr 22 15:00:50.889899 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:50.889853 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" event={"ID":"51b7f220-1d0d-41c4-bd3c-4184101fe659","Type":"ContainerStarted","Data":"7a4be2c664c494cb9f589e16d02ce7778a26e1ff21af8ec5287a6b044891be03"} Apr 22 15:00:54.052013 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:54.051992 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" Apr 22 15:00:54.140001 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:54.139927 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e843008-7dc2-415b-b3ce-56dcaf9f8cba-kserve-provision-location\") pod \"5e843008-7dc2-415b-b3ce-56dcaf9f8cba\" (UID: \"5e843008-7dc2-415b-b3ce-56dcaf9f8cba\") " Apr 22 15:00:54.140253 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:54.140226 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e843008-7dc2-415b-b3ce-56dcaf9f8cba-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5e843008-7dc2-415b-b3ce-56dcaf9f8cba" (UID: "5e843008-7dc2-415b-b3ce-56dcaf9f8cba"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:00:54.240939 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:54.240914 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e843008-7dc2-415b-b3ce-56dcaf9f8cba-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 15:00:54.902281 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:54.902250 2576 generic.go:358] "Generic (PLEG): container finished" podID="5e843008-7dc2-415b-b3ce-56dcaf9f8cba" containerID="ed98f2ee35118315a5d23d269015cb7c875e4f279394b8e9b0a668ca01658c55" exitCode=0 Apr 22 15:00:54.902410 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:54.902298 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" event={"ID":"5e843008-7dc2-415b-b3ce-56dcaf9f8cba","Type":"ContainerDied","Data":"ed98f2ee35118315a5d23d269015cb7c875e4f279394b8e9b0a668ca01658c55"} Apr 22 15:00:54.902410 ip-10-0-142-195 
kubenswrapper[2576]: I0422 15:00:54.902330 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" Apr 22 15:00:54.902410 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:54.902341 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2" event={"ID":"5e843008-7dc2-415b-b3ce-56dcaf9f8cba","Type":"ContainerDied","Data":"426277f18ba17a91765245f3e9d56ff42b26b6d3a400e092773f761eaf157bd5"} Apr 22 15:00:54.902410 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:54.902362 2576 scope.go:117] "RemoveContainer" containerID="ed98f2ee35118315a5d23d269015cb7c875e4f279394b8e9b0a668ca01658c55" Apr 22 15:00:54.909622 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:54.909597 2576 scope.go:117] "RemoveContainer" containerID="b9e084eee794b1074b8ea1e58c9c81a217fb12a139835af62182a9a450ffa117" Apr 22 15:00:54.915572 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:54.915558 2576 scope.go:117] "RemoveContainer" containerID="ed98f2ee35118315a5d23d269015cb7c875e4f279394b8e9b0a668ca01658c55" Apr 22 15:00:54.915792 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:00:54.915775 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed98f2ee35118315a5d23d269015cb7c875e4f279394b8e9b0a668ca01658c55\": container with ID starting with ed98f2ee35118315a5d23d269015cb7c875e4f279394b8e9b0a668ca01658c55 not found: ID does not exist" containerID="ed98f2ee35118315a5d23d269015cb7c875e4f279394b8e9b0a668ca01658c55" Apr 22 15:00:54.915885 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:54.915802 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed98f2ee35118315a5d23d269015cb7c875e4f279394b8e9b0a668ca01658c55"} err="failed to get container status \"ed98f2ee35118315a5d23d269015cb7c875e4f279394b8e9b0a668ca01658c55\": rpc 
error: code = NotFound desc = could not find container \"ed98f2ee35118315a5d23d269015cb7c875e4f279394b8e9b0a668ca01658c55\": container with ID starting with ed98f2ee35118315a5d23d269015cb7c875e4f279394b8e9b0a668ca01658c55 not found: ID does not exist" Apr 22 15:00:54.915885 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:54.915837 2576 scope.go:117] "RemoveContainer" containerID="b9e084eee794b1074b8ea1e58c9c81a217fb12a139835af62182a9a450ffa117" Apr 22 15:00:54.916093 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:00:54.916077 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9e084eee794b1074b8ea1e58c9c81a217fb12a139835af62182a9a450ffa117\": container with ID starting with b9e084eee794b1074b8ea1e58c9c81a217fb12a139835af62182a9a450ffa117 not found: ID does not exist" containerID="b9e084eee794b1074b8ea1e58c9c81a217fb12a139835af62182a9a450ffa117" Apr 22 15:00:54.916137 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:54.916100 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9e084eee794b1074b8ea1e58c9c81a217fb12a139835af62182a9a450ffa117"} err="failed to get container status \"b9e084eee794b1074b8ea1e58c9c81a217fb12a139835af62182a9a450ffa117\": rpc error: code = NotFound desc = could not find container \"b9e084eee794b1074b8ea1e58c9c81a217fb12a139835af62182a9a450ffa117\": container with ID starting with b9e084eee794b1074b8ea1e58c9c81a217fb12a139835af62182a9a450ffa117 not found: ID does not exist" Apr 22 15:00:54.919266 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:54.919249 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2"] Apr 22 15:00:54.922989 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:54.922972 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-bzsp2"] Apr 22 15:00:56.291247 
ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:56.291206 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e843008-7dc2-415b-b3ce-56dcaf9f8cba" path="/var/lib/kubelet/pods/5e843008-7dc2-415b-b3ce-56dcaf9f8cba/volumes" Apr 22 15:00:59.917175 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:59.917142 2576 generic.go:358] "Generic (PLEG): container finished" podID="51b7f220-1d0d-41c4-bd3c-4184101fe659" containerID="a2c4558bbb228334721aa63c82d8788741e2e168f0822fb9f0a578bbfefbafb7" exitCode=0 Apr 22 15:00:59.917587 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:00:59.917229 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" event={"ID":"51b7f220-1d0d-41c4-bd3c-4184101fe659","Type":"ContainerDied","Data":"a2c4558bbb228334721aa63c82d8788741e2e168f0822fb9f0a578bbfefbafb7"} Apr 22 15:01:03.931662 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:03.931634 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" event={"ID":"51b7f220-1d0d-41c4-bd3c-4184101fe659","Type":"ContainerStarted","Data":"c5edcc5602506764c476d5aeb0145ffccbec4747b0facc0ee9d062a97cfea77c"} Apr 22 15:01:03.932007 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:03.931911 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" Apr 22 15:01:03.933385 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:03.933361 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" podUID="51b7f220-1d0d-41c4-bd3c-4184101fe659" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 22 15:01:03.949865 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:03.949794 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" podStartSLOduration=10.031593921 podStartE2EDuration="13.949781974s" podCreationTimestamp="2026-04-22 15:00:50 +0000 UTC" firstStartedPulling="2026-04-22 15:00:59.918611532 +0000 UTC m=+2730.244345928" lastFinishedPulling="2026-04-22 15:01:03.836799584 +0000 UTC m=+2734.162533981" observedRunningTime="2026-04-22 15:01:03.948683184 +0000 UTC m=+2734.274417598" watchObservedRunningTime="2026-04-22 15:01:03.949781974 +0000 UTC m=+2734.275516405" Apr 22 15:01:04.935025 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:04.934979 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" podUID="51b7f220-1d0d-41c4-bd3c-4184101fe659" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 22 15:01:14.936460 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:14.936426 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" Apr 22 15:01:29.470250 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:29.470220 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr"] Apr 22 15:01:29.470712 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:29.470499 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" podUID="51b7f220-1d0d-41c4-bd3c-4184101fe659" containerName="kserve-container" containerID="cri-o://c5edcc5602506764c476d5aeb0145ffccbec4747b0facc0ee9d062a97cfea77c" gracePeriod=30 Apr 22 15:01:29.555693 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:29.555664 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj"] Apr 22 15:01:29.556029 ip-10-0-142-195 kubenswrapper[2576]: I0422 
15:01:29.556012 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e843008-7dc2-415b-b3ce-56dcaf9f8cba" containerName="kserve-container" Apr 22 15:01:29.556120 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:29.556031 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e843008-7dc2-415b-b3ce-56dcaf9f8cba" containerName="kserve-container" Apr 22 15:01:29.556120 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:29.556063 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e843008-7dc2-415b-b3ce-56dcaf9f8cba" containerName="storage-initializer" Apr 22 15:01:29.556120 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:29.556072 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e843008-7dc2-415b-b3ce-56dcaf9f8cba" containerName="storage-initializer" Apr 22 15:01:29.556277 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:29.556138 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e843008-7dc2-415b-b3ce-56dcaf9f8cba" containerName="kserve-container" Apr 22 15:01:29.558168 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:29.558149 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" Apr 22 15:01:29.568187 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:29.568167 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj"] Apr 22 15:01:29.584819 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:29.584783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj\" (UID: \"2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" Apr 22 15:01:29.685436 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:29.685411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj\" (UID: \"2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" Apr 22 15:01:29.685689 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:29.685672 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj\" (UID: \"2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" Apr 22 15:01:29.867489 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:29.867454 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" Apr 22 15:01:29.979202 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:29.979181 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj"] Apr 22 15:01:29.981397 ip-10-0-142-195 kubenswrapper[2576]: W0422 15:01:29.981373 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac2bc20_ecde_4c0d_a4ae_b0f2bbd6109d.slice/crio-1c8ff54e371316c4731bc9d397d8a2e5c37dc29b308715e709be384dc17fe010 WatchSource:0}: Error finding container 1c8ff54e371316c4731bc9d397d8a2e5c37dc29b308715e709be384dc17fe010: Status 404 returned error can't find the container with id 1c8ff54e371316c4731bc9d397d8a2e5c37dc29b308715e709be384dc17fe010 Apr 22 15:01:30.005418 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:30.005396 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" event={"ID":"2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d","Type":"ContainerStarted","Data":"1c8ff54e371316c4731bc9d397d8a2e5c37dc29b308715e709be384dc17fe010"} Apr 22 15:01:31.009213 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:31.009176 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" event={"ID":"2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d","Type":"ContainerStarted","Data":"7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb"} Apr 22 15:01:35.023398 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:35.023364 2576 generic.go:358] "Generic (PLEG): container finished" podID="2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d" containerID="7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb" exitCode=0 Apr 22 15:01:35.023796 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:35.023420 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" event={"ID":"2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d","Type":"ContainerDied","Data":"7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb"} Apr 22 15:01:36.027864 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:36.027827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" event={"ID":"2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d","Type":"ContainerStarted","Data":"c0fc4a976f8aac49a66bae73f4376acfe874b118efd4cc2202d368045f84ebc3"} Apr 22 15:01:36.028339 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:36.028158 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" Apr 22 15:01:36.029504 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:36.029470 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" podUID="2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 22 15:01:36.045675 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:36.045626 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" podStartSLOduration=7.045607145 podStartE2EDuration="7.045607145s" podCreationTimestamp="2026-04-22 15:01:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:01:36.043602676 +0000 UTC m=+2766.369337089" watchObservedRunningTime="2026-04-22 15:01:36.045607145 +0000 UTC m=+2766.371341559" Apr 22 15:01:37.031401 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:37.031363 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" podUID="2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 22 15:01:47.032847 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:47.032737 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" Apr 22 15:01:59.093133 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:59.093100 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj"] Apr 22 15:01:59.093501 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:01:59.093348 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" podUID="2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d" containerName="kserve-container" containerID="cri-o://c0fc4a976f8aac49a66bae73f4376acfe874b118efd4cc2202d368045f84ebc3" gracePeriod=30 Apr 22 15:01:59.489394 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:01:59.489358 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac2bc20_ecde_4c0d_a4ae_b0f2bbd6109d.slice/crio-7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac2bc20_ecde_4c0d_a4ae_b0f2bbd6109d.slice/crio-conmon-7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb.scope\": RecentStats: unable to find data in memory cache]" Apr 22 15:01:59.489530 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:01:59.489403 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac2bc20_ecde_4c0d_a4ae_b0f2bbd6109d.slice/crio-7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac2bc20_ecde_4c0d_a4ae_b0f2bbd6109d.slice/crio-conmon-7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb.scope\": RecentStats: unable to find data in memory cache]" Apr 22 15:01:59.489530 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:01:59.489421 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac2bc20_ecde_4c0d_a4ae_b0f2bbd6109d.slice/crio-conmon-7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac2bc20_ecde_4c0d_a4ae_b0f2bbd6109d.slice/crio-7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb.scope\": RecentStats: unable to find data in memory cache]" Apr 22 15:01:59.489530 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:01:59.489461 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac2bc20_ecde_4c0d_a4ae_b0f2bbd6109d.slice/crio-7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac2bc20_ecde_4c0d_a4ae_b0f2bbd6109d.slice/crio-conmon-7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb.scope\": RecentStats: unable to find data in memory cache]" Apr 22 15:02:00.096822 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:00.096780 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="51b7f220-1d0d-41c4-bd3c-4184101fe659" containerID="c5edcc5602506764c476d5aeb0145ffccbec4747b0facc0ee9d062a97cfea77c" exitCode=137 Apr 22 15:02:00.097123 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:00.096840 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" event={"ID":"51b7f220-1d0d-41c4-bd3c-4184101fe659","Type":"ContainerDied","Data":"c5edcc5602506764c476d5aeb0145ffccbec4747b0facc0ee9d062a97cfea77c"} Apr 22 15:02:00.097123 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:00.096882 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" event={"ID":"51b7f220-1d0d-41c4-bd3c-4184101fe659","Type":"ContainerDied","Data":"7a4be2c664c494cb9f589e16d02ce7778a26e1ff21af8ec5287a6b044891be03"} Apr 22 15:02:00.097123 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:00.096897 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a4be2c664c494cb9f589e16d02ce7778a26e1ff21af8ec5287a6b044891be03" Apr 22 15:02:00.103366 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:00.103352 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" Apr 22 15:02:00.180559 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:00.180539 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51b7f220-1d0d-41c4-bd3c-4184101fe659-kserve-provision-location\") pod \"51b7f220-1d0d-41c4-bd3c-4184101fe659\" (UID: \"51b7f220-1d0d-41c4-bd3c-4184101fe659\") " Apr 22 15:02:00.191131 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:00.191107 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51b7f220-1d0d-41c4-bd3c-4184101fe659-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "51b7f220-1d0d-41c4-bd3c-4184101fe659" (UID: "51b7f220-1d0d-41c4-bd3c-4184101fe659"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:02:00.281830 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:00.281767 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51b7f220-1d0d-41c4-bd3c-4184101fe659-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 15:02:01.099873 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:01.099846 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr" Apr 22 15:02:01.118940 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:01.118908 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr"] Apr 22 15:02:01.122668 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:01.122639 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-7vqfr"] Apr 22 15:02:02.291571 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:02.291541 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51b7f220-1d0d-41c4-bd3c-4184101fe659" path="/var/lib/kubelet/pods/51b7f220-1d0d-41c4-bd3c-4184101fe659/volumes" Apr 22 15:02:29.114267 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:02:29.114236 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51b7f220_1d0d_41c4_bd3c_4184101fe659.slice\": RecentStats: unable to find data in memory cache]" Apr 22 15:02:29.114579 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:02:29.114455 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51b7f220_1d0d_41c4_bd3c_4184101fe659.slice/crio-conmon-c5edcc5602506764c476d5aeb0145ffccbec4747b0facc0ee9d062a97cfea77c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51b7f220_1d0d_41c4_bd3c_4184101fe659.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51b7f220_1d0d_41c4_bd3c_4184101fe659.slice/crio-c5edcc5602506764c476d5aeb0145ffccbec4747b0facc0ee9d062a97cfea77c.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51b7f220_1d0d_41c4_bd3c_4184101fe659.slice/crio-7a4be2c664c494cb9f589e16d02ce7778a26e1ff21af8ec5287a6b044891be03\": RecentStats: unable to find data in memory cache]" Apr 22 15:02:29.735643 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:29.735623 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" Apr 22 15:02:29.884490 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:29.884434 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d-kserve-provision-location\") pod \"2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d\" (UID: \"2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d\") " Apr 22 15:02:29.892516 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:29.892488 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d" (UID: "2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:02:29.985824 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:29.985764 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 15:02:30.180048 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:30.179989 2576 generic.go:358] "Generic (PLEG): container finished" podID="2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d" containerID="c0fc4a976f8aac49a66bae73f4376acfe874b118efd4cc2202d368045f84ebc3" exitCode=137 Apr 22 15:02:30.180416 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:30.180060 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" Apr 22 15:02:30.180416 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:30.180080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" event={"ID":"2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d","Type":"ContainerDied","Data":"c0fc4a976f8aac49a66bae73f4376acfe874b118efd4cc2202d368045f84ebc3"} Apr 22 15:02:30.180416 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:30.180130 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj" event={"ID":"2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d","Type":"ContainerDied","Data":"1c8ff54e371316c4731bc9d397d8a2e5c37dc29b308715e709be384dc17fe010"} Apr 22 15:02:30.180416 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:30.180158 2576 scope.go:117] "RemoveContainer" containerID="c0fc4a976f8aac49a66bae73f4376acfe874b118efd4cc2202d368045f84ebc3" Apr 22 15:02:30.188560 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:30.188545 2576 scope.go:117] "RemoveContainer" 
containerID="7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb" Apr 22 15:02:30.195584 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:30.195567 2576 scope.go:117] "RemoveContainer" containerID="c0fc4a976f8aac49a66bae73f4376acfe874b118efd4cc2202d368045f84ebc3" Apr 22 15:02:30.195860 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:02:30.195840 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0fc4a976f8aac49a66bae73f4376acfe874b118efd4cc2202d368045f84ebc3\": container with ID starting with c0fc4a976f8aac49a66bae73f4376acfe874b118efd4cc2202d368045f84ebc3 not found: ID does not exist" containerID="c0fc4a976f8aac49a66bae73f4376acfe874b118efd4cc2202d368045f84ebc3" Apr 22 15:02:30.195961 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:30.195868 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0fc4a976f8aac49a66bae73f4376acfe874b118efd4cc2202d368045f84ebc3"} err="failed to get container status \"c0fc4a976f8aac49a66bae73f4376acfe874b118efd4cc2202d368045f84ebc3\": rpc error: code = NotFound desc = could not find container \"c0fc4a976f8aac49a66bae73f4376acfe874b118efd4cc2202d368045f84ebc3\": container with ID starting with c0fc4a976f8aac49a66bae73f4376acfe874b118efd4cc2202d368045f84ebc3 not found: ID does not exist" Apr 22 15:02:30.195961 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:30.195888 2576 scope.go:117] "RemoveContainer" containerID="7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb" Apr 22 15:02:30.196106 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:02:30.196089 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb\": container with ID starting with 7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb not found: ID does not exist" 
containerID="7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb" Apr 22 15:02:30.196142 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:30.196116 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb"} err="failed to get container status \"7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb\": rpc error: code = NotFound desc = could not find container \"7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb\": container with ID starting with 7f9c069effdcb39ac12a239447d44987085ba6384a1d4d18a346e8c59f80d3cb not found: ID does not exist" Apr 22 15:02:30.204315 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:30.204294 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj"] Apr 22 15:02:30.211676 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:30.211656 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-k5tsj"] Apr 22 15:02:30.291210 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:02:30.291180 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d" path="/var/lib/kubelet/pods/2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d/volumes" Apr 22 15:05:31.083359 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:31.083330 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 15:05:31.089357 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:31.089336 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 15:05:50.736003 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:50.735974 2576 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h"] Apr 22 15:05:50.736348 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:50.736208 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51b7f220-1d0d-41c4-bd3c-4184101fe659" containerName="kserve-container" Apr 22 15:05:50.736348 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:50.736219 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="51b7f220-1d0d-41c4-bd3c-4184101fe659" containerName="kserve-container" Apr 22 15:05:50.736348 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:50.736232 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51b7f220-1d0d-41c4-bd3c-4184101fe659" containerName="storage-initializer" Apr 22 15:05:50.736348 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:50.736238 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="51b7f220-1d0d-41c4-bd3c-4184101fe659" containerName="storage-initializer" Apr 22 15:05:50.736348 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:50.736243 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d" containerName="kserve-container" Apr 22 15:05:50.736348 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:50.736248 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d" containerName="kserve-container" Apr 22 15:05:50.736348 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:50.736255 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d" containerName="storage-initializer" Apr 22 15:05:50.736348 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:50.736260 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d" containerName="storage-initializer" Apr 22 15:05:50.736348 ip-10-0-142-195 kubenswrapper[2576]: I0422 
15:05:50.736311 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ac2bc20-ecde-4c0d-a4ae-b0f2bbd6109d" containerName="kserve-container" Apr 22 15:05:50.736348 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:50.736320 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="51b7f220-1d0d-41c4-bd3c-4184101fe659" containerName="kserve-container" Apr 22 15:05:50.738363 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:50.738346 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h" Apr 22 15:05:50.741135 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:50.741113 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-zp85x\"" Apr 22 15:05:50.750551 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:50.750532 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h"] Apr 22 15:05:50.907825 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:50.907787 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h\" (UID: \"81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h" Apr 22 15:05:51.008936 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:51.008856 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h\" (UID: \"81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h" Apr 22 15:05:51.009223 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:51.009199 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h\" (UID: \"81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h" Apr 22 15:05:51.047442 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:51.047413 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h" Apr 22 15:05:51.176050 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:51.176027 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h"] Apr 22 15:05:51.177306 ip-10-0-142-195 kubenswrapper[2576]: W0422 15:05:51.177281 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81712e5b_ea67_4bc8_bf5c_1d40a7ea76b8.slice/crio-54f735b5abb8ce6f9e50886df092d2efb22383e84481b72249dc297b952ef024 WatchSource:0}: Error finding container 54f735b5abb8ce6f9e50886df092d2efb22383e84481b72249dc297b952ef024: Status 404 returned error can't find the container with id 54f735b5abb8ce6f9e50886df092d2efb22383e84481b72249dc297b952ef024 Apr 22 15:05:51.179068 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:51.179053 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:05:51.711999 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:51.711965 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h" 
event={"ID":"81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8","Type":"ContainerStarted","Data":"9f3559907a6045292236594e32f6df2809e8de0e329dc7ddfbf8e7ad55f6ce67"} Apr 22 15:05:51.711999 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:51.712000 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h" event={"ID":"81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8","Type":"ContainerStarted","Data":"54f735b5abb8ce6f9e50886df092d2efb22383e84481b72249dc297b952ef024"} Apr 22 15:05:55.724175 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:55.724145 2576 generic.go:358] "Generic (PLEG): container finished" podID="81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8" containerID="9f3559907a6045292236594e32f6df2809e8de0e329dc7ddfbf8e7ad55f6ce67" exitCode=0 Apr 22 15:05:55.724593 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:55.724219 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h" event={"ID":"81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8","Type":"ContainerDied","Data":"9f3559907a6045292236594e32f6df2809e8de0e329dc7ddfbf8e7ad55f6ce67"} Apr 22 15:05:56.728104 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:56.728067 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h" event={"ID":"81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8","Type":"ContainerStarted","Data":"6d9fa06f304ce96d1cdbdef715e838db3479af31e85169dd3cb47b752bea746a"} Apr 22 15:05:56.728517 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:56.728275 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h" Apr 22 15:05:56.745783 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:05:56.745738 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h" 
podStartSLOduration=6.745726835 podStartE2EDuration="6.745726835s" podCreationTimestamp="2026-04-22 15:05:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:05:56.744977224 +0000 UTC m=+3027.070711638" watchObservedRunningTime="2026-04-22 15:05:56.745726835 +0000 UTC m=+3027.071461249" Apr 22 15:06:27.736294 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:27.736264 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h" Apr 22 15:06:30.884366 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:30.884338 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h"] Apr 22 15:06:30.886043 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:30.884571 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h" podUID="81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8" containerName="kserve-container" containerID="cri-o://6d9fa06f304ce96d1cdbdef715e838db3479af31e85169dd3cb47b752bea746a" gracePeriod=30 Apr 22 15:06:30.939405 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:30.939377 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7"] Apr 22 15:06:30.941406 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:30.941392 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7" Apr 22 15:06:30.955989 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:30.955968 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7"] Apr 22 15:06:30.965479 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:30.965458 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd2cc1ef-507d-4554-b975-ea9ae7c2f82a-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-rqqj7\" (UID: \"bd2cc1ef-507d-4554-b975-ea9ae7c2f82a\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7" Apr 22 15:06:31.066334 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:31.066307 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd2cc1ef-507d-4554-b975-ea9ae7c2f82a-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-rqqj7\" (UID: \"bd2cc1ef-507d-4554-b975-ea9ae7c2f82a\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7" Apr 22 15:06:31.066591 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:31.066576 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd2cc1ef-507d-4554-b975-ea9ae7c2f82a-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-rqqj7\" (UID: \"bd2cc1ef-507d-4554-b975-ea9ae7c2f82a\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7" Apr 22 15:06:31.250250 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:31.250228 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7" Apr 22 15:06:31.376424 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:31.376318 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7"] Apr 22 15:06:31.378769 ip-10-0-142-195 kubenswrapper[2576]: W0422 15:06:31.378745 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd2cc1ef_507d_4554_b975_ea9ae7c2f82a.slice/crio-ee2cb94e14deb8daeacfa87e32fe40e5a19bdb8bb21b06d876725f7604cafaf2 WatchSource:0}: Error finding container ee2cb94e14deb8daeacfa87e32fe40e5a19bdb8bb21b06d876725f7604cafaf2: Status 404 returned error can't find the container with id ee2cb94e14deb8daeacfa87e32fe40e5a19bdb8bb21b06d876725f7604cafaf2 Apr 22 15:06:31.822310 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:31.822260 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7" event={"ID":"bd2cc1ef-507d-4554-b975-ea9ae7c2f82a","Type":"ContainerStarted","Data":"6025688d9a95987e2be0f9a3859b9f9ab60ffd9ab843b8cfcae4937658be35b5"} Apr 22 15:06:31.822310 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:31.822302 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7" event={"ID":"bd2cc1ef-507d-4554-b975-ea9ae7c2f82a","Type":"ContainerStarted","Data":"ee2cb94e14deb8daeacfa87e32fe40e5a19bdb8bb21b06d876725f7604cafaf2"} Apr 22 15:06:35.836090 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:35.836050 2576 generic.go:358] "Generic (PLEG): container finished" podID="bd2cc1ef-507d-4554-b975-ea9ae7c2f82a" containerID="6025688d9a95987e2be0f9a3859b9f9ab60ffd9ab843b8cfcae4937658be35b5" exitCode=0 Apr 22 15:06:35.836619 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:35.836113 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7" event={"ID":"bd2cc1ef-507d-4554-b975-ea9ae7c2f82a","Type":"ContainerDied","Data":"6025688d9a95987e2be0f9a3859b9f9ab60ffd9ab843b8cfcae4937658be35b5"} Apr 22 15:06:36.111906 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.111882 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h" Apr 22 15:06:36.202865 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.202840 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8-kserve-provision-location\") pod \"81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8\" (UID: \"81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8\") " Apr 22 15:06:36.203143 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.203117 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8" (UID: "81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:06:36.303659 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.303640 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 15:06:36.840968 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.840933 2576 generic.go:358] "Generic (PLEG): container finished" podID="81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8" containerID="6d9fa06f304ce96d1cdbdef715e838db3479af31e85169dd3cb47b752bea746a" exitCode=0 Apr 22 15:06:36.841350 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.840981 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h" event={"ID":"81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8","Type":"ContainerDied","Data":"6d9fa06f304ce96d1cdbdef715e838db3479af31e85169dd3cb47b752bea746a"} Apr 22 15:06:36.841350 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.841004 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h"
Apr 22 15:06:36.841350 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.841025 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h" event={"ID":"81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8","Type":"ContainerDied","Data":"54f735b5abb8ce6f9e50886df092d2efb22383e84481b72249dc297b952ef024"}
Apr 22 15:06:36.841350 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.841049 2576 scope.go:117] "RemoveContainer" containerID="6d9fa06f304ce96d1cdbdef715e838db3479af31e85169dd3cb47b752bea746a"
Apr 22 15:06:36.843418 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.843389 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7" event={"ID":"bd2cc1ef-507d-4554-b975-ea9ae7c2f82a","Type":"ContainerStarted","Data":"decbbeeec1d33b14506f4dea6d24e0a4216475951fcd8a18349adcca2cb944a6"}
Apr 22 15:06:36.849408 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.849377 2576 scope.go:117] "RemoveContainer" containerID="9f3559907a6045292236594e32f6df2809e8de0e329dc7ddfbf8e7ad55f6ce67"
Apr 22 15:06:36.856261 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.856239 2576 scope.go:117] "RemoveContainer" containerID="6d9fa06f304ce96d1cdbdef715e838db3479af31e85169dd3cb47b752bea746a"
Apr 22 15:06:36.856526 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:06:36.856503 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d9fa06f304ce96d1cdbdef715e838db3479af31e85169dd3cb47b752bea746a\": container with ID starting with 6d9fa06f304ce96d1cdbdef715e838db3479af31e85169dd3cb47b752bea746a not found: ID does not exist" containerID="6d9fa06f304ce96d1cdbdef715e838db3479af31e85169dd3cb47b752bea746a"
Apr 22 15:06:36.856642 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.856535 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d9fa06f304ce96d1cdbdef715e838db3479af31e85169dd3cb47b752bea746a"} err="failed to get container status \"6d9fa06f304ce96d1cdbdef715e838db3479af31e85169dd3cb47b752bea746a\": rpc error: code = NotFound desc = could not find container \"6d9fa06f304ce96d1cdbdef715e838db3479af31e85169dd3cb47b752bea746a\": container with ID starting with 6d9fa06f304ce96d1cdbdef715e838db3479af31e85169dd3cb47b752bea746a not found: ID does not exist"
Apr 22 15:06:36.856642 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.856556 2576 scope.go:117] "RemoveContainer" containerID="9f3559907a6045292236594e32f6df2809e8de0e329dc7ddfbf8e7ad55f6ce67"
Apr 22 15:06:36.856803 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:06:36.856781 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f3559907a6045292236594e32f6df2809e8de0e329dc7ddfbf8e7ad55f6ce67\": container with ID starting with 9f3559907a6045292236594e32f6df2809e8de0e329dc7ddfbf8e7ad55f6ce67 not found: ID does not exist" containerID="9f3559907a6045292236594e32f6df2809e8de0e329dc7ddfbf8e7ad55f6ce67"
Apr 22 15:06:36.856863 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.856825 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f3559907a6045292236594e32f6df2809e8de0e329dc7ddfbf8e7ad55f6ce67"} err="failed to get container status \"9f3559907a6045292236594e32f6df2809e8de0e329dc7ddfbf8e7ad55f6ce67\": rpc error: code = NotFound desc = could not find container \"9f3559907a6045292236594e32f6df2809e8de0e329dc7ddfbf8e7ad55f6ce67\": container with ID starting with 9f3559907a6045292236594e32f6df2809e8de0e329dc7ddfbf8e7ad55f6ce67 not found: ID does not exist"
Apr 22 15:06:36.859252 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.859227 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h"]
Apr 22 15:06:36.864899 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.864875 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-rzm4h"]
Apr 22 15:06:36.882118 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:36.882079 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7" podStartSLOduration=6.882066314 podStartE2EDuration="6.882066314s" podCreationTimestamp="2026-04-22 15:06:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:06:36.881376929 +0000 UTC m=+3067.207111377" watchObservedRunningTime="2026-04-22 15:06:36.882066314 +0000 UTC m=+3067.207800728"
Apr 22 15:06:38.291292 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:38.291257 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8" path="/var/lib/kubelet/pods/81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8/volumes"
Apr 22 15:06:46.844254 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:06:46.844214 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7"
Apr 22 15:07:06.934075 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:06.934042 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7"
Apr 22 15:07:11.000101 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:11.000072 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7"]
Apr 22 15:07:11.000576 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:11.000301 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7" podUID="bd2cc1ef-507d-4554-b975-ea9ae7c2f82a" containerName="kserve-container" containerID="cri-o://decbbeeec1d33b14506f4dea6d24e0a4216475951fcd8a18349adcca2cb944a6" gracePeriod=30
Apr 22 15:07:16.137046 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:16.137017 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7"
Apr 22 15:07:16.274559 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:16.274529 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd2cc1ef-507d-4554-b975-ea9ae7c2f82a-kserve-provision-location\") pod \"bd2cc1ef-507d-4554-b975-ea9ae7c2f82a\" (UID: \"bd2cc1ef-507d-4554-b975-ea9ae7c2f82a\") "
Apr 22 15:07:16.274821 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:16.274788 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd2cc1ef-507d-4554-b975-ea9ae7c2f82a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bd2cc1ef-507d-4554-b975-ea9ae7c2f82a" (UID: "bd2cc1ef-507d-4554-b975-ea9ae7c2f82a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 15:07:16.375420 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:16.375396 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd2cc1ef-507d-4554-b975-ea9ae7c2f82a-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\""
Apr 22 15:07:16.963706 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:16.963676 2576 generic.go:358] "Generic (PLEG): container finished" podID="bd2cc1ef-507d-4554-b975-ea9ae7c2f82a" containerID="decbbeeec1d33b14506f4dea6d24e0a4216475951fcd8a18349adcca2cb944a6" exitCode=0
Apr 22 15:07:16.963857 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:16.963730 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7" event={"ID":"bd2cc1ef-507d-4554-b975-ea9ae7c2f82a","Type":"ContainerDied","Data":"decbbeeec1d33b14506f4dea6d24e0a4216475951fcd8a18349adcca2cb944a6"}
Apr 22 15:07:16.963857 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:16.963748 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7"
Apr 22 15:07:16.963857 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:16.963759 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7" event={"ID":"bd2cc1ef-507d-4554-b975-ea9ae7c2f82a","Type":"ContainerDied","Data":"ee2cb94e14deb8daeacfa87e32fe40e5a19bdb8bb21b06d876725f7604cafaf2"}
Apr 22 15:07:16.963857 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:16.963776 2576 scope.go:117] "RemoveContainer" containerID="decbbeeec1d33b14506f4dea6d24e0a4216475951fcd8a18349adcca2cb944a6"
Apr 22 15:07:16.972470 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:16.972451 2576 scope.go:117] "RemoveContainer" containerID="6025688d9a95987e2be0f9a3859b9f9ab60ffd9ab843b8cfcae4937658be35b5"
Apr 22 15:07:16.979404 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:16.979378 2576 scope.go:117] "RemoveContainer" containerID="decbbeeec1d33b14506f4dea6d24e0a4216475951fcd8a18349adcca2cb944a6"
Apr 22 15:07:16.979662 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:07:16.979643 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"decbbeeec1d33b14506f4dea6d24e0a4216475951fcd8a18349adcca2cb944a6\": container with ID starting with decbbeeec1d33b14506f4dea6d24e0a4216475951fcd8a18349adcca2cb944a6 not found: ID does not exist" containerID="decbbeeec1d33b14506f4dea6d24e0a4216475951fcd8a18349adcca2cb944a6"
Apr 22 15:07:16.979715 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:16.979669 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"decbbeeec1d33b14506f4dea6d24e0a4216475951fcd8a18349adcca2cb944a6"} err="failed to get container status \"decbbeeec1d33b14506f4dea6d24e0a4216475951fcd8a18349adcca2cb944a6\": rpc error: code = NotFound desc = could not find container \"decbbeeec1d33b14506f4dea6d24e0a4216475951fcd8a18349adcca2cb944a6\": container with ID starting with decbbeeec1d33b14506f4dea6d24e0a4216475951fcd8a18349adcca2cb944a6 not found: ID does not exist"
Apr 22 15:07:16.979715 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:16.979688 2576 scope.go:117] "RemoveContainer" containerID="6025688d9a95987e2be0f9a3859b9f9ab60ffd9ab843b8cfcae4937658be35b5"
Apr 22 15:07:16.979991 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:07:16.979970 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6025688d9a95987e2be0f9a3859b9f9ab60ffd9ab843b8cfcae4937658be35b5\": container with ID starting with 6025688d9a95987e2be0f9a3859b9f9ab60ffd9ab843b8cfcae4937658be35b5 not found: ID does not exist" containerID="6025688d9a95987e2be0f9a3859b9f9ab60ffd9ab843b8cfcae4937658be35b5"
Apr 22 15:07:16.980072 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:16.980000 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6025688d9a95987e2be0f9a3859b9f9ab60ffd9ab843b8cfcae4937658be35b5"} err="failed to get container status \"6025688d9a95987e2be0f9a3859b9f9ab60ffd9ab843b8cfcae4937658be35b5\": rpc error: code = NotFound desc = could not find container \"6025688d9a95987e2be0f9a3859b9f9ab60ffd9ab843b8cfcae4937658be35b5\": container with ID starting with 6025688d9a95987e2be0f9a3859b9f9ab60ffd9ab843b8cfcae4937658be35b5 not found: ID does not exist"
Apr 22 15:07:16.982541 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:16.982516 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7"]
Apr 22 15:07:16.986079 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:16.986056 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-rqqj7"]
Apr 22 15:07:18.292219 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:18.292182 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd2cc1ef-507d-4554-b975-ea9ae7c2f82a" path="/var/lib/kubelet/pods/bd2cc1ef-507d-4554-b975-ea9ae7c2f82a/volumes"
Apr 22 15:07:30.469269 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:30.469235 2576 scope.go:117] "RemoveContainer" containerID="c5edcc5602506764c476d5aeb0145ffccbec4747b0facc0ee9d062a97cfea77c"
Apr 22 15:07:30.476696 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:07:30.476671 2576 scope.go:117] "RemoveContainer" containerID="a2c4558bbb228334721aa63c82d8788741e2e168f0822fb9f0a578bbfefbafb7"
Apr 22 15:08:21.322666 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.322632 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk"]
Apr 22 15:08:21.323242 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.323015 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8" containerName="kserve-container"
Apr 22 15:08:21.323242 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.323032 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8" containerName="kserve-container"
Apr 22 15:08:21.323242 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.323054 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8" containerName="storage-initializer"
Apr 22 15:08:21.323242 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.323062 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8" containerName="storage-initializer"
Apr 22 15:08:21.323242 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.323073 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd2cc1ef-507d-4554-b975-ea9ae7c2f82a" containerName="kserve-container"
Apr 22 15:08:21.323242 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.323082 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2cc1ef-507d-4554-b975-ea9ae7c2f82a" containerName="kserve-container"
Apr 22 15:08:21.323242 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.323098 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd2cc1ef-507d-4554-b975-ea9ae7c2f82a" containerName="storage-initializer"
Apr 22 15:08:21.323242 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.323106 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2cc1ef-507d-4554-b975-ea9ae7c2f82a" containerName="storage-initializer"
Apr 22 15:08:21.323242 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.323170 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="81712e5b-ea67-4bc8-bf5c-1d40a7ea76b8" containerName="kserve-container"
Apr 22 15:08:21.323242 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.323181 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd2cc1ef-507d-4554-b975-ea9ae7c2f82a" containerName="kserve-container"
Apr 22 15:08:21.326212 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.326194 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk"
Apr 22 15:08:21.329621 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.329603 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-zp85x\""
Apr 22 15:08:21.348125 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.348104 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk"]
Apr 22 15:08:21.511481 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.511452 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/460889b2-b7bd-4ece-93d4-ea177341c2a9-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk\" (UID: \"460889b2-b7bd-4ece-93d4-ea177341c2a9\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk"
Apr 22 15:08:21.612115 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.612051 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/460889b2-b7bd-4ece-93d4-ea177341c2a9-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk\" (UID: \"460889b2-b7bd-4ece-93d4-ea177341c2a9\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk"
Apr 22 15:08:21.612374 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.612355 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/460889b2-b7bd-4ece-93d4-ea177341c2a9-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk\" (UID: \"460889b2-b7bd-4ece-93d4-ea177341c2a9\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk"
Apr 22 15:08:21.635365 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.635340 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk"
Apr 22 15:08:21.757559 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:21.756603 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk"]
Apr 22 15:08:21.760169 ip-10-0-142-195 kubenswrapper[2576]: W0422 15:08:21.760143 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod460889b2_b7bd_4ece_93d4_ea177341c2a9.slice/crio-41f4c820cec3aa224dadd25f70aead0dae229ed50351ed6b33f5fcdc0260596d WatchSource:0}: Error finding container 41f4c820cec3aa224dadd25f70aead0dae229ed50351ed6b33f5fcdc0260596d: Status 404 returned error can't find the container with id 41f4c820cec3aa224dadd25f70aead0dae229ed50351ed6b33f5fcdc0260596d
Apr 22 15:08:22.138188 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:22.138160 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk" event={"ID":"460889b2-b7bd-4ece-93d4-ea177341c2a9","Type":"ContainerStarted","Data":"306ef0dab72f72e654efe3768952c91c3c4b7b31ef4bf5cfbb80719a41aa91bf"}
Apr 22 15:08:22.138188 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:22.138193 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk" event={"ID":"460889b2-b7bd-4ece-93d4-ea177341c2a9","Type":"ContainerStarted","Data":"41f4c820cec3aa224dadd25f70aead0dae229ed50351ed6b33f5fcdc0260596d"}
Apr 22 15:08:26.149232 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:26.149198 2576 generic.go:358] "Generic (PLEG): container finished" podID="460889b2-b7bd-4ece-93d4-ea177341c2a9" containerID="306ef0dab72f72e654efe3768952c91c3c4b7b31ef4bf5cfbb80719a41aa91bf" exitCode=0
Apr 22 15:08:26.149609 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:26.149271 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk" event={"ID":"460889b2-b7bd-4ece-93d4-ea177341c2a9","Type":"ContainerDied","Data":"306ef0dab72f72e654efe3768952c91c3c4b7b31ef4bf5cfbb80719a41aa91bf"}
Apr 22 15:08:27.153328 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:27.153297 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk" event={"ID":"460889b2-b7bd-4ece-93d4-ea177341c2a9","Type":"ContainerStarted","Data":"374fe1d9bda0cb2cd13fb9d8a52c37935dec87fef8d8dd61f1fd8fc05b730fe1"}
Apr 22 15:08:27.153721 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:27.153508 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk"
Apr 22 15:08:27.170841 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:27.170772 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk" podStartSLOduration=6.170761129 podStartE2EDuration="6.170761129s" podCreationTimestamp="2026-04-22 15:08:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:08:27.169920105 +0000 UTC m=+3177.495654521" watchObservedRunningTime="2026-04-22 15:08:27.170761129 +0000 UTC m=+3177.496495543"
Apr 22 15:08:58.233717 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:08:58.233667 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk" podUID="460889b2-b7bd-4ece-93d4-ea177341c2a9" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 22 15:09:08.159207 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:08.159179 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk"
Apr 22 15:09:11.402846 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:11.402794 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk"]
Apr 22 15:09:11.403261 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:11.403057 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk" podUID="460889b2-b7bd-4ece-93d4-ea177341c2a9" containerName="kserve-container" containerID="cri-o://374fe1d9bda0cb2cd13fb9d8a52c37935dec87fef8d8dd61f1fd8fc05b730fe1" gracePeriod=30
Apr 22 15:09:18.157259 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:18.157218 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk" podUID="460889b2-b7bd-4ece-93d4-ea177341c2a9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.52:8080/v2/models/isvc-xgboost-v2-runtime/ready\": dial tcp 10.134.0.52:8080: connect: connection refused"
Apr 22 15:09:18.937724 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:18.937703 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk"
Apr 22 15:09:19.019989 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:19.019900 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/460889b2-b7bd-4ece-93d4-ea177341c2a9-kserve-provision-location\") pod \"460889b2-b7bd-4ece-93d4-ea177341c2a9\" (UID: \"460889b2-b7bd-4ece-93d4-ea177341c2a9\") "
Apr 22 15:09:19.020220 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:19.020195 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/460889b2-b7bd-4ece-93d4-ea177341c2a9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "460889b2-b7bd-4ece-93d4-ea177341c2a9" (UID: "460889b2-b7bd-4ece-93d4-ea177341c2a9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 15:09:19.120685 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:19.120651 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/460889b2-b7bd-4ece-93d4-ea177341c2a9-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\""
Apr 22 15:09:19.296941 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:19.296848 2576 generic.go:358] "Generic (PLEG): container finished" podID="460889b2-b7bd-4ece-93d4-ea177341c2a9" containerID="374fe1d9bda0cb2cd13fb9d8a52c37935dec87fef8d8dd61f1fd8fc05b730fe1" exitCode=0
Apr 22 15:09:19.296941 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:19.296900 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk" event={"ID":"460889b2-b7bd-4ece-93d4-ea177341c2a9","Type":"ContainerDied","Data":"374fe1d9bda0cb2cd13fb9d8a52c37935dec87fef8d8dd61f1fd8fc05b730fe1"}
Apr 22 15:09:19.296941 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:19.296922 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk" event={"ID":"460889b2-b7bd-4ece-93d4-ea177341c2a9","Type":"ContainerDied","Data":"41f4c820cec3aa224dadd25f70aead0dae229ed50351ed6b33f5fcdc0260596d"}
Apr 22 15:09:19.296941 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:19.296927 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk"
Apr 22 15:09:19.296941 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:19.296941 2576 scope.go:117] "RemoveContainer" containerID="374fe1d9bda0cb2cd13fb9d8a52c37935dec87fef8d8dd61f1fd8fc05b730fe1"
Apr 22 15:09:19.304323 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:19.304230 2576 scope.go:117] "RemoveContainer" containerID="306ef0dab72f72e654efe3768952c91c3c4b7b31ef4bf5cfbb80719a41aa91bf"
Apr 22 15:09:19.311013 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:19.310995 2576 scope.go:117] "RemoveContainer" containerID="374fe1d9bda0cb2cd13fb9d8a52c37935dec87fef8d8dd61f1fd8fc05b730fe1"
Apr 22 15:09:19.311239 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:09:19.311220 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"374fe1d9bda0cb2cd13fb9d8a52c37935dec87fef8d8dd61f1fd8fc05b730fe1\": container with ID starting with 374fe1d9bda0cb2cd13fb9d8a52c37935dec87fef8d8dd61f1fd8fc05b730fe1 not found: ID does not exist" containerID="374fe1d9bda0cb2cd13fb9d8a52c37935dec87fef8d8dd61f1fd8fc05b730fe1"
Apr 22 15:09:19.311298 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:19.311247 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"374fe1d9bda0cb2cd13fb9d8a52c37935dec87fef8d8dd61f1fd8fc05b730fe1"} err="failed to get container status \"374fe1d9bda0cb2cd13fb9d8a52c37935dec87fef8d8dd61f1fd8fc05b730fe1\": rpc error: code = NotFound desc = could not find container \"374fe1d9bda0cb2cd13fb9d8a52c37935dec87fef8d8dd61f1fd8fc05b730fe1\": container with ID starting with 374fe1d9bda0cb2cd13fb9d8a52c37935dec87fef8d8dd61f1fd8fc05b730fe1 not found: ID does not exist"
Apr 22 15:09:19.311298 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:19.311265 2576 scope.go:117] "RemoveContainer" containerID="306ef0dab72f72e654efe3768952c91c3c4b7b31ef4bf5cfbb80719a41aa91bf"
Apr 22 15:09:19.311499 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:09:19.311482 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"306ef0dab72f72e654efe3768952c91c3c4b7b31ef4bf5cfbb80719a41aa91bf\": container with ID starting with 306ef0dab72f72e654efe3768952c91c3c4b7b31ef4bf5cfbb80719a41aa91bf not found: ID does not exist" containerID="306ef0dab72f72e654efe3768952c91c3c4b7b31ef4bf5cfbb80719a41aa91bf"
Apr 22 15:09:19.311537 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:19.311505 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"306ef0dab72f72e654efe3768952c91c3c4b7b31ef4bf5cfbb80719a41aa91bf"} err="failed to get container status \"306ef0dab72f72e654efe3768952c91c3c4b7b31ef4bf5cfbb80719a41aa91bf\": rpc error: code = NotFound desc = could not find container \"306ef0dab72f72e654efe3768952c91c3c4b7b31ef4bf5cfbb80719a41aa91bf\": container with ID starting with 306ef0dab72f72e654efe3768952c91c3c4b7b31ef4bf5cfbb80719a41aa91bf not found: ID does not exist"
Apr 22 15:09:19.321915 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:19.321891 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk"]
Apr 22 15:09:19.327605 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:19.327583 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-qfhjk"]
Apr 22 15:09:20.292387 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:09:20.292354 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="460889b2-b7bd-4ece-93d4-ea177341c2a9" path="/var/lib/kubelet/pods/460889b2-b7bd-4ece-93d4-ea177341c2a9/volumes"
Apr 22 15:10:21.638950 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:21.638915 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl"]
Apr 22 15:10:21.639368 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:21.639204 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="460889b2-b7bd-4ece-93d4-ea177341c2a9" containerName="kserve-container"
Apr 22 15:10:21.639368 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:21.639218 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="460889b2-b7bd-4ece-93d4-ea177341c2a9" containerName="kserve-container"
Apr 22 15:10:21.639368 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:21.639235 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="460889b2-b7bd-4ece-93d4-ea177341c2a9" containerName="storage-initializer"
Apr 22 15:10:21.639368 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:21.639241 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="460889b2-b7bd-4ece-93d4-ea177341c2a9" containerName="storage-initializer"
Apr 22 15:10:21.639368 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:21.639290 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="460889b2-b7bd-4ece-93d4-ea177341c2a9" containerName="kserve-container"
Apr 22 15:10:21.642231 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:21.642213 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl"
Apr 22 15:10:21.644945 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:21.644923 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\""
Apr 22 15:10:21.645085 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:21.644993 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-zp85x\""
Apr 22 15:10:21.649659 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:21.649635 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl"]
Apr 22 15:10:21.694953 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:21.694921 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-988f7fbdb-55sgl\" (UID: \"e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl"
Apr 22 15:10:21.795995 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:21.795955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-988f7fbdb-55sgl\" (UID: \"e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl"
Apr 22 15:10:21.796358 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:21.796338 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-988f7fbdb-55sgl\" (UID: \"e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl"
Apr 22 15:10:21.953829 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:21.953707 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl"
Apr 22 15:10:22.065257 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:22.065235 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl"]
Apr 22 15:10:22.067116 ip-10-0-142-195 kubenswrapper[2576]: W0422 15:10:22.067092 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2a1d7ef_7de9_4fba_ac24_de0edbbb38d3.slice/crio-4e6ce721515987246b26fd4b783b03d8499cef0dcdc1ce3d18c8b40425ca8d79 WatchSource:0}: Error finding container 4e6ce721515987246b26fd4b783b03d8499cef0dcdc1ce3d18c8b40425ca8d79: Status 404 returned error can't find the container with id 4e6ce721515987246b26fd4b783b03d8499cef0dcdc1ce3d18c8b40425ca8d79
Apr 22 15:10:22.460149 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:22.460117 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" event={"ID":"e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3","Type":"ContainerStarted","Data":"3639f8e35534d172d2443a138c41ec88929971bc628d1c0d9ab7414c5fb90038"}
Apr 22 15:10:22.460149 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:22.460154 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" event={"ID":"e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3","Type":"ContainerStarted","Data":"4e6ce721515987246b26fd4b783b03d8499cef0dcdc1ce3d18c8b40425ca8d79"}
Apr 22 15:10:23.464365 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:23.464276 2576 generic.go:358] "Generic (PLEG): container finished" podID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" containerID="3639f8e35534d172d2443a138c41ec88929971bc628d1c0d9ab7414c5fb90038" exitCode=0
Apr 22 15:10:23.464365 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:23.464342 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" event={"ID":"e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3","Type":"ContainerDied","Data":"3639f8e35534d172d2443a138c41ec88929971bc628d1c0d9ab7414c5fb90038"}
Apr 22 15:10:24.469024 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:24.468989 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" event={"ID":"e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3","Type":"ContainerStarted","Data":"31f0a0800ac387969f8afdc9c59f63cf1c4a6e6d9b7f75fe3aedf08b17b5692a"}
Apr 22 15:10:24.469430 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:24.469233 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl"
Apr 22 15:10:24.470511 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:24.470488 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" podUID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 22 15:10:24.488300 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:24.488223 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" podStartSLOduration=3.488210479 podStartE2EDuration="3.488210479s" podCreationTimestamp="2026-04-22 15:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:10:24.486923791 +0000 UTC m=+3294.812658205" watchObservedRunningTime="2026-04-22 15:10:24.488210479 +0000 UTC m=+3294.813944893"
Apr 22 15:10:25.472395 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:25.472353 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" podUID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 22 15:10:31.101140 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:31.101115 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log"
Apr 22 15:10:31.107642 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:31.107624 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log"
Apr 22 15:10:35.472803 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:35.472764 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" podUID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 22 15:10:45.473376 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:45.473296 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" podUID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 22 15:10:55.472826 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:10:55.472774 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" podUID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 22 15:11:05.473367 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:05.473331 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" podUID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 22 15:11:15.473083 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:15.473044 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" podUID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 22 15:11:25.472434 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:25.472397 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" podUID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 22 15:11:35.473428 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:35.473399 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl"
Apr 22 15:11:41.766848 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:41.766786 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl"]
Apr 22 15:11:41.767337 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:41.767078 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" podUID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" containerName="kserve-container"
containerID="cri-o://31f0a0800ac387969f8afdc9c59f63cf1c4a6e6d9b7f75fe3aedf08b17b5692a" gracePeriod=30 Apr 22 15:11:41.888575 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:41.888541 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n"] Apr 22 15:11:41.891776 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:41.891759 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" Apr 22 15:11:41.894474 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:41.894452 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 22 15:11:41.899994 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:41.899971 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n"] Apr 22 15:11:42.018349 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:42.018268 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/244c69ff-29dd-4743-8c6a-dd03534a1c15-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n\" (UID: \"244c69ff-29dd-4743-8c6a-dd03534a1c15\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" Apr 22 15:11:42.018349 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:42.018310 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/244c69ff-29dd-4743-8c6a-dd03534a1c15-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n\" (UID: \"244c69ff-29dd-4743-8c6a-dd03534a1c15\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" Apr 22 15:11:42.118886 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:42.118831 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/244c69ff-29dd-4743-8c6a-dd03534a1c15-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n\" (UID: \"244c69ff-29dd-4743-8c6a-dd03534a1c15\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" Apr 22 15:11:42.118886 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:42.118897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/244c69ff-29dd-4743-8c6a-dd03534a1c15-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n\" (UID: \"244c69ff-29dd-4743-8c6a-dd03534a1c15\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" Apr 22 15:11:42.119255 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:42.119238 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/244c69ff-29dd-4743-8c6a-dd03534a1c15-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n\" (UID: \"244c69ff-29dd-4743-8c6a-dd03534a1c15\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" Apr 22 15:11:42.119417 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:42.119397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/244c69ff-29dd-4743-8c6a-dd03534a1c15-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n\" (UID: \"244c69ff-29dd-4743-8c6a-dd03534a1c15\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" Apr 22 15:11:42.202790 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:42.202760 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" Apr 22 15:11:42.320133 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:42.320110 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n"] Apr 22 15:11:42.322471 ip-10-0-142-195 kubenswrapper[2576]: W0422 15:11:42.322443 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod244c69ff_29dd_4743_8c6a_dd03534a1c15.slice/crio-bcda681faf80b3aa757aaac71425f5491064922c1aacc17c5d52b4e1180a5fd6 WatchSource:0}: Error finding container bcda681faf80b3aa757aaac71425f5491064922c1aacc17c5d52b4e1180a5fd6: Status 404 returned error can't find the container with id bcda681faf80b3aa757aaac71425f5491064922c1aacc17c5d52b4e1180a5fd6 Apr 22 15:11:42.324240 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:42.324223 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:11:42.675298 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:42.675203 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" event={"ID":"244c69ff-29dd-4743-8c6a-dd03534a1c15","Type":"ContainerStarted","Data":"ce6ba844be29fc9a97fde2e9977265e22b8cdc0bb8075bf744086e6d982e3e19"} Apr 22 15:11:42.675298 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:42.675240 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" 
event={"ID":"244c69ff-29dd-4743-8c6a-dd03534a1c15","Type":"ContainerStarted","Data":"bcda681faf80b3aa757aaac71425f5491064922c1aacc17c5d52b4e1180a5fd6"} Apr 22 15:11:43.678904 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:43.678795 2576 generic.go:358] "Generic (PLEG): container finished" podID="244c69ff-29dd-4743-8c6a-dd03534a1c15" containerID="ce6ba844be29fc9a97fde2e9977265e22b8cdc0bb8075bf744086e6d982e3e19" exitCode=0 Apr 22 15:11:43.678904 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:43.678875 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" event={"ID":"244c69ff-29dd-4743-8c6a-dd03534a1c15","Type":"ContainerDied","Data":"ce6ba844be29fc9a97fde2e9977265e22b8cdc0bb8075bf744086e6d982e3e19"} Apr 22 15:11:44.682588 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:44.682555 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" event={"ID":"244c69ff-29dd-4743-8c6a-dd03534a1c15","Type":"ContainerStarted","Data":"caa834cee3ff8e7d93aad3f49a341e36d5bb78dcddd5f713d0a90c66cfa7dc43"} Apr 22 15:11:44.683037 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:44.682781 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" Apr 22 15:11:44.684043 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:44.684017 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" podUID="244c69ff-29dd-4743-8c6a-dd03534a1c15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 22 15:11:44.700354 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:44.700313 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" podStartSLOduration=3.700299783 podStartE2EDuration="3.700299783s" podCreationTimestamp="2026-04-22 15:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:11:44.699754539 +0000 UTC m=+3375.025488977" watchObservedRunningTime="2026-04-22 15:11:44.700299783 +0000 UTC m=+3375.026034197" Apr 22 15:11:45.472960 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:45.472926 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" podUID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 22 15:11:45.685511 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:45.685468 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" podUID="244c69ff-29dd-4743-8c6a-dd03534a1c15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 22 15:11:46.101525 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:46.101501 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" Apr 22 15:11:46.151825 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:46.151777 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3-kserve-provision-location\") pod \"e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3\" (UID: \"e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3\") " Apr 22 15:11:46.152127 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:46.152106 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" (UID: "e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:11:46.253168 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:46.253139 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 15:11:46.693920 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:46.693822 2576 generic.go:358] "Generic (PLEG): container finished" podID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" containerID="31f0a0800ac387969f8afdc9c59f63cf1c4a6e6d9b7f75fe3aedf08b17b5692a" exitCode=0 Apr 22 15:11:46.693920 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:46.693883 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" event={"ID":"e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3","Type":"ContainerDied","Data":"31f0a0800ac387969f8afdc9c59f63cf1c4a6e6d9b7f75fe3aedf08b17b5692a"} Apr 22 15:11:46.693920 ip-10-0-142-195 
kubenswrapper[2576]: I0422 15:11:46.693915 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" event={"ID":"e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3","Type":"ContainerDied","Data":"4e6ce721515987246b26fd4b783b03d8499cef0dcdc1ce3d18c8b40425ca8d79"} Apr 22 15:11:46.694408 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:46.693929 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl" Apr 22 15:11:46.694408 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:46.693934 2576 scope.go:117] "RemoveContainer" containerID="31f0a0800ac387969f8afdc9c59f63cf1c4a6e6d9b7f75fe3aedf08b17b5692a" Apr 22 15:11:46.701441 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:46.701423 2576 scope.go:117] "RemoveContainer" containerID="3639f8e35534d172d2443a138c41ec88929971bc628d1c0d9ab7414c5fb90038" Apr 22 15:11:46.707843 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:46.707783 2576 scope.go:117] "RemoveContainer" containerID="31f0a0800ac387969f8afdc9c59f63cf1c4a6e6d9b7f75fe3aedf08b17b5692a" Apr 22 15:11:46.708154 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:11:46.708121 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31f0a0800ac387969f8afdc9c59f63cf1c4a6e6d9b7f75fe3aedf08b17b5692a\": container with ID starting with 31f0a0800ac387969f8afdc9c59f63cf1c4a6e6d9b7f75fe3aedf08b17b5692a not found: ID does not exist" containerID="31f0a0800ac387969f8afdc9c59f63cf1c4a6e6d9b7f75fe3aedf08b17b5692a" Apr 22 15:11:46.708263 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:46.708168 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31f0a0800ac387969f8afdc9c59f63cf1c4a6e6d9b7f75fe3aedf08b17b5692a"} err="failed to get container status \"31f0a0800ac387969f8afdc9c59f63cf1c4a6e6d9b7f75fe3aedf08b17b5692a\": rpc error: code = 
NotFound desc = could not find container \"31f0a0800ac387969f8afdc9c59f63cf1c4a6e6d9b7f75fe3aedf08b17b5692a\": container with ID starting with 31f0a0800ac387969f8afdc9c59f63cf1c4a6e6d9b7f75fe3aedf08b17b5692a not found: ID does not exist" Apr 22 15:11:46.708263 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:46.708206 2576 scope.go:117] "RemoveContainer" containerID="3639f8e35534d172d2443a138c41ec88929971bc628d1c0d9ab7414c5fb90038" Apr 22 15:11:46.708497 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:11:46.708476 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3639f8e35534d172d2443a138c41ec88929971bc628d1c0d9ab7414c5fb90038\": container with ID starting with 3639f8e35534d172d2443a138c41ec88929971bc628d1c0d9ab7414c5fb90038 not found: ID does not exist" containerID="3639f8e35534d172d2443a138c41ec88929971bc628d1c0d9ab7414c5fb90038" Apr 22 15:11:46.708567 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:46.708502 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3639f8e35534d172d2443a138c41ec88929971bc628d1c0d9ab7414c5fb90038"} err="failed to get container status \"3639f8e35534d172d2443a138c41ec88929971bc628d1c0d9ab7414c5fb90038\": rpc error: code = NotFound desc = could not find container \"3639f8e35534d172d2443a138c41ec88929971bc628d1c0d9ab7414c5fb90038\": container with ID starting with 3639f8e35534d172d2443a138c41ec88929971bc628d1c0d9ab7414c5fb90038 not found: ID does not exist" Apr 22 15:11:46.710306 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:46.710285 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl"] Apr 22 15:11:46.716596 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:46.716574 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-55sgl"] Apr 22 15:11:48.292131 ip-10-0-142-195 
kubenswrapper[2576]: I0422 15:11:48.292098 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" path="/var/lib/kubelet/pods/e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3/volumes" Apr 22 15:11:55.685867 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:11:55.685795 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" podUID="244c69ff-29dd-4743-8c6a-dd03534a1c15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 22 15:12:05.686079 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:12:05.686032 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" podUID="244c69ff-29dd-4743-8c6a-dd03534a1c15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 22 15:12:15.685583 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:12:15.685484 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" podUID="244c69ff-29dd-4743-8c6a-dd03534a1c15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 22 15:12:25.686466 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:12:25.686420 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" podUID="244c69ff-29dd-4743-8c6a-dd03534a1c15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 22 15:12:35.686394 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:12:35.686348 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" podUID="244c69ff-29dd-4743-8c6a-dd03534a1c15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 22 15:12:45.686497 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:12:45.686455 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" podUID="244c69ff-29dd-4743-8c6a-dd03534a1c15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 22 15:12:55.686848 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:12:55.686798 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" Apr 22 15:13:02.072840 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:02.072792 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n"] Apr 22 15:13:02.073253 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:02.073074 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" podUID="244c69ff-29dd-4743-8c6a-dd03534a1c15" containerName="kserve-container" containerID="cri-o://caa834cee3ff8e7d93aad3f49a341e36d5bb78dcddd5f713d0a90c66cfa7dc43" gracePeriod=30 Apr 22 15:13:03.036023 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:03.035984 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22"] Apr 22 15:13:03.036459 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:03.036443 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" containerName="kserve-container" Apr 22 
15:13:03.036502 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:03.036463 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" containerName="kserve-container" Apr 22 15:13:03.036502 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:03.036475 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" containerName="storage-initializer" Apr 22 15:13:03.036502 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:03.036486 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" containerName="storage-initializer" Apr 22 15:13:03.036597 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:03.036556 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e2a1d7ef-7de9-4fba-ac24-de0edbbb38d3" containerName="kserve-container" Apr 22 15:13:03.039541 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:03.039526 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22" Apr 22 15:13:03.047986 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:03.047955 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22"] Apr 22 15:13:03.120280 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:03.120246 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/870090f4-d483-4c06-a043-6730d9c05d48-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22\" (UID: \"870090f4-d483-4c06-a043-6730d9c05d48\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22" Apr 22 15:13:03.221407 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:03.221355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/870090f4-d483-4c06-a043-6730d9c05d48-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22\" (UID: \"870090f4-d483-4c06-a043-6730d9c05d48\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22" Apr 22 15:13:03.221740 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:03.221718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/870090f4-d483-4c06-a043-6730d9c05d48-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22\" (UID: \"870090f4-d483-4c06-a043-6730d9c05d48\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22" Apr 22 15:13:03.350266 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:03.350167 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22" Apr 22 15:13:03.466630 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:03.466605 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22"] Apr 22 15:13:03.468654 ip-10-0-142-195 kubenswrapper[2576]: W0422 15:13:03.468626 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod870090f4_d483_4c06_a043_6730d9c05d48.slice/crio-8ca8c29f4e084531fa48e53fbcdc1f30129f481ac77ba9c46a7752b9025b4c53 WatchSource:0}: Error finding container 8ca8c29f4e084531fa48e53fbcdc1f30129f481ac77ba9c46a7752b9025b4c53: Status 404 returned error can't find the container with id 8ca8c29f4e084531fa48e53fbcdc1f30129f481ac77ba9c46a7752b9025b4c53 Apr 22 15:13:03.897106 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:03.897068 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22" event={"ID":"870090f4-d483-4c06-a043-6730d9c05d48","Type":"ContainerStarted","Data":"affa3bc890da618ca87aa57455764870162c8fe5cc935c253944bece79223f4c"} Apr 22 15:13:03.897106 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:03.897109 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22" event={"ID":"870090f4-d483-4c06-a043-6730d9c05d48","Type":"ContainerStarted","Data":"8ca8c29f4e084531fa48e53fbcdc1f30129f481ac77ba9c46a7752b9025b4c53"} Apr 22 15:13:05.686340 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:05.686298 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" podUID="244c69ff-29dd-4743-8c6a-dd03534a1c15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: 
connection refused" Apr 22 15:13:06.410901 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.410872 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" Apr 22 15:13:06.547407 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.547378 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/244c69ff-29dd-4743-8c6a-dd03534a1c15-cabundle-cert\") pod \"244c69ff-29dd-4743-8c6a-dd03534a1c15\" (UID: \"244c69ff-29dd-4743-8c6a-dd03534a1c15\") " Apr 22 15:13:06.547573 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.547422 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/244c69ff-29dd-4743-8c6a-dd03534a1c15-kserve-provision-location\") pod \"244c69ff-29dd-4743-8c6a-dd03534a1c15\" (UID: \"244c69ff-29dd-4743-8c6a-dd03534a1c15\") " Apr 22 15:13:06.547725 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.547699 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/244c69ff-29dd-4743-8c6a-dd03534a1c15-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "244c69ff-29dd-4743-8c6a-dd03534a1c15" (UID: "244c69ff-29dd-4743-8c6a-dd03534a1c15"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:13:06.547777 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.547762 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244c69ff-29dd-4743-8c6a-dd03534a1c15-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "244c69ff-29dd-4743-8c6a-dd03534a1c15" (UID: "244c69ff-29dd-4743-8c6a-dd03534a1c15"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 15:13:06.648442 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.648352 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/244c69ff-29dd-4743-8c6a-dd03534a1c15-cabundle-cert\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\""
Apr 22 15:13:06.648442 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.648382 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/244c69ff-29dd-4743-8c6a-dd03534a1c15-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\""
Apr 22 15:13:06.905755 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.905665 2576 generic.go:358] "Generic (PLEG): container finished" podID="244c69ff-29dd-4743-8c6a-dd03534a1c15" containerID="caa834cee3ff8e7d93aad3f49a341e36d5bb78dcddd5f713d0a90c66cfa7dc43" exitCode=0
Apr 22 15:13:06.905755 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.905740 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n"
Apr 22 15:13:06.906305 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.905752 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" event={"ID":"244c69ff-29dd-4743-8c6a-dd03534a1c15","Type":"ContainerDied","Data":"caa834cee3ff8e7d93aad3f49a341e36d5bb78dcddd5f713d0a90c66cfa7dc43"}
Apr 22 15:13:06.906305 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.905797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n" event={"ID":"244c69ff-29dd-4743-8c6a-dd03534a1c15","Type":"ContainerDied","Data":"bcda681faf80b3aa757aaac71425f5491064922c1aacc17c5d52b4e1180a5fd6"}
Apr 22 15:13:06.906305 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.905834 2576 scope.go:117] "RemoveContainer" containerID="caa834cee3ff8e7d93aad3f49a341e36d5bb78dcddd5f713d0a90c66cfa7dc43"
Apr 22 15:13:06.907207 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.907190 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22_870090f4-d483-4c06-a043-6730d9c05d48/storage-initializer/0.log"
Apr 22 15:13:06.907297 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.907224 2576 generic.go:358] "Generic (PLEG): container finished" podID="870090f4-d483-4c06-a043-6730d9c05d48" containerID="affa3bc890da618ca87aa57455764870162c8fe5cc935c253944bece79223f4c" exitCode=1
Apr 22 15:13:06.907297 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.907267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22" event={"ID":"870090f4-d483-4c06-a043-6730d9c05d48","Type":"ContainerDied","Data":"affa3bc890da618ca87aa57455764870162c8fe5cc935c253944bece79223f4c"}
Apr 22 15:13:06.913739 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.913631 2576 scope.go:117] "RemoveContainer" containerID="ce6ba844be29fc9a97fde2e9977265e22b8cdc0bb8075bf744086e6d982e3e19"
Apr 22 15:13:06.920641 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.920619 2576 scope.go:117] "RemoveContainer" containerID="caa834cee3ff8e7d93aad3f49a341e36d5bb78dcddd5f713d0a90c66cfa7dc43"
Apr 22 15:13:06.920905 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:13:06.920886 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caa834cee3ff8e7d93aad3f49a341e36d5bb78dcddd5f713d0a90c66cfa7dc43\": container with ID starting with caa834cee3ff8e7d93aad3f49a341e36d5bb78dcddd5f713d0a90c66cfa7dc43 not found: ID does not exist" containerID="caa834cee3ff8e7d93aad3f49a341e36d5bb78dcddd5f713d0a90c66cfa7dc43"
Apr 22 15:13:06.920975 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.920912 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caa834cee3ff8e7d93aad3f49a341e36d5bb78dcddd5f713d0a90c66cfa7dc43"} err="failed to get container status \"caa834cee3ff8e7d93aad3f49a341e36d5bb78dcddd5f713d0a90c66cfa7dc43\": rpc error: code = NotFound desc = could not find container \"caa834cee3ff8e7d93aad3f49a341e36d5bb78dcddd5f713d0a90c66cfa7dc43\": container with ID starting with caa834cee3ff8e7d93aad3f49a341e36d5bb78dcddd5f713d0a90c66cfa7dc43 not found: ID does not exist"
Apr 22 15:13:06.920975 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.920927 2576 scope.go:117] "RemoveContainer" containerID="ce6ba844be29fc9a97fde2e9977265e22b8cdc0bb8075bf744086e6d982e3e19"
Apr 22 15:13:06.921130 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:13:06.921116 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce6ba844be29fc9a97fde2e9977265e22b8cdc0bb8075bf744086e6d982e3e19\": container with ID starting with ce6ba844be29fc9a97fde2e9977265e22b8cdc0bb8075bf744086e6d982e3e19 not found: ID does not exist" containerID="ce6ba844be29fc9a97fde2e9977265e22b8cdc0bb8075bf744086e6d982e3e19"
Apr 22 15:13:06.921169 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.921133 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce6ba844be29fc9a97fde2e9977265e22b8cdc0bb8075bf744086e6d982e3e19"} err="failed to get container status \"ce6ba844be29fc9a97fde2e9977265e22b8cdc0bb8075bf744086e6d982e3e19\": rpc error: code = NotFound desc = could not find container \"ce6ba844be29fc9a97fde2e9977265e22b8cdc0bb8075bf744086e6d982e3e19\": container with ID starting with ce6ba844be29fc9a97fde2e9977265e22b8cdc0bb8075bf744086e6d982e3e19 not found: ID does not exist"
Apr 22 15:13:06.940141 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.940120 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n"]
Apr 22 15:13:06.944162 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:06.944142 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-v726n"]
Apr 22 15:13:07.911408 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:07.911383 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22_870090f4-d483-4c06-a043-6730d9c05d48/storage-initializer/0.log"
Apr 22 15:13:07.911790 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:07.911445 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22" event={"ID":"870090f4-d483-4c06-a043-6730d9c05d48","Type":"ContainerStarted","Data":"f92dcd4fd98ef094b753f51d10423407c6f4b902024dd97a8c9a7b1d1d976609"}
Apr 22 15:13:08.292079 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:08.292047 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="244c69ff-29dd-4743-8c6a-dd03534a1c15" path="/var/lib/kubelet/pods/244c69ff-29dd-4743-8c6a-dd03534a1c15/volumes"
Apr 22 15:13:12.925618 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:12.925583 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22_870090f4-d483-4c06-a043-6730d9c05d48/storage-initializer/1.log"
Apr 22 15:13:12.926049 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:12.925929 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22_870090f4-d483-4c06-a043-6730d9c05d48/storage-initializer/0.log"
Apr 22 15:13:12.926049 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:12.925963 2576 generic.go:358] "Generic (PLEG): container finished" podID="870090f4-d483-4c06-a043-6730d9c05d48" containerID="f92dcd4fd98ef094b753f51d10423407c6f4b902024dd97a8c9a7b1d1d976609" exitCode=1
Apr 22 15:13:12.926049 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:12.926013 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22" event={"ID":"870090f4-d483-4c06-a043-6730d9c05d48","Type":"ContainerDied","Data":"f92dcd4fd98ef094b753f51d10423407c6f4b902024dd97a8c9a7b1d1d976609"}
Apr 22 15:13:12.926162 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:12.926054 2576 scope.go:117] "RemoveContainer" containerID="affa3bc890da618ca87aa57455764870162c8fe5cc935c253944bece79223f4c"
Apr 22 15:13:12.926379 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:12.926362 2576 scope.go:117] "RemoveContainer" containerID="affa3bc890da618ca87aa57455764870162c8fe5cc935c253944bece79223f4c"
Apr 22 15:13:12.935918 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:13:12.935886 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22_kserve-ci-e2e-test_870090f4-d483-4c06-a043-6730d9c05d48_0 in pod sandbox 8ca8c29f4e084531fa48e53fbcdc1f30129f481ac77ba9c46a7752b9025b4c53 from index: no such id: 'affa3bc890da618ca87aa57455764870162c8fe5cc935c253944bece79223f4c'" containerID="affa3bc890da618ca87aa57455764870162c8fe5cc935c253944bece79223f4c"
Apr 22 15:13:12.936004 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:12.935928 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"affa3bc890da618ca87aa57455764870162c8fe5cc935c253944bece79223f4c"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22_kserve-ci-e2e-test_870090f4-d483-4c06-a043-6730d9c05d48_0 in pod sandbox 8ca8c29f4e084531fa48e53fbcdc1f30129f481ac77ba9c46a7752b9025b4c53 from index: no such id: 'affa3bc890da618ca87aa57455764870162c8fe5cc935c253944bece79223f4c'"
Apr 22 15:13:12.936102 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:13:12.936081 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22_kserve-ci-e2e-test(870090f4-d483-4c06-a043-6730d9c05d48)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22" podUID="870090f4-d483-4c06-a043-6730d9c05d48"
Apr 22 15:13:13.060880 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:13.060849 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22"]
Apr 22 15:13:13.929954 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:13.929927 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22_870090f4-d483-4c06-a043-6730d9c05d48/storage-initializer/1.log"
Apr 22 15:13:14.054343 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.054325 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22_870090f4-d483-4c06-a043-6730d9c05d48/storage-initializer/1.log"
Apr 22 15:13:14.054452 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.054380 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22"
Apr 22 15:13:14.105164 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.105137 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"]
Apr 22 15:13:14.105390 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.105379 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="244c69ff-29dd-4743-8c6a-dd03534a1c15" containerName="kserve-container"
Apr 22 15:13:14.105442 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.105391 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="244c69ff-29dd-4743-8c6a-dd03534a1c15" containerName="kserve-container"
Apr 22 15:13:14.105442 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.105405 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="870090f4-d483-4c06-a043-6730d9c05d48" containerName="storage-initializer"
Apr 22 15:13:14.105442 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.105410 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="870090f4-d483-4c06-a043-6730d9c05d48" containerName="storage-initializer"
Apr 22 15:13:14.105442 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.105420 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="244c69ff-29dd-4743-8c6a-dd03534a1c15" containerName="storage-initializer"
Apr 22 15:13:14.105442 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.105427 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="244c69ff-29dd-4743-8c6a-dd03534a1c15" containerName="storage-initializer"
Apr 22 15:13:14.105442 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.105441 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="870090f4-d483-4c06-a043-6730d9c05d48" containerName="storage-initializer"
Apr 22 15:13:14.105615 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.105447 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="870090f4-d483-4c06-a043-6730d9c05d48" containerName="storage-initializer"
Apr 22 15:13:14.105615 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.105483 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="870090f4-d483-4c06-a043-6730d9c05d48" containerName="storage-initializer"
Apr 22 15:13:14.105615 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.105492 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="244c69ff-29dd-4743-8c6a-dd03534a1c15" containerName="kserve-container"
Apr 22 15:13:14.105615 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.105562 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="870090f4-d483-4c06-a043-6730d9c05d48" containerName="storage-initializer"
Apr 22 15:13:14.107050 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.107029 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/870090f4-d483-4c06-a043-6730d9c05d48-kserve-provision-location\") pod \"870090f4-d483-4c06-a043-6730d9c05d48\" (UID: \"870090f4-d483-4c06-a043-6730d9c05d48\") "
Apr 22 15:13:14.107290 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.107266 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/870090f4-d483-4c06-a043-6730d9c05d48-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "870090f4-d483-4c06-a043-6730d9c05d48" (UID: "870090f4-d483-4c06-a043-6730d9c05d48"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 15:13:14.109569 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.109552 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"
Apr 22 15:13:14.112057 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.112037 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 22 15:13:14.119189 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.119167 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"]
Apr 22 15:13:14.207675 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.207595 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc799a82-3f21-4954-8aeb-418dfe2e32d9-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc\" (UID: \"cc799a82-3f21-4954-8aeb-418dfe2e32d9\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"
Apr 22 15:13:14.207675 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.207627 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cc799a82-3f21-4954-8aeb-418dfe2e32d9-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc\" (UID: \"cc799a82-3f21-4954-8aeb-418dfe2e32d9\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"
Apr 22 15:13:14.207887 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.207741 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/870090f4-d483-4c06-a043-6730d9c05d48-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\""
Apr 22 15:13:14.308162 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.308138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc799a82-3f21-4954-8aeb-418dfe2e32d9-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc\" (UID: \"cc799a82-3f21-4954-8aeb-418dfe2e32d9\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"
Apr 22 15:13:14.308277 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.308169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cc799a82-3f21-4954-8aeb-418dfe2e32d9-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc\" (UID: \"cc799a82-3f21-4954-8aeb-418dfe2e32d9\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"
Apr 22 15:13:14.308488 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.308469 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc799a82-3f21-4954-8aeb-418dfe2e32d9-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc\" (UID: \"cc799a82-3f21-4954-8aeb-418dfe2e32d9\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"
Apr 22 15:13:14.308702 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.308687 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cc799a82-3f21-4954-8aeb-418dfe2e32d9-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc\" (UID: \"cc799a82-3f21-4954-8aeb-418dfe2e32d9\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"
Apr 22 15:13:14.419512 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.419477 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"
Apr 22 15:13:14.532035 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.531982 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"]
Apr 22 15:13:14.534441 ip-10-0-142-195 kubenswrapper[2576]: W0422 15:13:14.534415 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc799a82_3f21_4954_8aeb_418dfe2e32d9.slice/crio-f47be2fe222d6af9ce948fa87b8f309dbfe2b36eb911388bf7774d3ee8ce52c5 WatchSource:0}: Error finding container f47be2fe222d6af9ce948fa87b8f309dbfe2b36eb911388bf7774d3ee8ce52c5: Status 404 returned error can't find the container with id f47be2fe222d6af9ce948fa87b8f309dbfe2b36eb911388bf7774d3ee8ce52c5
Apr 22 15:13:14.933895 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.933858 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc" event={"ID":"cc799a82-3f21-4954-8aeb-418dfe2e32d9","Type":"ContainerStarted","Data":"710bfa3e4da98b270162521254c5a7967cbeac8ce388fe0ec281bd65cf08e8df"}
Apr 22 15:13:14.933895 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.933897 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc" event={"ID":"cc799a82-3f21-4954-8aeb-418dfe2e32d9","Type":"ContainerStarted","Data":"f47be2fe222d6af9ce948fa87b8f309dbfe2b36eb911388bf7774d3ee8ce52c5"}
Apr 22 15:13:14.934991 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.934972 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22_870090f4-d483-4c06-a043-6730d9c05d48/storage-initializer/1.log"
Apr 22 15:13:14.935128 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.935019 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22" event={"ID":"870090f4-d483-4c06-a043-6730d9c05d48","Type":"ContainerDied","Data":"8ca8c29f4e084531fa48e53fbcdc1f30129f481ac77ba9c46a7752b9025b4c53"}
Apr 22 15:13:14.935128 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.935046 2576 scope.go:117] "RemoveContainer" containerID="f92dcd4fd98ef094b753f51d10423407c6f4b902024dd97a8c9a7b1d1d976609"
Apr 22 15:13:14.935128 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.935057 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22"
Apr 22 15:13:14.974147 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.974117 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22"]
Apr 22 15:13:14.977040 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:14.977014 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-nnh22"]
Apr 22 15:13:15.941846 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:15.941744 2576 generic.go:358] "Generic (PLEG): container finished" podID="cc799a82-3f21-4954-8aeb-418dfe2e32d9" containerID="710bfa3e4da98b270162521254c5a7967cbeac8ce388fe0ec281bd65cf08e8df" exitCode=0
Apr 22 15:13:15.941846 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:15.941827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc" event={"ID":"cc799a82-3f21-4954-8aeb-418dfe2e32d9","Type":"ContainerDied","Data":"710bfa3e4da98b270162521254c5a7967cbeac8ce388fe0ec281bd65cf08e8df"}
Apr 22 15:13:16.291902 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:16.291864 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="870090f4-d483-4c06-a043-6730d9c05d48" path="/var/lib/kubelet/pods/870090f4-d483-4c06-a043-6730d9c05d48/volumes"
Apr 22 15:13:16.947195 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:16.947158 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc" event={"ID":"cc799a82-3f21-4954-8aeb-418dfe2e32d9","Type":"ContainerStarted","Data":"9b9f6c28fae2fc933e7753375d600631f76ffa276d3aacea7b878d0bde94393e"}
Apr 22 15:13:16.947625 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:16.947354 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"
Apr 22 15:13:16.948565 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:16.948542 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc" podUID="cc799a82-3f21-4954-8aeb-418dfe2e32d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 22 15:13:16.964370 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:16.964321 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc" podStartSLOduration=2.964305488 podStartE2EDuration="2.964305488s" podCreationTimestamp="2026-04-22 15:13:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:13:16.962973464 +0000 UTC m=+3467.288707878" watchObservedRunningTime="2026-04-22 15:13:16.964305488 +0000 UTC m=+3467.290039905"
Apr 22 15:13:17.951087 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:17.951052 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc" podUID="cc799a82-3f21-4954-8aeb-418dfe2e32d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 22 15:13:27.951047 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:27.951004 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc" podUID="cc799a82-3f21-4954-8aeb-418dfe2e32d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 22 15:13:37.951304 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:37.951262 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc" podUID="cc799a82-3f21-4954-8aeb-418dfe2e32d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 22 15:13:47.951568 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:47.951479 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc" podUID="cc799a82-3f21-4954-8aeb-418dfe2e32d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 22 15:13:57.951265 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:13:57.951219 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc" podUID="cc799a82-3f21-4954-8aeb-418dfe2e32d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 22 15:14:07.951470 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:07.951427 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc" podUID="cc799a82-3f21-4954-8aeb-418dfe2e32d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 22 15:14:17.952256 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:17.952224 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"
Apr 22 15:14:24.169793 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:24.169764 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"]
Apr 22 15:14:24.170199 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:24.170012 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc" podUID="cc799a82-3f21-4954-8aeb-418dfe2e32d9" containerName="kserve-container" containerID="cri-o://9b9f6c28fae2fc933e7753375d600631f76ffa276d3aacea7b878d0bde94393e" gracePeriod=30
Apr 22 15:14:25.297754 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:25.297723 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65"]
Apr 22 15:14:25.300988 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:25.300972 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65"
Apr 22 15:14:25.309050 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:25.309028 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65"]
Apr 22 15:14:25.416465 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:25.416437 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c976f43-333f-4cf3-818f-160110bdab7e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65\" (UID: \"8c976f43-333f-4cf3-818f-160110bdab7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65"
Apr 22 15:14:25.517331 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:25.517296 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c976f43-333f-4cf3-818f-160110bdab7e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65\" (UID: \"8c976f43-333f-4cf3-818f-160110bdab7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65"
Apr 22 15:14:25.517633 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:25.517615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c976f43-333f-4cf3-818f-160110bdab7e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65\" (UID: \"8c976f43-333f-4cf3-818f-160110bdab7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65"
Apr 22 15:14:25.610751 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:25.610675 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65"
Apr 22 15:14:25.724219 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:25.724191 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65"]
Apr 22 15:14:25.725725 ip-10-0-142-195 kubenswrapper[2576]: W0422 15:14:25.725690 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c976f43_333f_4cf3_818f_160110bdab7e.slice/crio-e1b2ec95e85b9d1d170a143e850a725ec9d71bcb2dd52cb8b4efe6e90e3bf2c3 WatchSource:0}: Error finding container e1b2ec95e85b9d1d170a143e850a725ec9d71bcb2dd52cb8b4efe6e90e3bf2c3: Status 404 returned error can't find the container with id e1b2ec95e85b9d1d170a143e850a725ec9d71bcb2dd52cb8b4efe6e90e3bf2c3
Apr 22 15:14:26.133780 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:26.133742 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65" event={"ID":"8c976f43-333f-4cf3-818f-160110bdab7e","Type":"ContainerStarted","Data":"6b23d88f113679b0ffe4bbe5a842e86729cbcec3c9982d79b1f655ff12b0e9f0"}
Apr 22 15:14:26.133780 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:26.133781 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65" event={"ID":"8c976f43-333f-4cf3-818f-160110bdab7e","Type":"ContainerStarted","Data":"e1b2ec95e85b9d1d170a143e850a725ec9d71bcb2dd52cb8b4efe6e90e3bf2c3"}
Apr 22 15:14:28.005330 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.005310 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"
Apr 22 15:14:28.035804 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.035779 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc799a82-3f21-4954-8aeb-418dfe2e32d9-kserve-provision-location\") pod \"cc799a82-3f21-4954-8aeb-418dfe2e32d9\" (UID: \"cc799a82-3f21-4954-8aeb-418dfe2e32d9\") "
Apr 22 15:14:28.035953 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.035866 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cc799a82-3f21-4954-8aeb-418dfe2e32d9-cabundle-cert\") pod \"cc799a82-3f21-4954-8aeb-418dfe2e32d9\" (UID: \"cc799a82-3f21-4954-8aeb-418dfe2e32d9\") "
Apr 22 15:14:28.036126 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.036062 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc799a82-3f21-4954-8aeb-418dfe2e32d9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cc799a82-3f21-4954-8aeb-418dfe2e32d9" (UID: "cc799a82-3f21-4954-8aeb-418dfe2e32d9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 15:14:28.036219 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.036198 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc799a82-3f21-4954-8aeb-418dfe2e32d9-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "cc799a82-3f21-4954-8aeb-418dfe2e32d9" (UID: "cc799a82-3f21-4954-8aeb-418dfe2e32d9"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 15:14:28.137022 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.136963 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/cc799a82-3f21-4954-8aeb-418dfe2e32d9-cabundle-cert\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\""
Apr 22 15:14:28.137022 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.136988 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc799a82-3f21-4954-8aeb-418dfe2e32d9-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\""
Apr 22 15:14:28.140167 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.140143 2576 generic.go:358] "Generic (PLEG): container finished" podID="cc799a82-3f21-4954-8aeb-418dfe2e32d9" containerID="9b9f6c28fae2fc933e7753375d600631f76ffa276d3aacea7b878d0bde94393e" exitCode=0
Apr 22 15:14:28.140265 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.140184 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc" event={"ID":"cc799a82-3f21-4954-8aeb-418dfe2e32d9","Type":"ContainerDied","Data":"9b9f6c28fae2fc933e7753375d600631f76ffa276d3aacea7b878d0bde94393e"}
Apr 22 15:14:28.140265 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.140203 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"
Apr 22 15:14:28.140265 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.140211 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc" event={"ID":"cc799a82-3f21-4954-8aeb-418dfe2e32d9","Type":"ContainerDied","Data":"f47be2fe222d6af9ce948fa87b8f309dbfe2b36eb911388bf7774d3ee8ce52c5"}
Apr 22 15:14:28.140265 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.140227 2576 scope.go:117] "RemoveContainer" containerID="9b9f6c28fae2fc933e7753375d600631f76ffa276d3aacea7b878d0bde94393e"
Apr 22 15:14:28.147757 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.147734 2576 scope.go:117] "RemoveContainer" containerID="710bfa3e4da98b270162521254c5a7967cbeac8ce388fe0ec281bd65cf08e8df"
Apr 22 15:14:28.154101 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.154085 2576 scope.go:117] "RemoveContainer" containerID="9b9f6c28fae2fc933e7753375d600631f76ffa276d3aacea7b878d0bde94393e"
Apr 22 15:14:28.154322 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:14:28.154303 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b9f6c28fae2fc933e7753375d600631f76ffa276d3aacea7b878d0bde94393e\": container with ID starting with 9b9f6c28fae2fc933e7753375d600631f76ffa276d3aacea7b878d0bde94393e not found: ID does not exist" containerID="9b9f6c28fae2fc933e7753375d600631f76ffa276d3aacea7b878d0bde94393e"
Apr 22 15:14:28.154399 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.154333 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b9f6c28fae2fc933e7753375d600631f76ffa276d3aacea7b878d0bde94393e"} err="failed to get container status \"9b9f6c28fae2fc933e7753375d600631f76ffa276d3aacea7b878d0bde94393e\": rpc error: code = NotFound desc = could not find container \"9b9f6c28fae2fc933e7753375d600631f76ffa276d3aacea7b878d0bde94393e\": container with ID starting with 9b9f6c28fae2fc933e7753375d600631f76ffa276d3aacea7b878d0bde94393e not found: ID does not exist"
Apr 22 15:14:28.154399 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.154357 2576 scope.go:117] "RemoveContainer" containerID="710bfa3e4da98b270162521254c5a7967cbeac8ce388fe0ec281bd65cf08e8df"
Apr 22 15:14:28.154573 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:14:28.154557 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"710bfa3e4da98b270162521254c5a7967cbeac8ce388fe0ec281bd65cf08e8df\": container with ID starting with 710bfa3e4da98b270162521254c5a7967cbeac8ce388fe0ec281bd65cf08e8df not found: ID does not exist" containerID="710bfa3e4da98b270162521254c5a7967cbeac8ce388fe0ec281bd65cf08e8df"
Apr 22 15:14:28.154611 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.154579 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"710bfa3e4da98b270162521254c5a7967cbeac8ce388fe0ec281bd65cf08e8df"} err="failed to get container status \"710bfa3e4da98b270162521254c5a7967cbeac8ce388fe0ec281bd65cf08e8df\": rpc error: code = NotFound desc = could not find container \"710bfa3e4da98b270162521254c5a7967cbeac8ce388fe0ec281bd65cf08e8df\": container with ID starting with 710bfa3e4da98b270162521254c5a7967cbeac8ce388fe0ec281bd65cf08e8df not found: ID does not exist"
Apr 22 15:14:28.161524 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.161508 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"]
Apr 22 15:14:28.162585 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.162566 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc"]
Apr 22 15:14:28.291223 ip-10-0-142-195 kubenswrapper[2576]:
I0422 15:14:28.291190 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc799a82-3f21-4954-8aeb-418dfe2e32d9" path="/var/lib/kubelet/pods/cc799a82-3f21-4954-8aeb-418dfe2e32d9/volumes" Apr 22 15:14:28.951394 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:28.951346 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-2rjjc" podUID="cc799a82-3f21-4954-8aeb-418dfe2e32d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: i/o timeout" Apr 22 15:14:31.150895 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:31.150869 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65_8c976f43-333f-4cf3-818f-160110bdab7e/storage-initializer/0.log" Apr 22 15:14:31.151233 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:31.150908 2576 generic.go:358] "Generic (PLEG): container finished" podID="8c976f43-333f-4cf3-818f-160110bdab7e" containerID="6b23d88f113679b0ffe4bbe5a842e86729cbcec3c9982d79b1f655ff12b0e9f0" exitCode=1 Apr 22 15:14:31.151233 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:31.150933 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65" event={"ID":"8c976f43-333f-4cf3-818f-160110bdab7e","Type":"ContainerDied","Data":"6b23d88f113679b0ffe4bbe5a842e86729cbcec3c9982d79b1f655ff12b0e9f0"} Apr 22 15:14:32.155234 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:32.155190 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65_8c976f43-333f-4cf3-818f-160110bdab7e/storage-initializer/0.log" Apr 22 15:14:32.155587 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:32.155275 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65" event={"ID":"8c976f43-333f-4cf3-818f-160110bdab7e","Type":"ContainerStarted","Data":"da29649b8d4142d243f8092aa2fe12527e758a191fb09fbe269655fbdc643634"} Apr 22 15:14:35.232522 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:35.232491 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65"] Apr 22 15:14:35.232963 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:35.232720 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65" podUID="8c976f43-333f-4cf3-818f-160110bdab7e" containerName="storage-initializer" containerID="cri-o://da29649b8d4142d243f8092aa2fe12527e758a191fb09fbe269655fbdc643634" gracePeriod=30 Apr 22 15:14:36.166160 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.166137 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65_8c976f43-333f-4cf3-818f-160110bdab7e/storage-initializer/1.log" Apr 22 15:14:36.166445 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.166425 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65_8c976f43-333f-4cf3-818f-160110bdab7e/storage-initializer/0.log" Apr 22 15:14:36.166530 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.166487 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65_8c976f43-333f-4cf3-818f-160110bdab7e/storage-initializer/1.log" Apr 22 15:14:36.166530 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.166504 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65" Apr 22 15:14:36.166839 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.166822 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65_8c976f43-333f-4cf3-818f-160110bdab7e/storage-initializer/0.log" Apr 22 15:14:36.166917 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.166858 2576 generic.go:358] "Generic (PLEG): container finished" podID="8c976f43-333f-4cf3-818f-160110bdab7e" containerID="da29649b8d4142d243f8092aa2fe12527e758a191fb09fbe269655fbdc643634" exitCode=1 Apr 22 15:14:36.166980 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.166924 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65" event={"ID":"8c976f43-333f-4cf3-818f-160110bdab7e","Type":"ContainerDied","Data":"da29649b8d4142d243f8092aa2fe12527e758a191fb09fbe269655fbdc643634"} Apr 22 15:14:36.166980 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.166956 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65" event={"ID":"8c976f43-333f-4cf3-818f-160110bdab7e","Type":"ContainerDied","Data":"e1b2ec95e85b9d1d170a143e850a725ec9d71bcb2dd52cb8b4efe6e90e3bf2c3"} Apr 22 15:14:36.166980 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.166977 2576 scope.go:117] "RemoveContainer" containerID="da29649b8d4142d243f8092aa2fe12527e758a191fb09fbe269655fbdc643634" Apr 22 15:14:36.173934 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.173918 2576 scope.go:117] "RemoveContainer" containerID="6b23d88f113679b0ffe4bbe5a842e86729cbcec3c9982d79b1f655ff12b0e9f0" Apr 22 15:14:36.180176 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.180159 2576 scope.go:117] "RemoveContainer" 
containerID="da29649b8d4142d243f8092aa2fe12527e758a191fb09fbe269655fbdc643634" Apr 22 15:14:36.180391 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:14:36.180371 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da29649b8d4142d243f8092aa2fe12527e758a191fb09fbe269655fbdc643634\": container with ID starting with da29649b8d4142d243f8092aa2fe12527e758a191fb09fbe269655fbdc643634 not found: ID does not exist" containerID="da29649b8d4142d243f8092aa2fe12527e758a191fb09fbe269655fbdc643634" Apr 22 15:14:36.180442 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.180421 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da29649b8d4142d243f8092aa2fe12527e758a191fb09fbe269655fbdc643634"} err="failed to get container status \"da29649b8d4142d243f8092aa2fe12527e758a191fb09fbe269655fbdc643634\": rpc error: code = NotFound desc = could not find container \"da29649b8d4142d243f8092aa2fe12527e758a191fb09fbe269655fbdc643634\": container with ID starting with da29649b8d4142d243f8092aa2fe12527e758a191fb09fbe269655fbdc643634 not found: ID does not exist" Apr 22 15:14:36.180489 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.180443 2576 scope.go:117] "RemoveContainer" containerID="6b23d88f113679b0ffe4bbe5a842e86729cbcec3c9982d79b1f655ff12b0e9f0" Apr 22 15:14:36.182195 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:14:36.182162 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b23d88f113679b0ffe4bbe5a842e86729cbcec3c9982d79b1f655ff12b0e9f0\": container with ID starting with 6b23d88f113679b0ffe4bbe5a842e86729cbcec3c9982d79b1f655ff12b0e9f0 not found: ID does not exist" containerID="6b23d88f113679b0ffe4bbe5a842e86729cbcec3c9982d79b1f655ff12b0e9f0" Apr 22 15:14:36.182294 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.182193 2576 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"6b23d88f113679b0ffe4bbe5a842e86729cbcec3c9982d79b1f655ff12b0e9f0"} err="failed to get container status \"6b23d88f113679b0ffe4bbe5a842e86729cbcec3c9982d79b1f655ff12b0e9f0\": rpc error: code = NotFound desc = could not find container \"6b23d88f113679b0ffe4bbe5a842e86729cbcec3c9982d79b1f655ff12b0e9f0\": container with ID starting with 6b23d88f113679b0ffe4bbe5a842e86729cbcec3c9982d79b1f655ff12b0e9f0 not found: ID does not exist" Apr 22 15:14:36.287046 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.287020 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c976f43-333f-4cf3-818f-160110bdab7e-kserve-provision-location\") pod \"8c976f43-333f-4cf3-818f-160110bdab7e\" (UID: \"8c976f43-333f-4cf3-818f-160110bdab7e\") " Apr 22 15:14:36.287392 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.287247 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c976f43-333f-4cf3-818f-160110bdab7e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8c976f43-333f-4cf3-818f-160110bdab7e" (UID: "8c976f43-333f-4cf3-818f-160110bdab7e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:14:36.296526 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.296503 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7"] Apr 22 15:14:36.296743 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.296732 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c976f43-333f-4cf3-818f-160110bdab7e" containerName="storage-initializer" Apr 22 15:14:36.296787 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.296745 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c976f43-333f-4cf3-818f-160110bdab7e" containerName="storage-initializer" Apr 22 15:14:36.296787 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.296757 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc799a82-3f21-4954-8aeb-418dfe2e32d9" containerName="storage-initializer" Apr 22 15:14:36.296787 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.296762 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc799a82-3f21-4954-8aeb-418dfe2e32d9" containerName="storage-initializer" Apr 22 15:14:36.296787 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.296769 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c976f43-333f-4cf3-818f-160110bdab7e" containerName="storage-initializer" Apr 22 15:14:36.296787 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.296775 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c976f43-333f-4cf3-818f-160110bdab7e" containerName="storage-initializer" Apr 22 15:14:36.296787 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.296782 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc799a82-3f21-4954-8aeb-418dfe2e32d9" containerName="kserve-container" Apr 22 15:14:36.296787 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.296787 2576 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="cc799a82-3f21-4954-8aeb-418dfe2e32d9" containerName="kserve-container" Apr 22 15:14:36.297008 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.296853 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8c976f43-333f-4cf3-818f-160110bdab7e" containerName="storage-initializer" Apr 22 15:14:36.297008 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.296861 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc799a82-3f21-4954-8aeb-418dfe2e32d9" containerName="kserve-container" Apr 22 15:14:36.297008 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.296869 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8c976f43-333f-4cf3-818f-160110bdab7e" containerName="storage-initializer" Apr 22 15:14:36.300785 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.300769 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" Apr 22 15:14:36.303476 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.303459 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 22 15:14:36.307582 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.307559 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7"] Apr 22 15:14:36.388428 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.388402 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a0c1417-3198-4fe4-a796-531e60aded51-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7\" (UID: \"6a0c1417-3198-4fe4-a796-531e60aded51\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" Apr 22 
15:14:36.388796 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.388437 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6a0c1417-3198-4fe4-a796-531e60aded51-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7\" (UID: \"6a0c1417-3198-4fe4-a796-531e60aded51\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" Apr 22 15:14:36.388916 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.388895 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c976f43-333f-4cf3-818f-160110bdab7e-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 15:14:36.489516 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.489487 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6a0c1417-3198-4fe4-a796-531e60aded51-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7\" (UID: \"6a0c1417-3198-4fe4-a796-531e60aded51\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" Apr 22 15:14:36.489645 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.489548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a0c1417-3198-4fe4-a796-531e60aded51-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7\" (UID: \"6a0c1417-3198-4fe4-a796-531e60aded51\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" Apr 22 15:14:36.489874 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.489857 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/6a0c1417-3198-4fe4-a796-531e60aded51-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7\" (UID: \"6a0c1417-3198-4fe4-a796-531e60aded51\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" Apr 22 15:14:36.490128 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.490107 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6a0c1417-3198-4fe4-a796-531e60aded51-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7\" (UID: \"6a0c1417-3198-4fe4-a796-531e60aded51\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" Apr 22 15:14:36.609483 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.609418 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" Apr 22 15:14:36.721174 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:36.721150 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7"] Apr 22 15:14:36.723427 ip-10-0-142-195 kubenswrapper[2576]: W0422 15:14:36.723393 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a0c1417_3198_4fe4_a796_531e60aded51.slice/crio-f5067453740b5193b232df150fd55bb908aec732ff70fd443f6745f0fc00ce64 WatchSource:0}: Error finding container f5067453740b5193b232df150fd55bb908aec732ff70fd443f6745f0fc00ce64: Status 404 returned error can't find the container with id f5067453740b5193b232df150fd55bb908aec732ff70fd443f6745f0fc00ce64 Apr 22 15:14:37.170550 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:37.170520 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65" Apr 22 15:14:37.171827 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:37.171786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" event={"ID":"6a0c1417-3198-4fe4-a796-531e60aded51","Type":"ContainerStarted","Data":"e32cc37c347ea03534949db62f6bea9ca20ea7aa9e4982dfa6e7818cb3a3c774"} Apr 22 15:14:37.171945 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:37.171830 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" event={"ID":"6a0c1417-3198-4fe4-a796-531e60aded51","Type":"ContainerStarted","Data":"f5067453740b5193b232df150fd55bb908aec732ff70fd443f6745f0fc00ce64"} Apr 22 15:14:37.214625 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:37.214595 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65"] Apr 22 15:14:37.218163 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:37.218142 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-crc65"] Apr 22 15:14:38.175579 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:38.175545 2576 generic.go:358] "Generic (PLEG): container finished" podID="6a0c1417-3198-4fe4-a796-531e60aded51" containerID="e32cc37c347ea03534949db62f6bea9ca20ea7aa9e4982dfa6e7818cb3a3c774" exitCode=0 Apr 22 15:14:38.175980 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:38.175606 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" event={"ID":"6a0c1417-3198-4fe4-a796-531e60aded51","Type":"ContainerDied","Data":"e32cc37c347ea03534949db62f6bea9ca20ea7aa9e4982dfa6e7818cb3a3c774"} Apr 22 15:14:38.291399 ip-10-0-142-195 
kubenswrapper[2576]: I0422 15:14:38.291366 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c976f43-333f-4cf3-818f-160110bdab7e" path="/var/lib/kubelet/pods/8c976f43-333f-4cf3-818f-160110bdab7e/volumes" Apr 22 15:14:39.179592 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:39.179551 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" event={"ID":"6a0c1417-3198-4fe4-a796-531e60aded51","Type":"ContainerStarted","Data":"435314f6932b1144e795b5dca34fe8ac7c11dd941b35943fd6e4e9414f4b571e"} Apr 22 15:14:39.180026 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:39.179783 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" Apr 22 15:14:39.181096 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:39.181071 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" podUID="6a0c1417-3198-4fe4-a796-531e60aded51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 22 15:14:39.196219 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:39.196183 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" podStartSLOduration=3.196172045 podStartE2EDuration="3.196172045s" podCreationTimestamp="2026-04-22 15:14:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:14:39.19566237 +0000 UTC m=+3549.521396786" watchObservedRunningTime="2026-04-22 15:14:39.196172045 +0000 UTC m=+3549.521906443" Apr 22 15:14:40.183085 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:40.183043 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" podUID="6a0c1417-3198-4fe4-a796-531e60aded51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 22 15:14:50.183726 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:14:50.183690 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" podUID="6a0c1417-3198-4fe4-a796-531e60aded51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 22 15:15:00.183335 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:00.183291 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" podUID="6a0c1417-3198-4fe4-a796-531e60aded51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 22 15:15:10.183273 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:10.183191 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" podUID="6a0c1417-3198-4fe4-a796-531e60aded51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 22 15:15:20.183636 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:20.183593 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" podUID="6a0c1417-3198-4fe4-a796-531e60aded51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 22 15:15:30.183741 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:30.183702 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" podUID="6a0c1417-3198-4fe4-a796-531e60aded51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 22 15:15:31.118244 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:31.118218 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 15:15:31.125914 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:31.125897 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 15:15:40.184829 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:40.184782 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" Apr 22 15:15:46.413935 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:46.413904 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7"] Apr 22 15:15:46.414405 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:46.414136 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" podUID="6a0c1417-3198-4fe4-a796-531e60aded51" containerName="kserve-container" containerID="cri-o://435314f6932b1144e795b5dca34fe8ac7c11dd941b35943fd6e4e9414f4b571e" gracePeriod=30 Apr 22 15:15:47.451278 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:47.451248 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9"] Apr 22 15:15:47.454444 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:47.454426 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9" Apr 22 15:15:47.472742 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:47.472716 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9"] Apr 22 15:15:47.564027 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:47.564000 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6e97f11-dec4-44fa-a279-0a86c425c064-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9\" (UID: \"e6e97f11-dec4-44fa-a279-0a86c425c064\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9" Apr 22 15:15:47.665219 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:47.665189 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6e97f11-dec4-44fa-a279-0a86c425c064-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9\" (UID: \"e6e97f11-dec4-44fa-a279-0a86c425c064\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9" Apr 22 15:15:47.665534 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:47.665515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6e97f11-dec4-44fa-a279-0a86c425c064-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9\" (UID: \"e6e97f11-dec4-44fa-a279-0a86c425c064\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9" Apr 22 15:15:47.763619 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:47.763591 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9" Apr 22 15:15:47.879846 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:47.879823 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9"] Apr 22 15:15:47.881923 ip-10-0-142-195 kubenswrapper[2576]: W0422 15:15:47.881893 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6e97f11_dec4_44fa_a279_0a86c425c064.slice/crio-879d9db50c079e52fd60e7281a5c62824d0d73e016fb0bf7c33e2e4d9d6ebf3d WatchSource:0}: Error finding container 879d9db50c079e52fd60e7281a5c62824d0d73e016fb0bf7c33e2e4d9d6ebf3d: Status 404 returned error can't find the container with id 879d9db50c079e52fd60e7281a5c62824d0d73e016fb0bf7c33e2e4d9d6ebf3d Apr 22 15:15:48.356104 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:48.356067 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9" event={"ID":"e6e97f11-dec4-44fa-a279-0a86c425c064","Type":"ContainerStarted","Data":"004a9a80bc554f53aea9f014440c260b7ff26522e947143905f099a46280868b"} Apr 22 15:15:48.356267 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:48.356108 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9" event={"ID":"e6e97f11-dec4-44fa-a279-0a86c425c064","Type":"ContainerStarted","Data":"879d9db50c079e52fd60e7281a5c62824d0d73e016fb0bf7c33e2e4d9d6ebf3d"} Apr 22 15:15:49.844926 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:49.844905 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" Apr 22 15:15:49.986046 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:49.986026 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a0c1417-3198-4fe4-a796-531e60aded51-kserve-provision-location\") pod \"6a0c1417-3198-4fe4-a796-531e60aded51\" (UID: \"6a0c1417-3198-4fe4-a796-531e60aded51\") " Apr 22 15:15:49.986188 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:49.986096 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6a0c1417-3198-4fe4-a796-531e60aded51-cabundle-cert\") pod \"6a0c1417-3198-4fe4-a796-531e60aded51\" (UID: \"6a0c1417-3198-4fe4-a796-531e60aded51\") " Apr 22 15:15:49.986373 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:49.986347 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a0c1417-3198-4fe4-a796-531e60aded51-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6a0c1417-3198-4fe4-a796-531e60aded51" (UID: "6a0c1417-3198-4fe4-a796-531e60aded51"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:15:49.986418 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:49.986383 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a0c1417-3198-4fe4-a796-531e60aded51-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "6a0c1417-3198-4fe4-a796-531e60aded51" (UID: "6a0c1417-3198-4fe4-a796-531e60aded51"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:15:50.086650 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:50.086628 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a0c1417-3198-4fe4-a796-531e60aded51-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 15:15:50.086650 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:50.086648 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6a0c1417-3198-4fe4-a796-531e60aded51-cabundle-cert\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 15:15:50.361740 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:50.361720 2576 generic.go:358] "Generic (PLEG): container finished" podID="6a0c1417-3198-4fe4-a796-531e60aded51" containerID="435314f6932b1144e795b5dca34fe8ac7c11dd941b35943fd6e4e9414f4b571e" exitCode=0 Apr 22 15:15:50.361845 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:50.361768 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" event={"ID":"6a0c1417-3198-4fe4-a796-531e60aded51","Type":"ContainerDied","Data":"435314f6932b1144e795b5dca34fe8ac7c11dd941b35943fd6e4e9414f4b571e"} Apr 22 15:15:50.361845 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:50.361778 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" Apr 22 15:15:50.361845 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:50.361786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7" event={"ID":"6a0c1417-3198-4fe4-a796-531e60aded51","Type":"ContainerDied","Data":"f5067453740b5193b232df150fd55bb908aec732ff70fd443f6745f0fc00ce64"} Apr 22 15:15:50.361845 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:50.361800 2576 scope.go:117] "RemoveContainer" containerID="435314f6932b1144e795b5dca34fe8ac7c11dd941b35943fd6e4e9414f4b571e" Apr 22 15:15:50.368778 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:50.368756 2576 scope.go:117] "RemoveContainer" containerID="e32cc37c347ea03534949db62f6bea9ca20ea7aa9e4982dfa6e7818cb3a3c774" Apr 22 15:15:50.375139 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:50.375121 2576 scope.go:117] "RemoveContainer" containerID="435314f6932b1144e795b5dca34fe8ac7c11dd941b35943fd6e4e9414f4b571e" Apr 22 15:15:50.375399 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:15:50.375380 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"435314f6932b1144e795b5dca34fe8ac7c11dd941b35943fd6e4e9414f4b571e\": container with ID starting with 435314f6932b1144e795b5dca34fe8ac7c11dd941b35943fd6e4e9414f4b571e not found: ID does not exist" containerID="435314f6932b1144e795b5dca34fe8ac7c11dd941b35943fd6e4e9414f4b571e" Apr 22 15:15:50.375446 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:50.375407 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"435314f6932b1144e795b5dca34fe8ac7c11dd941b35943fd6e4e9414f4b571e"} err="failed to get container status \"435314f6932b1144e795b5dca34fe8ac7c11dd941b35943fd6e4e9414f4b571e\": rpc error: code = NotFound desc = could not find container 
\"435314f6932b1144e795b5dca34fe8ac7c11dd941b35943fd6e4e9414f4b571e\": container with ID starting with 435314f6932b1144e795b5dca34fe8ac7c11dd941b35943fd6e4e9414f4b571e not found: ID does not exist" Apr 22 15:15:50.375446 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:50.375425 2576 scope.go:117] "RemoveContainer" containerID="e32cc37c347ea03534949db62f6bea9ca20ea7aa9e4982dfa6e7818cb3a3c774" Apr 22 15:15:50.375660 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:15:50.375634 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e32cc37c347ea03534949db62f6bea9ca20ea7aa9e4982dfa6e7818cb3a3c774\": container with ID starting with e32cc37c347ea03534949db62f6bea9ca20ea7aa9e4982dfa6e7818cb3a3c774 not found: ID does not exist" containerID="e32cc37c347ea03534949db62f6bea9ca20ea7aa9e4982dfa6e7818cb3a3c774" Apr 22 15:15:50.375741 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:50.375661 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e32cc37c347ea03534949db62f6bea9ca20ea7aa9e4982dfa6e7818cb3a3c774"} err="failed to get container status \"e32cc37c347ea03534949db62f6bea9ca20ea7aa9e4982dfa6e7818cb3a3c774\": rpc error: code = NotFound desc = could not find container \"e32cc37c347ea03534949db62f6bea9ca20ea7aa9e4982dfa6e7818cb3a3c774\": container with ID starting with e32cc37c347ea03534949db62f6bea9ca20ea7aa9e4982dfa6e7818cb3a3c774 not found: ID does not exist" Apr 22 15:15:50.380508 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:50.380487 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7"] Apr 22 15:15:50.384259 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:50.384235 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-np2f7"] Apr 22 15:15:52.291953 ip-10-0-142-195 kubenswrapper[2576]: 
I0422 15:15:52.291921 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a0c1417-3198-4fe4-a796-531e60aded51" path="/var/lib/kubelet/pods/6a0c1417-3198-4fe4-a796-531e60aded51/volumes" Apr 22 15:15:53.371470 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:53.371448 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9_e6e97f11-dec4-44fa-a279-0a86c425c064/storage-initializer/0.log" Apr 22 15:15:53.371769 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:53.371481 2576 generic.go:358] "Generic (PLEG): container finished" podID="e6e97f11-dec4-44fa-a279-0a86c425c064" containerID="004a9a80bc554f53aea9f014440c260b7ff26522e947143905f099a46280868b" exitCode=1 Apr 22 15:15:53.371769 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:53.371538 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9" event={"ID":"e6e97f11-dec4-44fa-a279-0a86c425c064","Type":"ContainerDied","Data":"004a9a80bc554f53aea9f014440c260b7ff26522e947143905f099a46280868b"} Apr 22 15:15:54.375906 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:54.375873 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9_e6e97f11-dec4-44fa-a279-0a86c425c064/storage-initializer/0.log" Apr 22 15:15:54.376253 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:54.375919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9" event={"ID":"e6e97f11-dec4-44fa-a279-0a86c425c064","Type":"ContainerStarted","Data":"e45ef5c011bee7395cf165b3a7d29831ba141eba791a19f4bb4f42344dc901f7"} Apr 22 15:15:57.409398 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:57.409370 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9"] Apr 22 15:15:57.409752 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:57.409561 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9" podUID="e6e97f11-dec4-44fa-a279-0a86c425c064" containerName="storage-initializer" containerID="cri-o://e45ef5c011bee7395cf165b3a7d29831ba141eba791a19f4bb4f42344dc901f7" gracePeriod=30 Apr 22 15:15:57.843387 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:57.843366 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9_e6e97f11-dec4-44fa-a279-0a86c425c064/storage-initializer/1.log" Apr 22 15:15:57.843730 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:57.843714 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9_e6e97f11-dec4-44fa-a279-0a86c425c064/storage-initializer/0.log" Apr 22 15:15:57.843793 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:57.843784 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9" Apr 22 15:15:57.939316 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:57.939265 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6e97f11-dec4-44fa-a279-0a86c425c064-kserve-provision-location\") pod \"e6e97f11-dec4-44fa-a279-0a86c425c064\" (UID: \"e6e97f11-dec4-44fa-a279-0a86c425c064\") " Apr 22 15:15:57.939492 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:57.939472 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6e97f11-dec4-44fa-a279-0a86c425c064-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e6e97f11-dec4-44fa-a279-0a86c425c064" (UID: "e6e97f11-dec4-44fa-a279-0a86c425c064"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:15:58.040405 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:58.040386 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6e97f11-dec4-44fa-a279-0a86c425c064-kserve-provision-location\") on node \"ip-10-0-142-195.ec2.internal\" DevicePath \"\"" Apr 22 15:15:58.387147 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:58.387129 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9_e6e97f11-dec4-44fa-a279-0a86c425c064/storage-initializer/1.log" Apr 22 15:15:58.387472 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:58.387456 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9_e6e97f11-dec4-44fa-a279-0a86c425c064/storage-initializer/0.log" Apr 22 15:15:58.387555 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:58.387493 2576 
generic.go:358] "Generic (PLEG): container finished" podID="e6e97f11-dec4-44fa-a279-0a86c425c064" containerID="e45ef5c011bee7395cf165b3a7d29831ba141eba791a19f4bb4f42344dc901f7" exitCode=1 Apr 22 15:15:58.387555 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:58.387544 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9" Apr 22 15:15:58.387632 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:58.387569 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9" event={"ID":"e6e97f11-dec4-44fa-a279-0a86c425c064","Type":"ContainerDied","Data":"e45ef5c011bee7395cf165b3a7d29831ba141eba791a19f4bb4f42344dc901f7"} Apr 22 15:15:58.387632 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:58.387597 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9" event={"ID":"e6e97f11-dec4-44fa-a279-0a86c425c064","Type":"ContainerDied","Data":"879d9db50c079e52fd60e7281a5c62824d0d73e016fb0bf7c33e2e4d9d6ebf3d"} Apr 22 15:15:58.387632 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:58.387611 2576 scope.go:117] "RemoveContainer" containerID="e45ef5c011bee7395cf165b3a7d29831ba141eba791a19f4bb4f42344dc901f7" Apr 22 15:15:58.396164 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:58.396146 2576 scope.go:117] "RemoveContainer" containerID="004a9a80bc554f53aea9f014440c260b7ff26522e947143905f099a46280868b" Apr 22 15:15:58.402574 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:58.402556 2576 scope.go:117] "RemoveContainer" containerID="e45ef5c011bee7395cf165b3a7d29831ba141eba791a19f4bb4f42344dc901f7" Apr 22 15:15:58.402791 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:15:58.402774 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e45ef5c011bee7395cf165b3a7d29831ba141eba791a19f4bb4f42344dc901f7\": container with ID starting with e45ef5c011bee7395cf165b3a7d29831ba141eba791a19f4bb4f42344dc901f7 not found: ID does not exist" containerID="e45ef5c011bee7395cf165b3a7d29831ba141eba791a19f4bb4f42344dc901f7" Apr 22 15:15:58.402851 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:58.402798 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e45ef5c011bee7395cf165b3a7d29831ba141eba791a19f4bb4f42344dc901f7"} err="failed to get container status \"e45ef5c011bee7395cf165b3a7d29831ba141eba791a19f4bb4f42344dc901f7\": rpc error: code = NotFound desc = could not find container \"e45ef5c011bee7395cf165b3a7d29831ba141eba791a19f4bb4f42344dc901f7\": container with ID starting with e45ef5c011bee7395cf165b3a7d29831ba141eba791a19f4bb4f42344dc901f7 not found: ID does not exist" Apr 22 15:15:58.402851 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:58.402827 2576 scope.go:117] "RemoveContainer" containerID="004a9a80bc554f53aea9f014440c260b7ff26522e947143905f099a46280868b" Apr 22 15:15:58.403053 ip-10-0-142-195 kubenswrapper[2576]: E0422 15:15:58.403037 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"004a9a80bc554f53aea9f014440c260b7ff26522e947143905f099a46280868b\": container with ID starting with 004a9a80bc554f53aea9f014440c260b7ff26522e947143905f099a46280868b not found: ID does not exist" containerID="004a9a80bc554f53aea9f014440c260b7ff26522e947143905f099a46280868b" Apr 22 15:15:58.403104 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:58.403055 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"004a9a80bc554f53aea9f014440c260b7ff26522e947143905f099a46280868b"} err="failed to get container status \"004a9a80bc554f53aea9f014440c260b7ff26522e947143905f099a46280868b\": rpc error: code = NotFound desc = could not find container 
\"004a9a80bc554f53aea9f014440c260b7ff26522e947143905f099a46280868b\": container with ID starting with 004a9a80bc554f53aea9f014440c260b7ff26522e947143905f099a46280868b not found: ID does not exist" Apr 22 15:15:58.420525 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:58.420503 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9"] Apr 22 15:15:58.424339 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:15:58.424319 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-5vds9"] Apr 22 15:16:00.296296 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:00.296271 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6e97f11-dec4-44fa-a279-0a86c425c064" path="/var/lib/kubelet/pods/e6e97f11-dec4-44fa-a279-0a86c425c064/volumes" Apr 22 15:16:32.105580 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:32.105555 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fmwsm_634576c9-9317-4b50-810b-568419af85ba/global-pull-secret-syncer/0.log" Apr 22 15:16:32.240894 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:32.240871 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-mj62z_2b976276-fdc8-4595-a9a4-76cc1b34317e/konnectivity-agent/0.log" Apr 22 15:16:32.327951 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:32.327928 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-195.ec2.internal_24b00f41063a53ab6b85ee845aa19b10/haproxy/0.log" Apr 22 15:16:35.496517 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:35.496420 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2bnpq_56ff2a83-14c9-44a2-99cb-54c501242f8a/node-exporter/0.log" Apr 22 15:16:35.534852 ip-10-0-142-195 kubenswrapper[2576]: I0422 
15:16:35.534829 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2bnpq_56ff2a83-14c9-44a2-99cb-54c501242f8a/kube-rbac-proxy/0.log" Apr 22 15:16:35.567781 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:35.567765 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2bnpq_56ff2a83-14c9-44a2-99cb-54c501242f8a/init-textfile/0.log" Apr 22 15:16:39.233429 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.233359 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w"] Apr 22 15:16:39.233748 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.233587 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a0c1417-3198-4fe4-a796-531e60aded51" containerName="storage-initializer" Apr 22 15:16:39.233748 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.233597 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0c1417-3198-4fe4-a796-531e60aded51" containerName="storage-initializer" Apr 22 15:16:39.233748 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.233606 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6e97f11-dec4-44fa-a279-0a86c425c064" containerName="storage-initializer" Apr 22 15:16:39.233748 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.233612 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e97f11-dec4-44fa-a279-0a86c425c064" containerName="storage-initializer" Apr 22 15:16:39.233748 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.233632 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a0c1417-3198-4fe4-a796-531e60aded51" containerName="kserve-container" Apr 22 15:16:39.233748 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.233638 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0c1417-3198-4fe4-a796-531e60aded51" containerName="kserve-container" Apr 22 
15:16:39.233748 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.233673 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6e97f11-dec4-44fa-a279-0a86c425c064" containerName="storage-initializer" Apr 22 15:16:39.233748 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.233681 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6e97f11-dec4-44fa-a279-0a86c425c064" containerName="storage-initializer" Apr 22 15:16:39.233748 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.233689 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a0c1417-3198-4fe4-a796-531e60aded51" containerName="kserve-container" Apr 22 15:16:39.236466 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.236450 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:39.239143 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.239121 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zm2sq\"/\"default-dockercfg-jkn8s\"" Apr 22 15:16:39.239407 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.239388 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zm2sq\"/\"openshift-service-ca.crt\"" Apr 22 15:16:39.240606 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.240589 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zm2sq\"/\"kube-root-ca.crt\"" Apr 22 15:16:39.244833 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.244787 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w"] Apr 22 15:16:39.304580 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.304563 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/6568cb18-cc6f-4736-aae0-96e249f7e5b6-lib-modules\") pod \"perf-node-gather-daemonset-n5t5w\" (UID: \"6568cb18-cc6f-4736-aae0-96e249f7e5b6\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:39.304675 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.304599 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6568cb18-cc6f-4736-aae0-96e249f7e5b6-proc\") pod \"perf-node-gather-daemonset-n5t5w\" (UID: \"6568cb18-cc6f-4736-aae0-96e249f7e5b6\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:39.304675 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.304626 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82ftf\" (UniqueName: \"kubernetes.io/projected/6568cb18-cc6f-4736-aae0-96e249f7e5b6-kube-api-access-82ftf\") pod \"perf-node-gather-daemonset-n5t5w\" (UID: \"6568cb18-cc6f-4736-aae0-96e249f7e5b6\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:39.304747 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.304675 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6568cb18-cc6f-4736-aae0-96e249f7e5b6-sys\") pod \"perf-node-gather-daemonset-n5t5w\" (UID: \"6568cb18-cc6f-4736-aae0-96e249f7e5b6\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:39.304747 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.304711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6568cb18-cc6f-4736-aae0-96e249f7e5b6-podres\") pod \"perf-node-gather-daemonset-n5t5w\" (UID: \"6568cb18-cc6f-4736-aae0-96e249f7e5b6\") " 
pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:39.405633 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.405612 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6568cb18-cc6f-4736-aae0-96e249f7e5b6-proc\") pod \"perf-node-gather-daemonset-n5t5w\" (UID: \"6568cb18-cc6f-4736-aae0-96e249f7e5b6\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:39.405709 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.405644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82ftf\" (UniqueName: \"kubernetes.io/projected/6568cb18-cc6f-4736-aae0-96e249f7e5b6-kube-api-access-82ftf\") pod \"perf-node-gather-daemonset-n5t5w\" (UID: \"6568cb18-cc6f-4736-aae0-96e249f7e5b6\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:39.405709 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.405661 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6568cb18-cc6f-4736-aae0-96e249f7e5b6-sys\") pod \"perf-node-gather-daemonset-n5t5w\" (UID: \"6568cb18-cc6f-4736-aae0-96e249f7e5b6\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:39.405709 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.405677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6568cb18-cc6f-4736-aae0-96e249f7e5b6-podres\") pod \"perf-node-gather-daemonset-n5t5w\" (UID: \"6568cb18-cc6f-4736-aae0-96e249f7e5b6\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:39.405851 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.405711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/6568cb18-cc6f-4736-aae0-96e249f7e5b6-lib-modules\") pod \"perf-node-gather-daemonset-n5t5w\" (UID: \"6568cb18-cc6f-4736-aae0-96e249f7e5b6\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:39.405851 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.405722 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6568cb18-cc6f-4736-aae0-96e249f7e5b6-proc\") pod \"perf-node-gather-daemonset-n5t5w\" (UID: \"6568cb18-cc6f-4736-aae0-96e249f7e5b6\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:39.405851 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.405752 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6568cb18-cc6f-4736-aae0-96e249f7e5b6-sys\") pod \"perf-node-gather-daemonset-n5t5w\" (UID: \"6568cb18-cc6f-4736-aae0-96e249f7e5b6\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:39.405851 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.405795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6568cb18-cc6f-4736-aae0-96e249f7e5b6-podres\") pod \"perf-node-gather-daemonset-n5t5w\" (UID: \"6568cb18-cc6f-4736-aae0-96e249f7e5b6\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:39.405851 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.405844 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6568cb18-cc6f-4736-aae0-96e249f7e5b6-lib-modules\") pod \"perf-node-gather-daemonset-n5t5w\" (UID: \"6568cb18-cc6f-4736-aae0-96e249f7e5b6\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:39.414344 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.414326 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-82ftf\" (UniqueName: \"kubernetes.io/projected/6568cb18-cc6f-4736-aae0-96e249f7e5b6-kube-api-access-82ftf\") pod \"perf-node-gather-daemonset-n5t5w\" (UID: \"6568cb18-cc6f-4736-aae0-96e249f7e5b6\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:39.546300 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.546254 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:39.661718 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.661686 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w"] Apr 22 15:16:39.663798 ip-10-0-142-195 kubenswrapper[2576]: W0422 15:16:39.663770 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6568cb18_cc6f_4736_aae0_96e249f7e5b6.slice/crio-ae82f369c8386769cebcec2f931229d2e3d6d2dee1112db2c0a04dfc91b2ce24 WatchSource:0}: Error finding container ae82f369c8386769cebcec2f931229d2e3d6d2dee1112db2c0a04dfc91b2ce24: Status 404 returned error can't find the container with id ae82f369c8386769cebcec2f931229d2e3d6d2dee1112db2c0a04dfc91b2ce24 Apr 22 15:16:39.855509 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.855451 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-v29ss_9eeb019a-e2e1-4f80-8bf1-18b7ce973747/dns/0.log" Apr 22 15:16:39.879403 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.879381 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-v29ss_9eeb019a-e2e1-4f80-8bf1-18b7ce973747/kube-rbac-proxy/0.log" Apr 22 15:16:39.904712 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:39.904690 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8h8cm_fbf11e8c-40e6-4fe9-b51b-b3a87ff88577/dns-node-resolver/0.log" Apr 22 
15:16:40.393986 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:40.393965 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-74fb95cffd-6sw58_2bb74d4a-dc01-4e62-87e5-7e08deed660e/registry/0.log" Apr 22 15:16:40.477464 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:40.477446 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bzcbn_b185de97-4d30-47da-bb08-402ac8989235/node-ca/0.log" Apr 22 15:16:40.492977 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:40.492953 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" event={"ID":"6568cb18-cc6f-4736-aae0-96e249f7e5b6","Type":"ContainerStarted","Data":"0a9b66b55b172c84b3dc071cd276cb573c638f9f39ea8f4d9dae73ed4f35d576"} Apr 22 15:16:40.493063 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:40.492981 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" event={"ID":"6568cb18-cc6f-4736-aae0-96e249f7e5b6","Type":"ContainerStarted","Data":"ae82f369c8386769cebcec2f931229d2e3d6d2dee1112db2c0a04dfc91b2ce24"} Apr 22 15:16:40.493122 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:40.493087 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:40.510125 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:40.510094 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" podStartSLOduration=1.510084411 podStartE2EDuration="1.510084411s" podCreationTimestamp="2026-04-22 15:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:16:40.509756524 +0000 UTC m=+3670.835490938" watchObservedRunningTime="2026-04-22 
15:16:40.510084411 +0000 UTC m=+3670.835818825" Apr 22 15:16:41.683244 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:41.683217 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9cr92_e7bf6435-22e2-4377-a9e9-e35bf103f96a/serve-healthcheck-canary/0.log" Apr 22 15:16:42.485174 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:42.485149 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sgkpp_057deb2d-ff54-4f6d-b3f5-419c3f0e070e/kube-rbac-proxy/0.log" Apr 22 15:16:42.513441 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:42.513422 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sgkpp_057deb2d-ff54-4f6d-b3f5-419c3f0e070e/exporter/0.log" Apr 22 15:16:42.545608 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:42.545588 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sgkpp_057deb2d-ff54-4f6d-b3f5-419c3f0e070e/extractor/0.log" Apr 22 15:16:44.708819 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:44.708778 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-9pjvw_95948c22-da6d-425e-b9ab-3ce2bcb73905/manager/0.log" Apr 22 15:16:44.930359 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:44.930331 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-c2wh7_e5a1967a-f314-4ecc-81d9-16eb90609713/manager/0.log" Apr 22 15:16:45.040383 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:45.040357 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-5wkgh_faa255cd-039a-42fe-8393-76525944bc1a/seaweedfs/0.log" Apr 22 15:16:45.066316 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:45.066287 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-4svkh_1ade1b13-4eaf-4e7b-ae1b-f5439819135b/seaweedfs-tls-custom/0.log" Apr 22 15:16:45.092098 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:45.092079 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-hnw8m_ad315599-98f1-46ea-b529-63e21ca49370/seaweedfs-tls-serving/0.log" Apr 22 15:16:46.502844 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:46.502802 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-n5t5w" Apr 22 15:16:50.548172 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:50.548134 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7mzfp_3895a4bb-68d7-4a37-8937-3ce81c84a431/kube-multus/0.log" Apr 22 15:16:50.572734 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:50.572716 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8pqgl_110f68fd-5d58-411e-a7fc-980d5d6050e4/kube-multus-additional-cni-plugins/0.log" Apr 22 15:16:50.595910 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:50.595886 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8pqgl_110f68fd-5d58-411e-a7fc-980d5d6050e4/egress-router-binary-copy/0.log" Apr 22 15:16:50.618608 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:50.618587 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8pqgl_110f68fd-5d58-411e-a7fc-980d5d6050e4/cni-plugins/0.log" Apr 22 15:16:50.641937 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:50.641918 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8pqgl_110f68fd-5d58-411e-a7fc-980d5d6050e4/bond-cni-plugin/0.log" Apr 22 15:16:50.664515 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:50.664497 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8pqgl_110f68fd-5d58-411e-a7fc-980d5d6050e4/routeoverride-cni/0.log" Apr 22 15:16:50.687449 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:50.687433 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8pqgl_110f68fd-5d58-411e-a7fc-980d5d6050e4/whereabouts-cni-bincopy/0.log" Apr 22 15:16:50.710952 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:50.710936 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8pqgl_110f68fd-5d58-411e-a7fc-980d5d6050e4/whereabouts-cni/0.log" Apr 22 15:16:51.256609 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:51.256534 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-r7psp_dbee0e27-41ea-4d42-84c7-681872bfcda1/network-metrics-daemon/0.log" Apr 22 15:16:51.276763 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:51.276748 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-r7psp_dbee0e27-41ea-4d42-84c7-681872bfcda1/kube-rbac-proxy/0.log" Apr 22 15:16:52.795294 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:52.795268 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-controller/0.log" Apr 22 15:16:52.816940 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:52.816917 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/0.log" Apr 22 15:16:52.834753 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:52.834735 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovn-acl-logging/1.log" Apr 22 15:16:52.855843 ip-10-0-142-195 
kubenswrapper[2576]: I0422 15:16:52.855824 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/kube-rbac-proxy-node/0.log" Apr 22 15:16:52.879102 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:52.879080 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 15:16:52.901360 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:52.901339 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/northd/0.log" Apr 22 15:16:52.925080 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:52.925061 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/nbdb/0.log" Apr 22 15:16:52.954516 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:52.954502 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/sbdb/0.log" Apr 22 15:16:53.057965 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:53.057898 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x7g6p_5880a8e9-777a-4921-b5f6-c6325c768bf2/ovnkube-controller/0.log" Apr 22 15:16:54.145250 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:54.145227 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-rbpm2_e0b90fcd-ae7e-46f8-83c0-b9bdec12c8aa/network-check-target-container/0.log" Apr 22 15:16:55.170503 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:55.170480 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-h5kfd_88aefe02-de5c-45b6-a697-2d18d8ae2754/iptables-alerter/0.log" Apr 22 
15:16:55.889879 ip-10-0-142-195 kubenswrapper[2576]: I0422 15:16:55.889844 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-89qbq_35bb6914-dec8-4b09-9315-761449933a8a/tuned/0.log"