Apr 28 19:16:23.253734 ip-10-0-139-128 systemd[1]: Starting Kubernetes Kubelet...
Apr 28 19:16:23.691811 ip-10-0-139-128 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:16:23.691811 ip-10-0-139-128 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 28 19:16:23.691811 ip-10-0-139-128 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:16:23.691811 ip-10-0-139-128 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 28 19:16:23.691811 ip-10-0-139-128 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
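The deprecation warnings above all point at the same migration: move these settings into the file named by --config. A minimal sketch of the config-file equivalents, assuming the KubeletConfiguration v1beta1 API; the endpoint and config path mirror the FLAG values later in this log, while the volume-plugin path and reserved amounts are illustrative assumptions only:

```yaml
# Hypothetical fragment of /etc/kubernetes/kubelet.conf (the --config file).
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (value from the FLAG dump below)
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir (path is an assumed example)
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (amounts are assumed examples)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# the --minimum-container-ttl-duration warning says to use eviction
# thresholds instead; an assumed example threshold:
evictionHard:
  memory.available: "100Mi"
```

Note that --pod-infra-container-image has no config-file equivalent; per the server.go message below, the sandbox image must instead be set in the container runtime (e.g. CRI-O's pause image).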
Apr 28 19:16:23.692824 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.692740    2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 28 19:16:23.696834 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696819    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:23.696834 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696834    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696837    2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696841    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696844    2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696847    2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696851    2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696854    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696857    2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696860    2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696863    2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696866    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696869    2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696872    2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696874    2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696877    2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696879    2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696882    2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696884    2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696890    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696894    2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:23.696899 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696896    2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696899    2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696901    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696905    2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696908    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696911    2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696914    2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696918    2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696922    2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696924    2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696927    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696929    2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696932    2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696934    2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696937    2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696939    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696942    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696945    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696947    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:23.697419 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696950    2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696952    2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696956    2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696960    2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696963    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696967    2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696969    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696972    2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696975    2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696978    2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696980    2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696983    2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696985    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696988    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696990    2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696994    2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.696997    2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697000    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697002    2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:23.697992 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697005    2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697008    2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697011    2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697013    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697016    2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697019    2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697021    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697024    2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697026    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697029    2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697031    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697034    2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697037    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697039    2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697042    2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697044    2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697047    2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697051    2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697053    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697056    2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697058    2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:23.698452 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697061    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697063    2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697066    2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697068    2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697071    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697073    2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697468    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697475    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697496    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697499    2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697503    2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697506    2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697509    2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697512    2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697514    2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697517    2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697520    2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697522    2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697525    2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697527    2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:23.698969 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697530    2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697532    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697535    2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697538    2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697541    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697543    2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697546    2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697549    2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697551    2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697554    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697557    2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697559    2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697562    2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697565    2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697567    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697570    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697572    2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697576    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697578    2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:23.699445 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697581    2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697585    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697588    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697590    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697593    2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697596    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697598    2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697601    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697603    2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697607    2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697611    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697614    2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697617    2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697620    2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697623    2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697626    2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697629    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697631    2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697634    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:23.699966 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697636    2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697639    2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697643    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697645    2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697648    2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697651    2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697655    2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697658    2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697660    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697663    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697665    2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697668    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697671    2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697674    2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697677    2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697680    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697683    2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697685    2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697689    2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697691    2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:23.700427 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697694    2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697696    2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697699    2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697702    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697704    2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697706    2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697709    2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697711    2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697714    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697717    2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697719    2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697722    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697724    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.697727    2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697802    2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697810    2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697816    2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697822    2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697827    2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697830    2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697834    2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 28 19:16:23.700929 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697839    2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697842    2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697846    2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697849    2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697853    2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697856    2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697859    2571 flags.go:64] FLAG: --cgroup-root=""
Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697862    2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697865    2571 flags.go:64] FLAG: --client-ca-file=""
Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697868    2571 flags.go:64] FLAG: --cloud-config=""
Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697871    2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697874    2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697878    2571 flags.go:64] FLAG: --cluster-domain=""
Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697881    2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697884    2571 flags.go:64] FLAG: --config-dir=""
Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697887    2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697890 2571 flags.go:64] FLAG: --container-log-max-files="5" Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697894 2571 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697897 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697900 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697903 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697906 2571 flags.go:64] FLAG: --contention-profiling="false" Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697909 2571 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697912 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697915 2571 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 28 19:16:23.701437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697918 2571 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697922 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697925 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697929 2571 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697932 2571 flags.go:64] FLAG: --enable-load-reader="false" Apr 28 19:16:23.702057 
ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697935 2571 flags.go:64] FLAG: --enable-server="true" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697938 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697952 2571 flags.go:64] FLAG: --event-burst="100" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697957 2571 flags.go:64] FLAG: --event-qps="50" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697964 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697967 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697970 2571 flags.go:64] FLAG: --eviction-hard="" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697974 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697977 2571 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697980 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697983 2571 flags.go:64] FLAG: --eviction-soft="" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697986 2571 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697989 2571 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697992 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697996 2571 
flags.go:64] FLAG: --experimental-mounter-path="" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.697999 2571 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698002 2571 flags.go:64] FLAG: --fail-swap-on="true" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698005 2571 flags.go:64] FLAG: --feature-gates="" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698009 2571 flags.go:64] FLAG: --file-check-frequency="20s" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698012 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 28 19:16:23.702057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698015 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698018 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698022 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698025 2571 flags.go:64] FLAG: --help="false" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698028 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-139-128.ec2.internal" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698031 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698034 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698037 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698040 2571 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698043 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698047 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698050 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698053 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698055 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698058 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698062 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698065 2571 flags.go:64] FLAG: --kube-reserved="" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698068 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698071 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698074 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698076 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698079 2571 flags.go:64] FLAG: --lock-file="" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698082 2571 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698085 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 28 19:16:23.702667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698088 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698094 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698097 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698100 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698102 2571 flags.go:64] FLAG: --logging-format="text" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698105 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698109 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698111 2571 flags.go:64] FLAG: --manifest-url="" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698114 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698118 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698121 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698125 2571 flags.go:64] FLAG: --max-pods="110" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698128 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: 
I0428 19:16:23.698131 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698134 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698137 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698141 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698143 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698146 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698154 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698157 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698160 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698163 2571 flags.go:64] FLAG: --pod-cidr="" Apr 28 19:16:23.703291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698166 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698171 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698174 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698177 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 28 
19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698180 2571 flags.go:64] FLAG: --port="10250" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698184 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698186 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b6d9101355774739" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698190 2571 flags.go:64] FLAG: --qos-reserved="" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698193 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698197 2571 flags.go:64] FLAG: --register-node="true" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698199 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698202 2571 flags.go:64] FLAG: --register-with-taints="" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698206 2571 flags.go:64] FLAG: --registry-burst="10" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698209 2571 flags.go:64] FLAG: --registry-qps="5" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698212 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698215 2571 flags.go:64] FLAG: --reserved-memory="" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698219 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698222 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698225 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 28 
19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698227 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698230 2571 flags.go:64] FLAG: --runonce="false" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698233 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698236 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698239 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698242 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698245 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 28 19:16:23.703880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698248 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698251 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698254 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698257 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698260 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698263 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698265 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 28 19:16:23.704535 ip-10-0-139-128 
kubenswrapper[2571]: I0428 19:16:23.698268 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698271 2571 flags.go:64] FLAG: --system-cgroups="" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698274 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698280 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698283 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698286 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698290 2571 flags.go:64] FLAG: --tls-min-version="" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698293 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698296 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698298 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698301 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698304 2571 flags.go:64] FLAG: --v="2" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698309 2571 flags.go:64] FLAG: --version="false" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698313 2571 flags.go:64] FLAG: --vmodule="" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698317 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 28 
19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.698320 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698415 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 28 19:16:23.704535 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698420 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698423 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698426 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698429 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698432 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698435 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698437 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698440 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698443 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698445 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 
19:16:23.698448 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698451 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698456 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698459 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698462 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698465 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698470 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698474 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698489 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 28 19:16:23.705139 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698493 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698495 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698498 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698501 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698504 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698507 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698509 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698512 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698515 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698517 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698520 2571 
feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698523 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698526 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698528 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698531 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698533 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698536 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698538 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698541 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698543 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 28 19:16:23.705676 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698546 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698548 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698551 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698554 
2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698557 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698559 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698562 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698565 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698568 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698572 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698576 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698578 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698581 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698583 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698586 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698590 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698594 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698597 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698600 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 28 19:16:23.706187 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698603 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698606 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698609 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698612 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698615 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698617 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698620 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698623 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698625 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698628 
2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698630 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698633 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698635 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698638 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698640 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698643 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698646 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698648 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698651 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698654 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:23.706712 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698657 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:23.707196 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698659 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:23.707196 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698663 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:23.707196 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698666 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:23.707196 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698668 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:23.707196 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698671 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:23.707196 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.698673 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:23.707196 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.699508 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:16:23.707196 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.706349 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 28 19:16:23.707196 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.706364 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 28 19:16:23.707196 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706411 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:23.707196 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706416 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:23.707196 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706419 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:23.707196 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706422 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:23.707196 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706426 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:23.707196 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706431 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706434 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706436 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706439 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706442 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706445 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706447 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706450 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706453 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706455 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706458 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706460 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706463 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706466 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706468 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706471 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706473 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706500 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706504 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706506 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:23.707589 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706509 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706511 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706514 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706517 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706521 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706525 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706529 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706532 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706536 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706538 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706542 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706545 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706547 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706550 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706552 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706555 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706558 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706560 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706563 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:23.708084 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706565 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706568 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706571 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706574 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706576 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706578 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706581 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706584 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706586 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706588 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706591 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706599 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706602 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706605 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706607 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706610 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706612 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706615 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706617 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706621 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:23.708565 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706623 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706626 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706629 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706631 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706634 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706636 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706639 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706641 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706644 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706646 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706649 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706652 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706654 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706657 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706660 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706662 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706665 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706667 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706670 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706673 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:23.709083 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706675 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:23.709599 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706678 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:23.709599 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.706683 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:16:23.709599 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706783 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:23.709599 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706788 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:23.709599 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706791 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:23.709599 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706794 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:23.709599 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706797 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:23.709599 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706800 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:23.709599 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706803 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:23.709599 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706805 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:23.709599 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706808 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:23.709599 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706811 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:23.709599 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706814 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:23.709599 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706817 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:23.709599 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706820 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:23.709599 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706822 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706824 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706827 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706830 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706832 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706836 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706840 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706843 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706846 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706849 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706852 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706855 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706857 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706860 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706863 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706866 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706870 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706873 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706876 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:23.710003 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706879 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706882 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706885 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706888 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706891 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706894 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706896 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706899 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706901 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706904 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706907 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706910 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706912 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706915 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706917 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706920 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706922 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706924 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706927 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706929 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:23.710456 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706932 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706934 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706937 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706939 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706942 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706944 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706947 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706949 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706952 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706954 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706956 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706959 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706961 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706964 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706966 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706969 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706972 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706974 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706977 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706979 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:23.711009 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706982 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:23.711503 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706984 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:23.711503 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706987 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:23.711503 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706989 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:23.711503 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706992 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:23.711503 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706994 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:23.711503 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706997 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:23.711503 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.706999 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:23.711503 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.707002 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:23.711503 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.707004 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:23.711503 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.707021 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:23.711503 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.707025 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:23.711503 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.707027 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:23.711503 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:23.707030 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:23.711503 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.707035 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:16:23.711503 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.707676 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 28 19:16:23.711936 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.711922 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 28 19:16:23.712993 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.712982 2571 server.go:1019] "Starting client certificate rotation"
Apr 28 19:16:23.713100 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.713083 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:16:23.713139 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.713129 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:16:23.738829 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.738812 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:16:23.745437 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.745415 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:16:23.759809 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.759790 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 28 19:16:23.765577 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.765560 2571 log.go:25] "Validated CRI v1 image API"
Apr 28 19:16:23.767596 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.767578 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 28 19:16:23.769960 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.769935 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 28 19:16:23.772768 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.772736 2571 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9ace213d-54cc-482e-99e9-ef9e30212ec2:/dev/nvme0n1p3 ca454ea3-c9dc-4026-abf7-c9cb7d320295:/dev/nvme0n1p4]
Apr 28 19:16:23.772842 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.772764 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 28 19:16:23.778882 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.778773 2571 manager.go:217] Machine: {Timestamp:2026-04-28 19:16:23.776915902 +0000 UTC m=+0.407816847 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100663 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec294c1d819ad709a2c25ef46186faa8 SystemUUID:ec294c1d-819a-d709-a2c2-5ef46186faa8 BootID:988750ca-ce9b-44bf-9fc7-0364ebf25e8f Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:25:3b:64:c4:e1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:25:3b:64:c4:e1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1e:74:d8:41:e6:c1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 28 19:16:23.778882 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.778878 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 28 19:16:23.778993 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.778962 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 28 19:16:23.780127 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.780100 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 28 19:16:23.780270 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.780128 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-128.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 28 19:16:23.780311 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.780279 2571 topology_manager.go:138] "Creating topology manager with none policy" Apr 28 19:16:23.780311 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.780287 2571 container_manager_linux.go:306] "Creating device plugin manager" Apr 28 19:16:23.780311 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.780300 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 28 19:16:23.781025 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.781015 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 28 19:16:23.782013 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.782003 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 28 19:16:23.782130 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.782121 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 28 19:16:23.784273 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.784263 2571 kubelet.go:491] "Attempting to sync node with API server" Apr 28 19:16:23.784312 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.784277 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 28 19:16:23.784312 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.784288 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 28 19:16:23.784312 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.784298 2571 kubelet.go:397] "Adding apiserver pod source" Apr 28 19:16:23.784312 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.784307 2571 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 28 19:16:23.785417 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.785406 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 28 19:16:23.785451 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.785424 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 28 19:16:23.788371 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.788349 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 28 19:16:23.789706 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.789692 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 28 19:16:23.791191 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.791170 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qx5w5" Apr 28 19:16:23.791512 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.791498 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 28 19:16:23.791563 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.791517 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 28 19:16:23.791563 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.791524 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 28 19:16:23.791563 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.791529 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 28 19:16:23.791563 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.791541 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 28 19:16:23.791563 ip-10-0-139-128 kubenswrapper[2571]: I0428 
19:16:23.791547 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 28 19:16:23.791563 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.791553 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 28 19:16:23.791563 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.791558 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 28 19:16:23.791563 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.791565 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 28 19:16:23.791766 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.791571 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 28 19:16:23.791766 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.791585 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 28 19:16:23.791766 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.791593 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 28 19:16:23.792441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.792433 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 28 19:16:23.792441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.792441 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 28 19:16:23.795892 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.795868 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-128.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 28 19:16:23.795985 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:23.795899 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-128.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API 
group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 28 19:16:23.796140 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:23.796110 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 28 19:16:23.797233 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.797211 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 28 19:16:23.797322 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.797272 2571 server.go:1295] "Started kubelet" Apr 28 19:16:23.797412 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.797371 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 28 19:16:23.797522 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.797425 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 28 19:16:23.797590 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.797572 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 28 19:16:23.798206 ip-10-0-139-128 systemd[1]: Started Kubernetes Kubelet. 
Apr 28 19:16:23.798877 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.798766 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 28 19:16:23.803218 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.803193 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qx5w5" Apr 28 19:16:23.803553 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.803537 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 28 19:16:23.805696 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:23.804461 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-128.ec2.internal.18aa9b5303ab51cc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-128.ec2.internal,UID:ip-10-0-139-128.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-128.ec2.internal,},FirstTimestamp:2026-04-28 19:16:23.797232076 +0000 UTC m=+0.428133020,LastTimestamp:2026-04-28 19:16:23.797232076 +0000 UTC m=+0.428133020,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-128.ec2.internal,}" Apr 28 19:16:23.808504 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.808476 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 28 19:16:23.808504 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.808494 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 28 19:16:23.809100 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.809080 2571 
desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 28 19:16:23.809100 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.809087 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 28 19:16:23.809218 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.809111 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 28 19:16:23.809218 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.809198 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 28 19:16:23.809218 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.809207 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 28 19:16:23.809359 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:23.809286 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-128.ec2.internal\" not found" Apr 28 19:16:23.810685 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.810668 2571 factory.go:55] Registering systemd factory Apr 28 19:16:23.811096 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.811074 2571 factory.go:223] Registration of the systemd container factory successfully Apr 28 19:16:23.811331 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.811314 2571 factory.go:153] Registering CRI-O factory Apr 28 19:16:23.811406 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.811334 2571 factory.go:223] Registration of the crio container factory successfully Apr 28 19:16:23.811406 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.811383 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 28 19:16:23.811406 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.811407 2571 factory.go:103] Registering Raw factory Apr 28 19:16:23.811580 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.811422 2571 
manager.go:1196] Started watching for new ooms in manager Apr 28 19:16:23.811715 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:23.811696 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 28 19:16:23.812012 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.812000 2571 manager.go:319] Starting recovery of all containers Apr 28 19:16:23.821027 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.820908 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:23.822068 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.822053 2571 manager.go:324] Recovery completed Apr 28 19:16:23.824490 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:23.824464 2571 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-128.ec2.internal\" not found" node="ip-10-0-139-128.ec2.internal" Apr 28 19:16:23.826763 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.826751 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:23.829030 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.829015 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:23.829085 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.829042 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:23.829085 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.829052 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:23.829467 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.829455 2571 cpu_manager.go:222] "Starting CPU 
manager" policy="none" Apr 28 19:16:23.829467 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.829465 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 28 19:16:23.829560 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.829494 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 28 19:16:23.831964 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.831953 2571 policy_none.go:49] "None policy: Start" Apr 28 19:16:23.832015 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.831968 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 28 19:16:23.832015 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.831977 2571 state_mem.go:35] "Initializing new in-memory state store" Apr 28 19:16:23.874119 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.871009 2571 manager.go:341] "Starting Device Plugin manager" Apr 28 19:16:23.874119 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:23.871046 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 28 19:16:23.874119 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.871056 2571 server.go:85] "Starting device plugin registration server" Apr 28 19:16:23.874119 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.871321 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 28 19:16:23.874119 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.871335 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 28 19:16:23.874119 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.871432 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 28 19:16:23.874119 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.871547 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 28 19:16:23.874119 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.871584 
2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 28 19:16:23.874119 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:23.872039 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 28 19:16:23.874119 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:23.872079 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-128.ec2.internal\" not found" Apr 28 19:16:23.940905 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.940868 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 28 19:16:23.942126 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.942084 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 28 19:16:23.942126 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.942108 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 28 19:16:23.942126 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.942124 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 28 19:16:23.942289 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.942130 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 28 19:16:23.942289 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:23.942159 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 28 19:16:23.944869 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.944850 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:23.972164 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.972135 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:23.973008 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.972994 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:23.973078 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.973022 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:23.973078 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.973037 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:23.973078 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.973067 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-128.ec2.internal" Apr 28 19:16:23.981364 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:23.981347 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-128.ec2.internal" Apr 28 19:16:23.981417 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:23.981368 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-128.ec2.internal\": node \"ip-10-0-139-128.ec2.internal\" not found" Apr 28 
19:16:24.006078 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:24.006056 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-128.ec2.internal\" not found" Apr 28 19:16:24.042237 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.042216 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-128.ec2.internal"] Apr 28 19:16:24.042290 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.042275 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:24.043621 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.043605 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:24.043678 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.043632 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:24.043678 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.043642 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:24.045026 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.045015 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:24.045173 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.045158 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal" Apr 28 19:16:24.045221 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.045188 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:24.045716 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.045700 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:24.045798 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.045703 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:24.045798 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.045749 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:24.045798 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.045760 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:24.045798 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.045730 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:24.045930 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.045819 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:24.047067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.047050 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-128.ec2.internal" Apr 28 19:16:24.047157 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.047072 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:24.047687 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.047669 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:24.047770 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.047694 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:24.047770 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.047706 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:24.071025 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:24.071006 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-128.ec2.internal\" not found" node="ip-10-0-139-128.ec2.internal" Apr 28 19:16:24.075463 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:24.075449 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-128.ec2.internal\" not found" node="ip-10-0-139-128.ec2.internal" Apr 28 19:16:24.106520 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:24.106500 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-128.ec2.internal\" not found" Apr 28 19:16:24.206968 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:24.206916 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-128.ec2.internal\" not found" Apr 28 19:16:24.211281 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.211268 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/096e44f912735215c31abcb4ee60cd12-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal\" (UID: \"096e44f912735215c31abcb4ee60cd12\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal" Apr 28 19:16:24.211334 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.211298 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/096e44f912735215c31abcb4ee60cd12-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal\" (UID: \"096e44f912735215c31abcb4ee60cd12\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal" Apr 28 19:16:24.211368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.211346 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3f217b631ac7267173c9067d07088610-config\") pod \"kube-apiserver-proxy-ip-10-0-139-128.ec2.internal\" (UID: \"3f217b631ac7267173c9067d07088610\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-128.ec2.internal" Apr 28 19:16:24.307389 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:24.307367 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-128.ec2.internal\" not found" Apr 28 19:16:24.311730 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.311711 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3f217b631ac7267173c9067d07088610-config\") pod \"kube-apiserver-proxy-ip-10-0-139-128.ec2.internal\" (UID: \"3f217b631ac7267173c9067d07088610\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-128.ec2.internal" Apr 28 19:16:24.311777 
ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.311739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/096e44f912735215c31abcb4ee60cd12-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal\" (UID: \"096e44f912735215c31abcb4ee60cd12\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal"
Apr 28 19:16:24.311777 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.311755 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/096e44f912735215c31abcb4ee60cd12-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal\" (UID: \"096e44f912735215c31abcb4ee60cd12\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal"
Apr 28 19:16:24.311852 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.311791 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/096e44f912735215c31abcb4ee60cd12-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal\" (UID: \"096e44f912735215c31abcb4ee60cd12\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal"
Apr 28 19:16:24.311852 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.311812 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3f217b631ac7267173c9067d07088610-config\") pod \"kube-apiserver-proxy-ip-10-0-139-128.ec2.internal\" (UID: \"3f217b631ac7267173c9067d07088610\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-128.ec2.internal"
Apr 28 19:16:24.311852 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.311814 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/096e44f912735215c31abcb4ee60cd12-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal\" (UID: \"096e44f912735215c31abcb4ee60cd12\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal"
Apr 28 19:16:24.372842 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.372820 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal"
Apr 28 19:16:24.378382 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.378364 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-128.ec2.internal"
Apr 28 19:16:24.408097 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:24.408075 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-128.ec2.internal\" not found"
Apr 28 19:16:24.508568 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:24.508494 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-128.ec2.internal\" not found"
Apr 28 19:16:24.608964 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:24.608932 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-128.ec2.internal\" not found"
Apr 28 19:16:24.709372 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:24.709339 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-128.ec2.internal\" not found"
Apr 28 19:16:24.712568 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.712536 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 28 19:16:24.712700 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.712682 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 28 19:16:24.712750 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.712698 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 28 19:16:24.806529 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.806496 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-27 19:11:23 +0000 UTC" deadline="2028-02-03 23:54:13.87948443 +0000 UTC"
Apr 28 19:16:24.806529 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.806525 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15508h37m49.07296286s"
Apr 28 19:16:24.809949 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:24.809769 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-128.ec2.internal\" not found"
Apr 28 19:16:24.810252 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.810151 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 28 19:16:24.828884 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.828860 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 28 19:16:24.884603 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.884566 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-5mz8v"
Apr 28 19:16:24.894557 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.894535 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-5mz8v"
Apr 28 19:16:24.899784 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:24.899754 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod096e44f912735215c31abcb4ee60cd12.slice/crio-375edbd6bfbbe753206e7d6f5f018f23c20c5c7cf2fefd1c618f306cddea0241 WatchSource:0}: Error finding container 375edbd6bfbbe753206e7d6f5f018f23c20c5c7cf2fefd1c618f306cddea0241: Status 404 returned error can't find the container with id 375edbd6bfbbe753206e7d6f5f018f23c20c5c7cf2fefd1c618f306cddea0241
Apr 28 19:16:24.900350 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:24.900330 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f217b631ac7267173c9067d07088610.slice/crio-cc64942e3b95368ef0696a63535ffcceedad301efbeab5398b725ab7e37aaec3 WatchSource:0}: Error finding container cc64942e3b95368ef0696a63535ffcceedad301efbeab5398b725ab7e37aaec3: Status 404 returned error can't find the container with id cc64942e3b95368ef0696a63535ffcceedad301efbeab5398b725ab7e37aaec3
Apr 28 19:16:24.906731 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.906580 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 19:16:24.910435 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:24.910418 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-128.ec2.internal\" not found"
Apr 28 19:16:24.945043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.944996 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal" event={"ID":"096e44f912735215c31abcb4ee60cd12","Type":"ContainerStarted","Data":"375edbd6bfbbe753206e7d6f5f018f23c20c5c7cf2fefd1c618f306cddea0241"}
Apr 28 19:16:24.945920 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:24.945891 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-128.ec2.internal" event={"ID":"3f217b631ac7267173c9067d07088610","Type":"ContainerStarted","Data":"cc64942e3b95368ef0696a63535ffcceedad301efbeab5398b725ab7e37aaec3"}
Apr 28 19:16:25.011195 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:25.011168 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-128.ec2.internal\" not found"
Apr 28 19:16:25.076617 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.076560 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:25.112244 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:25.112219 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-128.ec2.internal\" not found"
Apr 28 19:16:25.212738 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:25.212702 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-128.ec2.internal\" not found"
Apr 28 19:16:25.307784 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.307757 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:25.308636 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.308608 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-128.ec2.internal"
Apr 28 19:16:25.319111 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.319086 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 28 19:16:25.320217 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.320169 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal"
Apr 28 19:16:25.333763 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.333654 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 28 19:16:25.643751 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.643675 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:25.785983 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.785951 2571 apiserver.go:52] "Watching apiserver"
Apr 28 19:16:25.792916 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.792891 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 28 19:16:25.795127 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.795097 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-5qtkh","openshift-ovn-kubernetes/ovnkube-node-ppk4t","kube-system/kube-apiserver-proxy-ip-10-0-139-128.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal","openshift-multus/network-metrics-daemon-zlvsf","openshift-network-operator/iptables-alerter-4t6vk","kube-system/konnectivity-agent-76d57","openshift-cluster-node-tuning-operator/tuned-kd4w2","openshift-dns/node-resolver-w5zst","openshift-image-registry/node-ca-jw8bb","openshift-multus/multus-additional-cni-plugins-cwbdf","openshift-multus/multus-kdzc2"]
Apr 28 19:16:25.797205 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.797183 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-76d57"
Apr 28 19:16:25.799357 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.799335 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 28 19:16:25.799465 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.799338 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 28 19:16:25.799465 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.799403 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zj9wk\""
Apr 28 19:16:25.799873 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.799817 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jw8bb"
Apr 28 19:16:25.801422 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.801138 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4t6vk"
Apr 28 19:16:25.801422 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.801232 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm"
Apr 28 19:16:25.801824 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.801801 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 28 19:16:25.801925 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.801846 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 28 19:16:25.802109 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.802092 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-k9gmt\""
Apr 28 19:16:25.802176 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.802144 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 28 19:16:25.802841 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.802821 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 28 19:16:25.803271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.803153 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 28 19:16:25.803271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.803243 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 28 19:16:25.803438 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.803278 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jzmcr\""
Apr 28 19:16:25.803517 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.803445 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 28 19:16:25.803599 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.803581 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf"
Apr 28 19:16:25.803671 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:25.803651 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlvsf" podUID="caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b"
Apr 28 19:16:25.803770 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.803753 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 28 19:16:25.803770 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.803762 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 28 19:16:25.803923 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.803759 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-w4mjc\""
Apr 28 19:16:25.805140 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.804866 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh"
Apr 28 19:16:25.805140 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:25.804930 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5qtkh" podUID="e46a06a4-894f-4f3d-a446-b501af6e42eb"
Apr 28 19:16:25.806285 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.806262 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.807562 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.807540 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w5zst"
Apr 28 19:16:25.808097 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.808080 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fxtrj\""
Apr 28 19:16:25.808194 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.808179 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 28 19:16:25.808252 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.808208 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 28 19:16:25.809072 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.809056 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:25.809712 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.809226 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 28 19:16:25.810044 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.810024 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 28 19:16:25.810044 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.810038 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4z7ww\""
Apr 28 19:16:25.810568 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.810471 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cwbdf"
Apr 28 19:16:25.810829 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.810812 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 28 19:16:25.811837 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.811821 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:25.811918 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.811899 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 28 19:16:25.812194 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.812161 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 28 19:16:25.812194 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.812171 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 28 19:16:25.812384 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.812206 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 28 19:16:25.812384 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.812259 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8x9t6\""
Apr 28 19:16:25.812853 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.812573 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 28 19:16:25.812853 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.812744 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 28 19:16:25.812853 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.812752 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jhqhr\""
Apr 28 19:16:25.813789 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.813773 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 28 19:16:25.814053 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.814030 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 28 19:16:25.814143 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.814034 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 28 19:16:25.814143 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.814099 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-659j2\""
Apr 28 19:16:25.814348 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.814332 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 28 19:16:25.814962 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.814940 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 28 19:16:25.821180 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821159 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-systemd\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.821267 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821186 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-lib-modules\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.821267 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821204 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj6gs\" (UniqueName: \"kubernetes.io/projected/31324b10-25d4-4dfc-b0e6-a7f99e5a27c7-kube-api-access-rj6gs\") pod \"iptables-alerter-4t6vk\" (UID: \"31324b10-25d4-4dfc-b0e6-a7f99e5a27c7\") " pod="openshift-network-operator/iptables-alerter-4t6vk"
Apr 28 19:16:25.821267 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821219 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-systemd-units\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:25.821267 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821238 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-slash\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:25.821433 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821302 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4a7a1d2-5229-47cb-b4b1-097846a273d7-system-cni-dir\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf"
Apr 28 19:16:25.821433 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821338 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-sys\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.821433 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821375 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-log-socket\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:25.821433 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821400 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-sysconfig\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.821599 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821434 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26923c38-af88-40e6-acd9-e1135c078ad1-etc-tuned\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.821599 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821461 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-node-log\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:25.821599 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821496 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/238e19ca-102f-43b1-8aed-9322ca47bfc9-ovnkube-config\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:25.821599 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821542 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/238e19ca-102f-43b1-8aed-9322ca47bfc9-env-overrides\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:25.821599 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821589 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgfpz\" (UniqueName: \"kubernetes.io/projected/d4a7a1d2-5229-47cb-b4b1-097846a273d7-kube-api-access-cgfpz\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf"
Apr 28 19:16:25.821813 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821634 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zvnx\" (UniqueName: \"kubernetes.io/projected/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-kube-api-access-7zvnx\") pod \"network-metrics-daemon-zlvsf\" (UID: \"caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b\") " pod="openshift-multus/network-metrics-daemon-zlvsf"
Apr 28 19:16:25.821813 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821662 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-run-ovn-kubernetes\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:25.821813 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821688 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-registration-dir\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm"
Apr 28 19:16:25.821813 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821732 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wjwj\" (UniqueName: \"kubernetes.io/projected/26923c38-af88-40e6-acd9-e1135c078ad1-kube-api-access-4wjwj\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.821813 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821783 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5de918e6-f589-4708-869e-21232a3f0b2e-host\") pod \"node-ca-jw8bb\" (UID: \"5de918e6-f589-4708-869e-21232a3f0b2e\") " pod="openshift-image-registry/node-ca-jw8bb"
Apr 28 19:16:25.822034 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821829 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-cni-netd\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:25.822034 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821865 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-sysctl-conf\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.822034 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821891 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9a11e608-5ea9-4123-8db2-08683b9e10b6-agent-certs\") pod \"konnectivity-agent-76d57\" (UID: \"9a11e608-5ea9-4123-8db2-08683b9e10b6\") " pod="kube-system/konnectivity-agent-76d57"
Apr 28 19:16:25.822034 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821926 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-run-openvswitch\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:25.822034 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821953 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4a7a1d2-5229-47cb-b4b1-097846a273d7-os-release\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf"
Apr 28 19:16:25.822034 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.821978 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm"
Apr 28 19:16:25.822034 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822001 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-device-dir\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm"
Apr 28 19:16:25.822034 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822024 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-etc-openvswitch\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:25.822441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822057 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-host\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.822441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822099 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26923c38-af88-40e6-acd9-e1135c078ad1-tmp\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.822441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822142 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-socket-dir\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm"
Apr 28 19:16:25.822441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822180 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-modprobe-d\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.822441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822206 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjkwb\" (UniqueName: \"kubernetes.io/projected/dae64fa9-2628-461e-a0d3-e468450879cf-kube-api-access-mjkwb\") pod \"node-resolver-w5zst\" (UID: \"dae64fa9-2628-461e-a0d3-e468450879cf\") " pod="openshift-dns/node-resolver-w5zst"
Apr 28 19:16:25.822441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822240 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4a7a1d2-5229-47cb-b4b1-097846a273d7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf"
Apr 28 19:16:25.822441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822267 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d4a7a1d2-5229-47cb-b4b1-097846a273d7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf"
Apr 28 19:16:25.822441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822290 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5c2w\" (UniqueName: \"kubernetes.io/projected/42108ecf-b2c5-4c69-b376-2e4a5f47a989-kube-api-access-d5c2w\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm"
Apr 28 19:16:25.822441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822353 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9a11e608-5ea9-4123-8db2-08683b9e10b6-konnectivity-ca\") pod \"konnectivity-agent-76d57\" (UID: \"9a11e608-5ea9-4123-8db2-08683b9e10b6\") " pod="kube-system/konnectivity-agent-76d57"
Apr 28 19:16:25.822441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822387 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zk8d\" (UniqueName: \"kubernetes.io/projected/5de918e6-f589-4708-869e-21232a3f0b2e-kube-api-access-9zk8d\") pod \"node-ca-jw8bb\" (UID: \"5de918e6-f589-4708-869e-21232a3f0b2e\") " pod="openshift-image-registry/node-ca-jw8bb"
Apr 28 19:16:25.822441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822403 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-kubelet\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:25.822441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822422 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-sys-fs\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm"
Apr 28 19:16:25.822441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822442 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dae64fa9-2628-461e-a0d3-e468450879cf-hosts-file\") pod \"node-resolver-w5zst\" (UID: \"dae64fa9-2628-461e-a0d3-e468450879cf\") " pod="openshift-dns/node-resolver-w5zst"
Apr 28 19:16:25.822998 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822463 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-run-netns\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:25.822998 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822512 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4a7a1d2-5229-47cb-b4b1-097846a273d7-cnibin\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf"
Apr 28 19:16:25.822998 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822550 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-sysctl-d\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.822998 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822579 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-cni-bin\") pod \"ovnkube-node-ppk4t\" (UID:
\"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.822998 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822634 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4a7a1d2-5229-47cb-b4b1-097846a273d7-cni-binary-copy\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf" Apr 28 19:16:25.822998 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822694 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-etc-selinux\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" Apr 28 19:16:25.822998 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822744 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs\") pod \"network-metrics-daemon-zlvsf\" (UID: \"caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b\") " pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:25.822998 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822769 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/31324b10-25d4-4dfc-b0e6-a7f99e5a27c7-host-slash\") pod \"iptables-alerter-4t6vk\" (UID: \"31324b10-25d4-4dfc-b0e6-a7f99e5a27c7\") " pod="openshift-network-operator/iptables-alerter-4t6vk" Apr 28 19:16:25.822998 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822796 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pc96\" (UniqueName: \"kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96\") pod \"network-check-target-5qtkh\" (UID: \"e46a06a4-894f-4f3d-a446-b501af6e42eb\") " pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:25.822998 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822820 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-var-lib-openvswitch\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.822998 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822843 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-kubernetes\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.822998 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822871 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5de918e6-f589-4708-869e-21232a3f0b2e-serviceca\") pod \"node-ca-jw8bb\" (UID: \"5de918e6-f589-4708-869e-21232a3f0b2e\") " pod="openshift-image-registry/node-ca-jw8bb" Apr 28 19:16:25.822998 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822892 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/31324b10-25d4-4dfc-b0e6-a7f99e5a27c7-iptables-alerter-script\") pod \"iptables-alerter-4t6vk\" (UID: \"31324b10-25d4-4dfc-b0e6-a7f99e5a27c7\") " 
pod="openshift-network-operator/iptables-alerter-4t6vk" Apr 28 19:16:25.822998 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822912 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-run-systemd\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.822998 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822927 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.822998 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822943 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dae64fa9-2628-461e-a0d3-e468450879cf-tmp-dir\") pod \"node-resolver-w5zst\" (UID: \"dae64fa9-2628-461e-a0d3-e468450879cf\") " pod="openshift-dns/node-resolver-w5zst" Apr 28 19:16:25.823942 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822956 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-run-ovn\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.823942 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.822976 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/238e19ca-102f-43b1-8aed-9322ca47bfc9-ovn-node-metrics-cert\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.823942 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.823022 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/238e19ca-102f-43b1-8aed-9322ca47bfc9-ovnkube-script-lib\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.823942 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.823056 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d4a7a1d2-5229-47cb-b4b1-097846a273d7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf" Apr 28 19:16:25.823942 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.823082 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-run\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.823942 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.823123 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-var-lib-kubelet\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.823942 ip-10-0-139-128 
kubenswrapper[2571]: I0428 19:16:25.823146 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njlhg\" (UniqueName: \"kubernetes.io/projected/238e19ca-102f-43b1-8aed-9322ca47bfc9-kube-api-access-njlhg\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.895474 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.895374 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:11:24 +0000 UTC" deadline="2027-10-17 14:08:56.260777626 +0000 UTC" Apr 28 19:16:25.895474 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.895410 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12882h52m30.365372551s" Apr 28 19:16:25.910019 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.909989 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 28 19:16:25.923872 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.923842 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-systemd\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.924029 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.923883 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-lib-modules\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.924029 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.923909 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rj6gs\" (UniqueName: \"kubernetes.io/projected/31324b10-25d4-4dfc-b0e6-a7f99e5a27c7-kube-api-access-rj6gs\") pod \"iptables-alerter-4t6vk\" (UID: \"31324b10-25d4-4dfc-b0e6-a7f99e5a27c7\") " pod="openshift-network-operator/iptables-alerter-4t6vk" Apr 28 19:16:25.924029 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.923933 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-systemd-units\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.924029 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.923955 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-slash\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.924029 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.923976 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4a7a1d2-5229-47cb-b4b1-097846a273d7-system-cni-dir\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf" Apr 28 19:16:25.924029 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.923980 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-systemd\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.924320 ip-10-0-139-128 
kubenswrapper[2571]: I0428 19:16:25.924032 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-cni-binary-copy\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2" Apr 28 19:16:25.924320 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924035 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-systemd-units\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.924320 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924059 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-sys\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.924320 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924071 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-lib-modules\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.924320 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924098 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4a7a1d2-5229-47cb-b4b1-097846a273d7-system-cni-dir\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf" Apr 28 19:16:25.924320 ip-10-0-139-128 
kubenswrapper[2571]: I0428 19:16:25.924127 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-slash\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.924320 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924133 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-log-socket\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.924320 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924168 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-log-socket\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.924320 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924175 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-hostroot\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2" Apr 28 19:16:25.924320 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924130 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-sys\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.924320 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924202 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7nr5\" (UniqueName: \"kubernetes.io/projected/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-kube-api-access-j7nr5\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2" Apr 28 19:16:25.924320 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924234 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-sysconfig\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.924320 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924264 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26923c38-af88-40e6-acd9-e1135c078ad1-etc-tuned\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.924320 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924291 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-node-log\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.924320 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924300 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-sysconfig\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.924320 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924316 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/238e19ca-102f-43b1-8aed-9322ca47bfc9-ovnkube-config\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.925043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924340 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/238e19ca-102f-43b1-8aed-9322ca47bfc9-env-overrides\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.925043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924358 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-node-log\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.925043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924366 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cgfpz\" (UniqueName: \"kubernetes.io/projected/d4a7a1d2-5229-47cb-b4b1-097846a273d7-kube-api-access-cgfpz\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf" Apr 28 19:16:25.925043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924404 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-multus-socket-dir-parent\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2" Apr 28 19:16:25.925043 
ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924436 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zvnx\" (UniqueName: \"kubernetes.io/projected/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-kube-api-access-7zvnx\") pod \"network-metrics-daemon-zlvsf\" (UID: \"caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b\") " pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:25.925043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924461 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-run-ovn-kubernetes\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.925043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924502 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-system-cni-dir\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2" Apr 28 19:16:25.925043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924530 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-multus-daemon-config\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2" Apr 28 19:16:25.925043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924557 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-registration-dir\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: 
\"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" Apr 28 19:16:25.925043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924585 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wjwj\" (UniqueName: \"kubernetes.io/projected/26923c38-af88-40e6-acd9-e1135c078ad1-kube-api-access-4wjwj\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.925043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924622 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5de918e6-f589-4708-869e-21232a3f0b2e-host\") pod \"node-ca-jw8bb\" (UID: \"5de918e6-f589-4708-869e-21232a3f0b2e\") " pod="openshift-image-registry/node-ca-jw8bb" Apr 28 19:16:25.925043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924647 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-cni-netd\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.925043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924679 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-run-netns\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2" Apr 28 19:16:25.925043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924701 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-sysctl-conf\") pod 
\"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.925043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924724 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9a11e608-5ea9-4123-8db2-08683b9e10b6-agent-certs\") pod \"konnectivity-agent-76d57\" (UID: \"9a11e608-5ea9-4123-8db2-08683b9e10b6\") " pod="kube-system/konnectivity-agent-76d57" Apr 28 19:16:25.925043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924754 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-run-openvswitch\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.925043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4a7a1d2-5229-47cb-b4b1-097846a273d7-os-release\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924808 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-run-ovn-kubernetes\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924845 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/5de918e6-f589-4708-869e-21232a3f0b2e-host\") pod \"node-ca-jw8bb\" (UID: \"5de918e6-f589-4708-869e-21232a3f0b2e\") " pod="openshift-image-registry/node-ca-jw8bb" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924852 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/238e19ca-102f-43b1-8aed-9322ca47bfc9-env-overrides\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924906 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-registration-dir\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924854 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-run-multus-certs\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924947 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924959 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-sysctl-conf\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924973 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-device-dir\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924982 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4a7a1d2-5229-47cb-b4b1-097846a273d7-os-release\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924994 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-cni-netd\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924999 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-etc-openvswitch\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925027 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-run-openvswitch\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925028 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-var-lib-cni-bin\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925042 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.924567 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925059 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-host\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925081 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-device-dir\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" Apr 28 19:16:25.925825 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925083 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26923c38-af88-40e6-acd9-e1135c078ad1-tmp\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.926656 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925110 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-socket-dir\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" Apr 28 19:16:25.926656 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925134 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-modprobe-d\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.926656 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925159 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjkwb\" (UniqueName: \"kubernetes.io/projected/dae64fa9-2628-461e-a0d3-e468450879cf-kube-api-access-mjkwb\") pod \"node-resolver-w5zst\" (UID: \"dae64fa9-2628-461e-a0d3-e468450879cf\") " pod="openshift-dns/node-resolver-w5zst" Apr 28 19:16:25.926656 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925185 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4a7a1d2-5229-47cb-b4b1-097846a273d7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf" Apr 28 19:16:25.926656 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925210 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d4a7a1d2-5229-47cb-b4b1-097846a273d7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf" Apr 28 19:16:25.926656 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925220 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-etc-openvswitch\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.926656 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925237 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-multus-cni-dir\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2" Apr 28 19:16:25.926656 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925264 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5c2w\" (UniqueName: \"kubernetes.io/projected/42108ecf-b2c5-4c69-b376-2e4a5f47a989-kube-api-access-d5c2w\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" Apr 28 19:16:25.926656 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925289 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9a11e608-5ea9-4123-8db2-08683b9e10b6-konnectivity-ca\") pod \"konnectivity-agent-76d57\" (UID: \"9a11e608-5ea9-4123-8db2-08683b9e10b6\") " pod="kube-system/konnectivity-agent-76d57" Apr 28 19:16:25.926656 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925345 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-modprobe-d\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.926656 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925404 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-socket-dir\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" Apr 28 19:16:25.926656 ip-10-0-139-128 
kubenswrapper[2571]: I0428 19:16:25.925457 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zk8d\" (UniqueName: \"kubernetes.io/projected/5de918e6-f589-4708-869e-21232a3f0b2e-kube-api-access-9zk8d\") pod \"node-ca-jw8bb\" (UID: \"5de918e6-f589-4708-869e-21232a3f0b2e\") " pod="openshift-image-registry/node-ca-jw8bb" Apr 28 19:16:25.926656 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925507 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-kubelet\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.926656 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925542 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-etc-kubernetes\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2" Apr 28 19:16:25.926656 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925544 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/238e19ca-102f-43b1-8aed-9322ca47bfc9-ovnkube-config\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.926656 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925547 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-host\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.926656 ip-10-0-139-128 kubenswrapper[2571]: 
I0428 19:16:25.925570 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-sys-fs\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" Apr 28 19:16:25.927407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925601 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dae64fa9-2628-461e-a0d3-e468450879cf-hosts-file\") pod \"node-resolver-w5zst\" (UID: \"dae64fa9-2628-461e-a0d3-e468450879cf\") " pod="openshift-dns/node-resolver-w5zst" Apr 28 19:16:25.927407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925627 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-run-netns\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.927407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925651 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4a7a1d2-5229-47cb-b4b1-097846a273d7-cnibin\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf" Apr 28 19:16:25.927407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925676 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-sysctl-d\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.927407 ip-10-0-139-128 
kubenswrapper[2571]: I0428 19:16:25.925701 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-cni-bin\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.927407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925727 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4a7a1d2-5229-47cb-b4b1-097846a273d7-cni-binary-copy\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf" Apr 28 19:16:25.927407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925753 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-cnibin\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2" Apr 28 19:16:25.927407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925798 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9a11e608-5ea9-4123-8db2-08683b9e10b6-konnectivity-ca\") pod \"konnectivity-agent-76d57\" (UID: \"9a11e608-5ea9-4123-8db2-08683b9e10b6\") " pod="kube-system/konnectivity-agent-76d57" Apr 28 19:16:25.927407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925807 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-run-k8s-cni-cncf-io\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2" Apr 
28 19:16:25.927407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925831 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-var-lib-cni-multus\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2" Apr 28 19:16:25.927407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925839 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dae64fa9-2628-461e-a0d3-e468450879cf-hosts-file\") pod \"node-resolver-w5zst\" (UID: \"dae64fa9-2628-461e-a0d3-e468450879cf\") " pod="openshift-dns/node-resolver-w5zst" Apr 28 19:16:25.927407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925850 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-kubelet\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.927407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925857 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4a7a1d2-5229-47cb-b4b1-097846a273d7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf" Apr 28 19:16:25.927407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925862 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-etc-selinux\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" Apr 28 19:16:25.927407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925875 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d4a7a1d2-5229-47cb-b4b1-097846a273d7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf" Apr 28 19:16:25.927407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925904 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs\") pod \"network-metrics-daemon-zlvsf\" (UID: \"caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b\") " pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:25.927407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925894 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-sys-fs\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" Apr 28 19:16:25.928220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925936 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/31324b10-25d4-4dfc-b0e6-a7f99e5a27c7-host-slash\") pod \"iptables-alerter-4t6vk\" (UID: \"31324b10-25d4-4dfc-b0e6-a7f99e5a27c7\") " pod="openshift-network-operator/iptables-alerter-4t6vk" Apr 28 19:16:25.928220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925945 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-cni-bin\") pod 
\"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.928220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925963 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/42108ecf-b2c5-4c69-b376-2e4a5f47a989-etc-selinux\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" Apr 28 19:16:25.928220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925964 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pc96\" (UniqueName: \"kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96\") pod \"network-check-target-5qtkh\" (UID: \"e46a06a4-894f-4f3d-a446-b501af6e42eb\") " pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:25.928220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.925985 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-run-netns\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.928220 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:25.926009 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:25.928220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926020 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-var-lib-openvswitch\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.928220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926045 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-kubernetes\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.928220 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:25.926077 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs podName:caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b nodeName:}" failed. No retries permitted until 2026-04-28 19:16:26.426057016 +0000 UTC m=+3.056957978 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs") pod "network-metrics-daemon-zlvsf" (UID: "caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:25.928220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926095 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-var-lib-openvswitch\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.928220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926098 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5de918e6-f589-4708-869e-21232a3f0b2e-serviceca\") pod \"node-ca-jw8bb\" (UID: \"5de918e6-f589-4708-869e-21232a3f0b2e\") " pod="openshift-image-registry/node-ca-jw8bb" Apr 28 19:16:25.928220 ip-10-0-139-128 kubenswrapper[2571]: I0428 
19:16:25.926109 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/31324b10-25d4-4dfc-b0e6-a7f99e5a27c7-host-slash\") pod \"iptables-alerter-4t6vk\" (UID: \"31324b10-25d4-4dfc-b0e6-a7f99e5a27c7\") " pod="openshift-network-operator/iptables-alerter-4t6vk" Apr 28 19:16:25.928220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926143 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/31324b10-25d4-4dfc-b0e6-a7f99e5a27c7-iptables-alerter-script\") pod \"iptables-alerter-4t6vk\" (UID: \"31324b10-25d4-4dfc-b0e6-a7f99e5a27c7\") " pod="openshift-network-operator/iptables-alerter-4t6vk" Apr 28 19:16:25.928220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926169 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-run-systemd\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.928220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926171 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4a7a1d2-5229-47cb-b4b1-097846a273d7-cnibin\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf" Apr 28 19:16:25.928220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926196 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.928220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926222 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-os-release\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2" Apr 28 19:16:25.929067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926245 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dae64fa9-2628-461e-a0d3-e468450879cf-tmp-dir\") pod \"node-resolver-w5zst\" (UID: \"dae64fa9-2628-461e-a0d3-e468450879cf\") " pod="openshift-dns/node-resolver-w5zst" Apr 28 19:16:25.929067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926269 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-run-ovn\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.929067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926296 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/238e19ca-102f-43b1-8aed-9322ca47bfc9-ovn-node-metrics-cert\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.929067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/238e19ca-102f-43b1-8aed-9322ca47bfc9-ovnkube-script-lib\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.929067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926325 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-sysctl-d\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.929067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926345 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d4a7a1d2-5229-47cb-b4b1-097846a273d7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf" Apr 28 19:16:25.929067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926372 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-multus-conf-dir\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2" Apr 28 19:16:25.929067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926377 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-run-systemd\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.929067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926399 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-run\") pod \"tuned-kd4w2\" (UID: 
\"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.929067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926410 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.929067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926442 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-var-lib-kubelet\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" Apr 28 19:16:25.929067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926468 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njlhg\" (UniqueName: \"kubernetes.io/projected/238e19ca-102f-43b1-8aed-9322ca47bfc9-kube-api-access-njlhg\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:25.929067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926473 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4a7a1d2-5229-47cb-b4b1-097846a273d7-cni-binary-copy\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf" Apr 28 19:16:25.929067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926511 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-var-lib-kubelet\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:25.929067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926146 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-etc-kubernetes\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.929067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926548 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5de918e6-f589-4708-869e-21232a3f0b2e-serviceca\") pod \"node-ca-jw8bb\" (UID: \"5de918e6-f589-4708-869e-21232a3f0b2e\") " pod="openshift-image-registry/node-ca-jw8bb"
Apr 28 19:16:25.929067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926609 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-var-lib-kubelet\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.929564 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926747 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/31324b10-25d4-4dfc-b0e6-a7f99e5a27c7-iptables-alerter-script\") pod \"iptables-alerter-4t6vk\" (UID: \"31324b10-25d4-4dfc-b0e6-a7f99e5a27c7\") " pod="openshift-network-operator/iptables-alerter-4t6vk"
Apr 28 19:16:25.929564 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926866 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/238e19ca-102f-43b1-8aed-9322ca47bfc9-run-ovn\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:25.929564 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.926611 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26923c38-af88-40e6-acd9-e1135c078ad1-run\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.929564 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.927038 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/238e19ca-102f-43b1-8aed-9322ca47bfc9-ovnkube-script-lib\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:25.929564 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.927066 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d4a7a1d2-5229-47cb-b4b1-097846a273d7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf"
Apr 28 19:16:25.929564 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.927154 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dae64fa9-2628-461e-a0d3-e468450879cf-tmp-dir\") pod \"node-resolver-w5zst\" (UID: \"dae64fa9-2628-461e-a0d3-e468450879cf\") " pod="openshift-dns/node-resolver-w5zst"
Apr 28 19:16:25.929564 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.928348 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26923c38-af88-40e6-acd9-e1135c078ad1-etc-tuned\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.929564 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.928720 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26923c38-af88-40e6-acd9-e1135c078ad1-tmp\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.929564 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.928726 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9a11e608-5ea9-4123-8db2-08683b9e10b6-agent-certs\") pod \"konnectivity-agent-76d57\" (UID: \"9a11e608-5ea9-4123-8db2-08683b9e10b6\") " pod="kube-system/konnectivity-agent-76d57"
Apr 28 19:16:25.929564 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.929220 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/238e19ca-102f-43b1-8aed-9322ca47bfc9-ovn-node-metrics-cert\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:25.933425 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:25.933403 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:16:25.933425 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:25.933428 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:16:25.933789 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:25.933441 2571 projected.go:194] Error preparing data for projected volume kube-api-access-6pc96 for pod openshift-network-diagnostics/network-check-target-5qtkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:25.933789 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:25.933545 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96 podName:e46a06a4-894f-4f3d-a446-b501af6e42eb nodeName:}" failed. No retries permitted until 2026-04-28 19:16:26.433512985 +0000 UTC m=+3.064413909 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6pc96" (UniqueName: "kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96") pod "network-check-target-5qtkh" (UID: "e46a06a4-894f-4f3d-a446-b501af6e42eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:25.936647 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.936167 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj6gs\" (UniqueName: \"kubernetes.io/projected/31324b10-25d4-4dfc-b0e6-a7f99e5a27c7-kube-api-access-rj6gs\") pod \"iptables-alerter-4t6vk\" (UID: \"31324b10-25d4-4dfc-b0e6-a7f99e5a27c7\") " pod="openshift-network-operator/iptables-alerter-4t6vk"
Apr 28 19:16:25.936647 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.936252 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zvnx\" (UniqueName: \"kubernetes.io/projected/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-kube-api-access-7zvnx\") pod \"network-metrics-daemon-zlvsf\" (UID: \"caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b\") " pod="openshift-multus/network-metrics-daemon-zlvsf"
Apr 28 19:16:25.936647 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.936589 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjkwb\" (UniqueName: \"kubernetes.io/projected/dae64fa9-2628-461e-a0d3-e468450879cf-kube-api-access-mjkwb\") pod \"node-resolver-w5zst\" (UID: \"dae64fa9-2628-461e-a0d3-e468450879cf\") " pod="openshift-dns/node-resolver-w5zst"
Apr 28 19:16:25.937026 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.936996 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njlhg\" (UniqueName: \"kubernetes.io/projected/238e19ca-102f-43b1-8aed-9322ca47bfc9-kube-api-access-njlhg\") pod \"ovnkube-node-ppk4t\" (UID: \"238e19ca-102f-43b1-8aed-9322ca47bfc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:25.937685 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.937658 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zk8d\" (UniqueName: \"kubernetes.io/projected/5de918e6-f589-4708-869e-21232a3f0b2e-kube-api-access-9zk8d\") pod \"node-ca-jw8bb\" (UID: \"5de918e6-f589-4708-869e-21232a3f0b2e\") " pod="openshift-image-registry/node-ca-jw8bb"
Apr 28 19:16:25.938125 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.938102 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5c2w\" (UniqueName: \"kubernetes.io/projected/42108ecf-b2c5-4c69-b376-2e4a5f47a989-kube-api-access-d5c2w\") pod \"aws-ebs-csi-driver-node-qgcmm\" (UID: \"42108ecf-b2c5-4c69-b376-2e4a5f47a989\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm"
Apr 28 19:16:25.938717 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.938699 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgfpz\" (UniqueName: \"kubernetes.io/projected/d4a7a1d2-5229-47cb-b4b1-097846a273d7-kube-api-access-cgfpz\") pod \"multus-additional-cni-plugins-cwbdf\" (UID: \"d4a7a1d2-5229-47cb-b4b1-097846a273d7\") " pod="openshift-multus/multus-additional-cni-plugins-cwbdf"
Apr 28 19:16:25.939106 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.939090 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wjwj\" (UniqueName: \"kubernetes.io/projected/26923c38-af88-40e6-acd9-e1135c078ad1-kube-api-access-4wjwj\") pod \"tuned-kd4w2\" (UID: \"26923c38-af88-40e6-acd9-e1135c078ad1\") " pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:25.948343 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:25.948320 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:26.027430 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027397 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-cni-binary-copy\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027430 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027434 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-hostroot\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027653 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027454 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7nr5\" (UniqueName: \"kubernetes.io/projected/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-kube-api-access-j7nr5\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027653 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027498 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-multus-socket-dir-parent\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027653 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027524 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-system-cni-dir\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027653 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027537 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-hostroot\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027653 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027544 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-multus-daemon-config\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027653 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027602 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-run-netns\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027653 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027633 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-run-multus-certs\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027910 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027662 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-var-lib-cni-bin\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027910 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027692 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-multus-cni-dir\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027910 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027720 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-etc-kubernetes\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027910 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-cnibin\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027910 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027769 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-multus-socket-dir-parent\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027910 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027773 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-run-k8s-cni-cncf-io\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027910 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027800 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-var-lib-cni-multus\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027910 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027810 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-run-k8s-cni-cncf-io\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027910 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027849 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-os-release\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027910 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027880 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-multus-conf-dir\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027910 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027907 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-var-lib-kubelet\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.027910 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027911 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-var-lib-cni-bin\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.028368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027949 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-cni-binary-copy\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.028368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027967 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-system-cni-dir\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.028368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027851 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-run-netns\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.028368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027999 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-var-lib-cni-multus\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.028368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.028009 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-multus-conf-dir\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.028368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.028036 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-etc-kubernetes\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.028368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.028055 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-multus-cni-dir\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.028368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.027881 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-run-multus-certs\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.028368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.028084 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-cnibin\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.028368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.028097 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-host-var-lib-kubelet\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.028368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.028139 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-os-release\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.028368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.028341 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-multus-daemon-config\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.036415 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.036396 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7nr5\" (UniqueName: \"kubernetes.io/projected/3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd-kube-api-access-j7nr5\") pod \"multus-kdzc2\" (UID: \"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd\") " pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.108385 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.108345 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-76d57"
Apr 28 19:16:26.123258 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.123232 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4t6vk"
Apr 28 19:16:26.128792 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.128771 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jw8bb"
Apr 28 19:16:26.134330 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.134312 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm"
Apr 28 19:16:26.139892 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.139874 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kd4w2"
Apr 28 19:16:26.146518 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.146459 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w5zst"
Apr 28 19:16:26.153070 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.153053 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t"
Apr 28 19:16:26.158568 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.158544 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cwbdf"
Apr 28 19:16:26.163101 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.163083 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kdzc2"
Apr 28 19:16:26.430011 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.429985 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs\") pod \"network-metrics-daemon-zlvsf\" (UID: \"caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b\") " pod="openshift-multus/network-metrics-daemon-zlvsf"
Apr 28 19:16:26.430119 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:26.430095 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:26.430183 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:26.430147 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs podName:caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b nodeName:}" failed. No retries permitted until 2026-04-28 19:16:27.430133283 +0000 UTC m=+4.061034205 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs") pod "network-metrics-daemon-zlvsf" (UID: "caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:26.437228 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:26.437205 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a11e608_5ea9_4123_8db2_08683b9e10b6.slice/crio-6662c101b1c2cd417b9f276c8664ecc71c56c02dd296636d79146cefa3b30f27 WatchSource:0}: Error finding container 6662c101b1c2cd417b9f276c8664ecc71c56c02dd296636d79146cefa3b30f27: Status 404 returned error can't find the container with id 6662c101b1c2cd417b9f276c8664ecc71c56c02dd296636d79146cefa3b30f27
Apr 28 19:16:26.438229 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:26.438206 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31324b10_25d4_4dfc_b0e6_a7f99e5a27c7.slice/crio-0ab4be037a817e785b6c60ef9a8b18417bccba889336b3826589b819730fe9a0 WatchSource:0}: Error finding container 0ab4be037a817e785b6c60ef9a8b18417bccba889336b3826589b819730fe9a0: Status 404 returned error can't find the container with id 0ab4be037a817e785b6c60ef9a8b18417bccba889336b3826589b819730fe9a0
Apr 28 19:16:26.439569 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:26.439547 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4a7a1d2_5229_47cb_b4b1_097846a273d7.slice/crio-2035325ff15e20f5c5682c8fde7977f529c5a108351309f868792e4bb3a5b09d WatchSource:0}: Error finding container 2035325ff15e20f5c5682c8fde7977f529c5a108351309f868792e4bb3a5b09d: Status 404 returned error can't find the container with id 2035325ff15e20f5c5682c8fde7977f529c5a108351309f868792e4bb3a5b09d
Apr 28 19:16:26.442631 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:26.442606 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod238e19ca_102f_43b1_8aed_9322ca47bfc9.slice/crio-7e272c16dc44a4e80b19655436df5349e7d8d65cde9ab39dabedc1842831754e WatchSource:0}: Error finding container 7e272c16dc44a4e80b19655436df5349e7d8d65cde9ab39dabedc1842831754e: Status 404 returned error can't find the container with id 7e272c16dc44a4e80b19655436df5349e7d8d65cde9ab39dabedc1842831754e
Apr 28 19:16:26.531361 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.531214 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pc96\" (UniqueName: \"kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96\") pod \"network-check-target-5qtkh\" (UID: \"e46a06a4-894f-4f3d-a446-b501af6e42eb\") " pod="openshift-network-diagnostics/network-check-target-5qtkh"
Apr 28 19:16:26.531361 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:26.531358 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:16:26.531530 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:26.531380 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:16:26.531530 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:26.531391 2571 projected.go:194] Error preparing data for projected volume kube-api-access-6pc96 for pod openshift-network-diagnostics/network-check-target-5qtkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:26.531530 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:26.531448 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96 podName:e46a06a4-894f-4f3d-a446-b501af6e42eb nodeName:}" failed. No retries permitted until 2026-04-28 19:16:27.531425204 +0000 UTC m=+4.162326127 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-6pc96" (UniqueName: "kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96") pod "network-check-target-5qtkh" (UID: "e46a06a4-894f-4f3d-a446-b501af6e42eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:26.897605 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.896275 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:11:24 +0000 UTC" deadline="2027-10-13 02:11:07.450350623 +0000 UTC"
Apr 28 19:16:26.897605 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.896322 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12774h54m40.554031496s"
Apr 28 19:16:26.942934 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.942411 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh"
Apr 28 19:16:26.942934 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:26.942546 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5qtkh" podUID="e46a06a4-894f-4f3d-a446-b501af6e42eb"
Apr 28 19:16:26.968035 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.967991 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-76d57" event={"ID":"9a11e608-5ea9-4123-8db2-08683b9e10b6","Type":"ContainerStarted","Data":"6662c101b1c2cd417b9f276c8664ecc71c56c02dd296636d79146cefa3b30f27"}
Apr 28 19:16:26.972839 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.972750 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" event={"ID":"42108ecf-b2c5-4c69-b376-2e4a5f47a989","Type":"ContainerStarted","Data":"5012b4f31d42b45f3d52e5a9c388a3b4e530c929f11af3d10699bf57bf2ca329"}
Apr 28 19:16:26.989065 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.989037 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kdzc2" event={"ID":"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd","Type":"ContainerStarted","Data":"a4bd826354830b8ef1a32789c56d421cbae2b25ed33053a71467d72b7e2be698"}
Apr 28 19:16:26.997275 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:26.997218 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jw8bb" event={"ID":"5de918e6-f589-4708-869e-21232a3f0b2e","Type":"ContainerStarted","Data":"5305f55b5218df86627856190d711ecc7d0f68bf5a29105319a65235e99c09f8"}
Apr 28 19:16:27.005338 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:27.005297 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" event={"ID":"26923c38-af88-40e6-acd9-e1135c078ad1","Type":"ContainerStarted","Data":"1f57bebba9961531701c6f76674a84429d7cf2dfe9447ddcdb1f549a739b4fa1"}
Apr 28 19:16:27.013395 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:27.013369 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" event={"ID":"238e19ca-102f-43b1-8aed-9322ca47bfc9","Type":"ContainerStarted","Data":"7e272c16dc44a4e80b19655436df5349e7d8d65cde9ab39dabedc1842831754e"}
Apr 28 19:16:27.025781 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:27.025730 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cwbdf" event={"ID":"d4a7a1d2-5229-47cb-b4b1-097846a273d7","Type":"ContainerStarted","Data":"2035325ff15e20f5c5682c8fde7977f529c5a108351309f868792e4bb3a5b09d"}
Apr 28 19:16:27.032320 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:27.032283 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-128.ec2.internal" event={"ID":"3f217b631ac7267173c9067d07088610","Type":"ContainerStarted","Data":"7e49e2c516c27e0fa400053c38a92319357814ade24b255592a63685182f828c"}
Apr 28 19:16:27.035273 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:27.035247 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w5zst" event={"ID":"dae64fa9-2628-461e-a0d3-e468450879cf","Type":"ContainerStarted","Data":"770c612b3fadc61f1d1551e748767cc20121af4b57c325e3b537e1d0ce94c96e"}
Apr 28 19:16:27.041180 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:27.041151 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4t6vk" event={"ID":"31324b10-25d4-4dfc-b0e6-a7f99e5a27c7","Type":"ContainerStarted","Data":"0ab4be037a817e785b6c60ef9a8b18417bccba889336b3826589b819730fe9a0"}
Apr 28 19:16:27.047436 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:27.047384 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-128.ec2.internal" podStartSLOduration=2.047369747 podStartE2EDuration="2.047369747s" podCreationTimestamp="2026-04-28 19:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:27.047156594 +0000 UTC m=+3.678057539" watchObservedRunningTime="2026-04-28 19:16:27.047369747 +0000 UTC m=+3.678270693"
Apr 28 19:16:27.440063 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:27.439630 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs\") pod \"network-metrics-daemon-zlvsf\" (UID: \"caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b\") " pod="openshift-multus/network-metrics-daemon-zlvsf"
Apr 28 19:16:27.440063 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:27.439757 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:27.440063 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:27.439810 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs podName:caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b nodeName:}" failed. No retries permitted until 2026-04-28 19:16:29.439793406 +0000 UTC m=+6.070694332 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs") pod "network-metrics-daemon-zlvsf" (UID: "caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:27.540853 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:27.540817 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pc96\" (UniqueName: \"kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96\") pod \"network-check-target-5qtkh\" (UID: \"e46a06a4-894f-4f3d-a446-b501af6e42eb\") " pod="openshift-network-diagnostics/network-check-target-5qtkh"
Apr 28 19:16:27.541012 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:27.540976 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:16:27.541012 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:27.540994 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:16:27.541012 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:27.541007 2571 projected.go:194] Error preparing data for projected volume kube-api-access-6pc96 for pod openshift-network-diagnostics/network-check-target-5qtkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:27.541155 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:27.541063 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96 podName:e46a06a4-894f-4f3d-a446-b501af6e42eb nodeName:}" failed.
No retries permitted until 2026-04-28 19:16:29.541042405 +0000 UTC m=+6.171943334 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-6pc96" (UniqueName: "kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96") pod "network-check-target-5qtkh" (UID: "e46a06a4-894f-4f3d-a446-b501af6e42eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:27.945244 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:27.945212 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:27.945724 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:27.945347 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlvsf" podUID="caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b" Apr 28 19:16:28.059306 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:28.058374 2571 generic.go:358] "Generic (PLEG): container finished" podID="096e44f912735215c31abcb4ee60cd12" containerID="5926957c4a3e8829606ac06668ce58ceb10bb52f7a6fd6ce27dc44e955aaee2b" exitCode=0 Apr 28 19:16:28.059306 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:28.059236 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal" event={"ID":"096e44f912735215c31abcb4ee60cd12","Type":"ContainerDied","Data":"5926957c4a3e8829606ac06668ce58ceb10bb52f7a6fd6ce27dc44e955aaee2b"} Apr 28 19:16:28.943310 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:28.943266 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:28.943495 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:28.943395 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5qtkh" podUID="e46a06a4-894f-4f3d-a446-b501af6e42eb" Apr 28 19:16:29.065348 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:29.064720 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal" event={"ID":"096e44f912735215c31abcb4ee60cd12","Type":"ContainerStarted","Data":"2b5baaaa338910ca57f3966d0b50ec30259e674eaa342fa94a9cd25975b017ac"} Apr 28 19:16:29.084575 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:29.084302 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-128.ec2.internal" podStartSLOduration=4.084282766 podStartE2EDuration="4.084282766s" podCreationTimestamp="2026-04-28 19:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:29.083320378 +0000 UTC m=+5.714221324" watchObservedRunningTime="2026-04-28 19:16:29.084282766 +0000 UTC m=+5.715183722" Apr 28 19:16:29.457274 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:29.457232 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs\") pod \"network-metrics-daemon-zlvsf\" (UID: \"caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b\") " pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 
19:16:29.457444 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:29.457407 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:29.457528 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:29.457496 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs podName:caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b nodeName:}" failed. No retries permitted until 2026-04-28 19:16:33.457460614 +0000 UTC m=+10.088361560 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs") pod "network-metrics-daemon-zlvsf" (UID: "caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:29.557637 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:29.557590 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pc96\" (UniqueName: \"kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96\") pod \"network-check-target-5qtkh\" (UID: \"e46a06a4-894f-4f3d-a446-b501af6e42eb\") " pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:29.557860 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:29.557802 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:29.557860 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:29.557822 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:29.557860 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:29.557835 2571 projected.go:194] Error preparing data for projected 
volume kube-api-access-6pc96 for pod openshift-network-diagnostics/network-check-target-5qtkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:29.558023 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:29.557898 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96 podName:e46a06a4-894f-4f3d-a446-b501af6e42eb nodeName:}" failed. No retries permitted until 2026-04-28 19:16:33.557877779 +0000 UTC m=+10.188778711 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6pc96" (UniqueName: "kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96") pod "network-check-target-5qtkh" (UID: "e46a06a4-894f-4f3d-a446-b501af6e42eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:29.942751 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:29.942670 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:29.942914 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:29.942816 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlvsf" podUID="caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b" Apr 28 19:16:30.942841 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:30.942801 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:30.943221 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:30.942934 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5qtkh" podUID="e46a06a4-894f-4f3d-a446-b501af6e42eb" Apr 28 19:16:31.942656 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:31.942618 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:31.942809 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:31.942775 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlvsf" podUID="caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b" Apr 28 19:16:32.942652 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:32.942610 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:32.943102 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:32.942747 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5qtkh" podUID="e46a06a4-894f-4f3d-a446-b501af6e42eb" Apr 28 19:16:33.491283 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:33.491215 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs\") pod \"network-metrics-daemon-zlvsf\" (UID: \"caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b\") " pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:33.491471 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:33.491368 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:33.491471 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:33.491436 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs podName:caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b nodeName:}" failed. No retries permitted until 2026-04-28 19:16:41.491418016 +0000 UTC m=+18.122318951 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs") pod "network-metrics-daemon-zlvsf" (UID: "caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:33.592715 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:33.592632 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pc96\" (UniqueName: \"kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96\") pod \"network-check-target-5qtkh\" (UID: \"e46a06a4-894f-4f3d-a446-b501af6e42eb\") " pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:33.592900 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:33.592782 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:33.592900 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:33.592802 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:33.592900 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:33.592819 2571 projected.go:194] Error preparing data for projected volume kube-api-access-6pc96 for pod openshift-network-diagnostics/network-check-target-5qtkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:33.592900 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:33.592863 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96 podName:e46a06a4-894f-4f3d-a446-b501af6e42eb nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:41.59284952 +0000 UTC m=+18.223750444 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-6pc96" (UniqueName: "kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96") pod "network-check-target-5qtkh" (UID: "e46a06a4-894f-4f3d-a446-b501af6e42eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:33.944552 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:33.944174 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:33.944552 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:33.944351 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlvsf" podUID="caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b" Apr 28 19:16:34.942591 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:34.942552 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:34.942762 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:34.942697 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5qtkh" podUID="e46a06a4-894f-4f3d-a446-b501af6e42eb" Apr 28 19:16:35.943521 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:35.943336 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:35.943521 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:35.943505 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlvsf" podUID="caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b" Apr 28 19:16:36.942969 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:36.942926 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:36.943172 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:36.943040 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5qtkh" podUID="e46a06a4-894f-4f3d-a446-b501af6e42eb" Apr 28 19:16:37.943226 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:37.943186 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:37.943658 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:37.943307 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlvsf" podUID="caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b" Apr 28 19:16:38.942560 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:38.942515 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:38.942723 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:38.942655 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5qtkh" podUID="e46a06a4-894f-4f3d-a446-b501af6e42eb" Apr 28 19:16:39.942757 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:39.942723 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:39.943200 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:39.942852 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlvsf" podUID="caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b" Apr 28 19:16:40.942671 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:40.942622 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:40.943013 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:40.942777 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5qtkh" podUID="e46a06a4-894f-4f3d-a446-b501af6e42eb" Apr 28 19:16:41.551566 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:41.551523 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs\") pod \"network-metrics-daemon-zlvsf\" (UID: \"caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b\") " pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:41.551738 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:41.551708 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:41.551803 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:41.551789 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs podName:caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b nodeName:}" failed. No retries permitted until 2026-04-28 19:16:57.551768991 +0000 UTC m=+34.182669921 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs") pod "network-metrics-daemon-zlvsf" (UID: "caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:41.652932 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:41.652893 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pc96\" (UniqueName: \"kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96\") pod \"network-check-target-5qtkh\" (UID: \"e46a06a4-894f-4f3d-a446-b501af6e42eb\") " pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:41.653091 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:41.653028 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:41.653091 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:41.653043 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:41.653091 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:41.653052 2571 projected.go:194] Error preparing data for projected volume kube-api-access-6pc96 for pod openshift-network-diagnostics/network-check-target-5qtkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:41.653203 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:41.653105 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96 podName:e46a06a4-894f-4f3d-a446-b501af6e42eb nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:57.653091857 +0000 UTC m=+34.283992780 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-6pc96" (UniqueName: "kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96") pod "network-check-target-5qtkh" (UID: "e46a06a4-894f-4f3d-a446-b501af6e42eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:41.943383 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:41.943300 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:41.943829 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:41.943448 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlvsf" podUID="caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b" Apr 28 19:16:42.943292 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:42.943255 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:42.943457 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:42.943386 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5qtkh" podUID="e46a06a4-894f-4f3d-a446-b501af6e42eb" Apr 28 19:16:43.943087 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:43.942923 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:43.943182 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:43.943155 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlvsf" podUID="caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b" Apr 28 19:16:44.094285 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.094246 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" event={"ID":"42108ecf-b2c5-4c69-b376-2e4a5f47a989","Type":"ContainerStarted","Data":"c8af41f8ef822f40aabb5ce407aeea03bb09f1b01c35efba0db06b35ab3ebce9"} Apr 28 19:16:44.095673 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.095643 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kdzc2" event={"ID":"3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd","Type":"ContainerStarted","Data":"a6af789a05c6ed210f132743d815f922b450a6b59ff6f66b1c47cc6ee980bec8"} Apr 28 19:16:44.097026 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.096994 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jw8bb" event={"ID":"5de918e6-f589-4708-869e-21232a3f0b2e","Type":"ContainerStarted","Data":"4c44b81dd89dd107b4e2bd169ecaad67273fbc162329bef50c7f940145d98b1c"} Apr 28 19:16:44.098342 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.098315 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" event={"ID":"26923c38-af88-40e6-acd9-e1135c078ad1","Type":"ContainerStarted","Data":"a1bf43c2efd93537de9e63d2aee51d665f10d48c85c3fc5309889c7fc6f74fe3"} Apr 28 19:16:44.102378 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.102357 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:16:44.102761 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.102734 2571 generic.go:358] "Generic (PLEG): container finished" podID="238e19ca-102f-43b1-8aed-9322ca47bfc9" containerID="1b165f85a318b93203a1162d68f7bf1d02bceae5c85901247d86e6bf8258a83f" exitCode=1 Apr 28 19:16:44.102761 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.102757 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" event={"ID":"238e19ca-102f-43b1-8aed-9322ca47bfc9","Type":"ContainerStarted","Data":"44efd3082444f79e37ccb681c5d650f881029676263d0619255f49726d1fdd51"} Apr 28 19:16:44.102906 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.102779 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" event={"ID":"238e19ca-102f-43b1-8aed-9322ca47bfc9","Type":"ContainerStarted","Data":"493df0d028fad0e963bb17b9d7ebaea5ba03d8eab29ac503dc126dc6b6fad9d0"} Apr 28 19:16:44.102906 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.102788 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" event={"ID":"238e19ca-102f-43b1-8aed-9322ca47bfc9","Type":"ContainerDied","Data":"1b165f85a318b93203a1162d68f7bf1d02bceae5c85901247d86e6bf8258a83f"} Apr 28 19:16:44.102906 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.102797 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" 
event={"ID":"238e19ca-102f-43b1-8aed-9322ca47bfc9","Type":"ContainerStarted","Data":"4d40e7827ab976cd8518bf73fc774dfc16598d5d8506b398be0e1d983abca7da"} Apr 28 19:16:44.104252 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.104224 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cwbdf" event={"ID":"d4a7a1d2-5229-47cb-b4b1-097846a273d7","Type":"ContainerStarted","Data":"36347a804989e01aef6d05502b2ef3df6307a524c65d437a9d4d971b8fcfb6f4"} Apr 28 19:16:44.105698 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.105667 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w5zst" event={"ID":"dae64fa9-2628-461e-a0d3-e468450879cf","Type":"ContainerStarted","Data":"d242d9e3d0c9e9a5a5d61e32a9fa8ea78c8867e7b41dee478b93f4ab37695d47"} Apr 28 19:16:44.107015 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.106992 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-76d57" event={"ID":"9a11e608-5ea9-4123-8db2-08683b9e10b6","Type":"ContainerStarted","Data":"d26891ffd407ae4be5f0f8bc9c37ba286018b46f33afeac2eddfa8edfcbacc8c"} Apr 28 19:16:44.118413 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.118373 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kdzc2" podStartSLOduration=3.044513117 podStartE2EDuration="20.118361215s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:16:26.45019031 +0000 UTC m=+3.081091238" lastFinishedPulling="2026-04-28 19:16:43.524038413 +0000 UTC m=+20.154939336" observedRunningTime="2026-04-28 19:16:44.11798888 +0000 UTC m=+20.748889842" watchObservedRunningTime="2026-04-28 19:16:44.118361215 +0000 UTC m=+20.749262159" Apr 28 19:16:44.162288 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.162129 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-kd4w2" 
podStartSLOduration=3.103533787 podStartE2EDuration="20.162115438s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:16:26.451459261 +0000 UTC m=+3.082360193" lastFinishedPulling="2026-04-28 19:16:43.510040912 +0000 UTC m=+20.140941844" observedRunningTime="2026-04-28 19:16:44.144175745 +0000 UTC m=+20.775076695" watchObservedRunningTime="2026-04-28 19:16:44.162115438 +0000 UTC m=+20.793016383" Apr 28 19:16:44.162440 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.162339 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w5zst" podStartSLOduration=3.104398156 podStartE2EDuration="20.162335352s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:16:26.450288839 +0000 UTC m=+3.081189762" lastFinishedPulling="2026-04-28 19:16:43.508226032 +0000 UTC m=+20.139126958" observedRunningTime="2026-04-28 19:16:44.162033581 +0000 UTC m=+20.792934541" watchObservedRunningTime="2026-04-28 19:16:44.162335352 +0000 UTC m=+20.793236297" Apr 28 19:16:44.214547 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.214502 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-76d57" podStartSLOduration=12.177336329 podStartE2EDuration="21.214471068s" podCreationTimestamp="2026-04-28 19:16:23 +0000 UTC" firstStartedPulling="2026-04-28 19:16:26.43867565 +0000 UTC m=+3.069576573" lastFinishedPulling="2026-04-28 19:16:35.475810387 +0000 UTC m=+12.106711312" observedRunningTime="2026-04-28 19:16:44.214462795 +0000 UTC m=+20.845363739" watchObservedRunningTime="2026-04-28 19:16:44.214471068 +0000 UTC m=+20.845372012" Apr 28 19:16:44.214670 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.214650 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jw8bb" podStartSLOduration=4.155670322 podStartE2EDuration="21.2146442s" 
podCreationTimestamp="2026-04-28 19:16:23 +0000 UTC" firstStartedPulling="2026-04-28 19:16:26.449040948 +0000 UTC m=+3.079941876" lastFinishedPulling="2026-04-28 19:16:43.508014828 +0000 UTC m=+20.138915754" observedRunningTime="2026-04-28 19:16:44.199561499 +0000 UTC m=+20.830462461" watchObservedRunningTime="2026-04-28 19:16:44.2146442 +0000 UTC m=+20.845545144" Apr 28 19:16:44.942861 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:44.942840 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:44.942948 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:44.942932 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5qtkh" podUID="e46a06a4-894f-4f3d-a446-b501af6e42eb" Apr 28 19:16:45.096362 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:45.096338 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 28 19:16:45.110286 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:45.110255 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" event={"ID":"42108ecf-b2c5-4c69-b376-2e4a5f47a989","Type":"ContainerStarted","Data":"eed41a89655e616f6ca616bd11645d81e2d0ff8afe7611451b1505b09362fa0f"} Apr 28 19:16:45.112456 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:45.112439 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:16:45.112794 ip-10-0-139-128 kubenswrapper[2571]: I0428 
19:16:45.112773 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" event={"ID":"238e19ca-102f-43b1-8aed-9322ca47bfc9","Type":"ContainerStarted","Data":"888d20222062153e840fd50a857a0b4382d6f4e4d57d4c9e376306acda6cf4bc"} Apr 28 19:16:45.112894 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:45.112804 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" event={"ID":"238e19ca-102f-43b1-8aed-9322ca47bfc9","Type":"ContainerStarted","Data":"98ceab1a0bf48ec72a150960c1eaef1508ce0034d7d52e6a0a0cc6d5eae438a3"} Apr 28 19:16:45.114003 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:45.113979 2571 generic.go:358] "Generic (PLEG): container finished" podID="d4a7a1d2-5229-47cb-b4b1-097846a273d7" containerID="36347a804989e01aef6d05502b2ef3df6307a524c65d437a9d4d971b8fcfb6f4" exitCode=0 Apr 28 19:16:45.114080 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:45.114048 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cwbdf" event={"ID":"d4a7a1d2-5229-47cb-b4b1-097846a273d7","Type":"ContainerDied","Data":"36347a804989e01aef6d05502b2ef3df6307a524c65d437a9d4d971b8fcfb6f4"} Apr 28 19:16:45.115447 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:45.115374 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4t6vk" event={"ID":"31324b10-25d4-4dfc-b0e6-a7f99e5a27c7","Type":"ContainerStarted","Data":"610e1c5c85c9d55a38073d76a76e48ad253acf1e94628e79559f563ddba5ceb9"} Apr 28 19:16:45.150171 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:45.150119 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4t6vk" podStartSLOduration=4.404731805 podStartE2EDuration="21.150104655s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:16:26.440703231 +0000 UTC m=+3.071604154" 
lastFinishedPulling="2026-04-28 19:16:43.186076068 +0000 UTC m=+19.816977004" observedRunningTime="2026-04-28 19:16:45.149821037 +0000 UTC m=+21.780721981" watchObservedRunningTime="2026-04-28 19:16:45.150104655 +0000 UTC m=+21.781005601" Apr 28 19:16:45.892417 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:45.892316 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-28T19:16:45.096356832Z","UUID":"b135d502-3d43-4893-b079-360a82eb4db5","Handler":null,"Name":"","Endpoint":""} Apr 28 19:16:45.894770 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:45.894744 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 28 19:16:45.894900 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:45.894781 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 28 19:16:45.942887 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:45.942856 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:45.943061 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:45.942974 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlvsf" podUID="caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b" Apr 28 19:16:46.385085 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:46.385033 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-76d57" Apr 28 19:16:46.385765 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:46.385652 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-76d57" Apr 28 19:16:46.943284 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:46.943074 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:46.943504 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:46.943384 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5qtkh" podUID="e46a06a4-894f-4f3d-a446-b501af6e42eb" Apr 28 19:16:47.121286 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:47.121249 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" event={"ID":"42108ecf-b2c5-4c69-b376-2e4a5f47a989","Type":"ContainerStarted","Data":"5b804cebc58c6c533d6e3c4f80144b7e3f45ba2ea1ba9f655839b9872d697798"} Apr 28 19:16:47.127271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:47.127250 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:16:47.128192 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:47.128161 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" event={"ID":"238e19ca-102f-43b1-8aed-9322ca47bfc9","Type":"ContainerStarted","Data":"7ee3e9d6197d2dd130602e3a629df0cfe69a467b161b6d9bf422f57d80838b1f"} Apr 28 19:16:47.142741 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:47.142691 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qgcmm" podStartSLOduration=3.126426824 podStartE2EDuration="23.142674935s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:16:26.451363638 +0000 UTC m=+3.082264572" lastFinishedPulling="2026-04-28 19:16:46.467611746 +0000 UTC m=+23.098512683" observedRunningTime="2026-04-28 19:16:47.141971015 +0000 UTC m=+23.772871961" watchObservedRunningTime="2026-04-28 19:16:47.142674935 +0000 UTC m=+23.773575882" Apr 28 19:16:47.942952 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:47.942908 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:47.943495 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:47.943061 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlvsf" podUID="caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b" Apr 28 19:16:48.943457 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:48.943368 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:48.944141 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:48.943523 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5qtkh" podUID="e46a06a4-894f-4f3d-a446-b501af6e42eb" Apr 28 19:16:49.896603 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:49.896402 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-76d57" Apr 28 19:16:49.896754 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:49.896685 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 28 19:16:49.897063 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:49.897035 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-76d57" Apr 28 19:16:49.942989 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:49.942968 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:49.943136 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:49.943062 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlvsf" podUID="caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b" Apr 28 19:16:50.136586 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:50.136562 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:16:50.137065 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:50.136901 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" event={"ID":"238e19ca-102f-43b1-8aed-9322ca47bfc9","Type":"ContainerStarted","Data":"5b6042d8c9b68b9ddd0a23b3fa61a34621ece708acd93e55d5e414abe83867ab"} Apr 28 19:16:50.137206 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:50.137185 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:50.137380 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:50.137361 2571 scope.go:117] "RemoveContainer" containerID="1b165f85a318b93203a1162d68f7bf1d02bceae5c85901247d86e6bf8258a83f" Apr 28 19:16:50.138570 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:50.138455 2571 generic.go:358] "Generic (PLEG): container finished" podID="d4a7a1d2-5229-47cb-b4b1-097846a273d7" containerID="994235d2db6d00adf2aa13fcb4dae9516333749df3318ec069f862ce9be4dc2c" exitCode=0 Apr 28 19:16:50.138570 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:50.138508 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-cwbdf" event={"ID":"d4a7a1d2-5229-47cb-b4b1-097846a273d7","Type":"ContainerDied","Data":"994235d2db6d00adf2aa13fcb4dae9516333749df3318ec069f862ce9be4dc2c"} Apr 28 19:16:50.153111 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:50.153059 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:50.943259 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:50.943231 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:50.943388 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:50.943322 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5qtkh" podUID="e46a06a4-894f-4f3d-a446-b501af6e42eb" Apr 28 19:16:51.143780 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:51.143712 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:16:51.144180 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:51.143998 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" event={"ID":"238e19ca-102f-43b1-8aed-9322ca47bfc9","Type":"ContainerStarted","Data":"bb9e977d458bf1768225031e6e47b8bc162bfd6913fa154f2c51dd622730e1f5"} Apr 28 19:16:51.144228 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:51.144202 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:51.144271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:51.144255 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:51.145980 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:51.145951 2571 generic.go:358] "Generic (PLEG): container finished" podID="d4a7a1d2-5229-47cb-b4b1-097846a273d7" containerID="9c882c57846c0cbfac247758365e99bf76b3d96c1d4218322ae2326a2b953be7" exitCode=0 Apr 28 19:16:51.146082 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:51.145995 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cwbdf" event={"ID":"d4a7a1d2-5229-47cb-b4b1-097846a273d7","Type":"ContainerDied","Data":"9c882c57846c0cbfac247758365e99bf76b3d96c1d4218322ae2326a2b953be7"} Apr 28 19:16:51.158589 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:51.158569 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:16:51.177437 ip-10-0-139-128 
kubenswrapper[2571]: I0428 19:16:51.177398 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" podStartSLOduration=10.064049492 podStartE2EDuration="27.177384835s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:16:26.446708778 +0000 UTC m=+3.077609715" lastFinishedPulling="2026-04-28 19:16:43.560044123 +0000 UTC m=+20.190945058" observedRunningTime="2026-04-28 19:16:51.177246156 +0000 UTC m=+27.808147101" watchObservedRunningTime="2026-04-28 19:16:51.177384835 +0000 UTC m=+27.808285780" Apr 28 19:16:51.945811 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:51.945785 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:51.945941 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:51.945887 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlvsf" podUID="caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b" Apr 28 19:16:52.149992 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:52.149906 2571 generic.go:358] "Generic (PLEG): container finished" podID="d4a7a1d2-5229-47cb-b4b1-097846a273d7" containerID="95c55d79cbea00481c2e16b8f2fe196fa00cb244c602a0d830a2916d09cb3a8a" exitCode=0 Apr 28 19:16:52.150424 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:52.149998 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cwbdf" event={"ID":"d4a7a1d2-5229-47cb-b4b1-097846a273d7","Type":"ContainerDied","Data":"95c55d79cbea00481c2e16b8f2fe196fa00cb244c602a0d830a2916d09cb3a8a"} Apr 28 19:16:52.943096 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:52.943063 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:52.943279 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:52.943179 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5qtkh" podUID="e46a06a4-894f-4f3d-a446-b501af6e42eb" Apr 28 19:16:53.946447 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:53.946233 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:53.946856 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:53.946684 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlvsf" podUID="caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b" Apr 28 19:16:54.116327 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:54.116292 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5qtkh"] Apr 28 19:16:54.116511 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:54.116448 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:54.116600 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:54.116574 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5qtkh" podUID="e46a06a4-894f-4f3d-a446-b501af6e42eb" Apr 28 19:16:54.155841 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:54.155741 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zlvsf"] Apr 28 19:16:54.156000 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:54.155860 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:54.156000 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:54.155979 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlvsf" podUID="caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b" Apr 28 19:16:55.942997 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:55.942962 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:55.942997 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:55.942977 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:55.943713 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:55.943089 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5qtkh" podUID="e46a06a4-894f-4f3d-a446-b501af6e42eb" Apr 28 19:16:55.943713 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:55.943234 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlvsf" podUID="caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b" Apr 28 19:16:57.569761 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.569672 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs\") pod \"network-metrics-daemon-zlvsf\" (UID: \"caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b\") " pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:16:57.570377 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:57.569836 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:57.570377 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:57.569918 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs podName:caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b nodeName:}" failed. No retries permitted until 2026-04-28 19:17:29.56988987 +0000 UTC m=+66.200790810 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs") pod "network-metrics-daemon-zlvsf" (UID: "caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:57.670993 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.670957 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pc96\" (UniqueName: \"kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96\") pod \"network-check-target-5qtkh\" (UID: \"e46a06a4-894f-4f3d-a446-b501af6e42eb\") " pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:16:57.671178 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:57.671144 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:57.671178 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:57.671163 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:57.671178 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:57.671176 2571 projected.go:194] Error preparing data for projected volume kube-api-access-6pc96 for pod openshift-network-diagnostics/network-check-target-5qtkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:57.671320 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:16:57.671239 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96 podName:e46a06a4-894f-4f3d-a446-b501af6e42eb nodeName:}" failed. 
No retries permitted until 2026-04-28 19:17:29.671220394 +0000 UTC m=+66.302121320 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-6pc96" (UniqueName: "kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96") pod "network-check-target-5qtkh" (UID: "e46a06a4-894f-4f3d-a446-b501af6e42eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:57.747253 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.747217 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-128.ec2.internal" event="NodeReady" Apr 28 19:16:57.747417 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.747375 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 28 19:16:57.832792 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.832701 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bczfz"] Apr 28 19:16:57.850364 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.850332 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8cwst"] Apr 28 19:16:57.850544 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.850490 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bczfz" Apr 28 19:16:57.852916 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.852700 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 28 19:16:57.852916 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.852836 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xdh6h\"" Apr 28 19:16:57.852916 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.852876 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 28 19:16:57.865362 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.865338 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8cwst"] Apr 28 19:16:57.865504 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.865400 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bczfz"] Apr 28 19:16:57.865504 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.865429 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-pd8ms"] Apr 28 19:16:57.865504 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.865498 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8cwst"
Apr 28 19:16:57.868102 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.868080 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 28 19:16:57.868216 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.868083 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 28 19:16:57.868271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.868235 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qbgzj\""
Apr 28 19:16:57.868309 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.868273 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 28 19:16:57.880195 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.880175 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pd8ms"]
Apr 28 19:16:57.880327 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.880315 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pd8ms"
Apr 28 19:16:57.882439 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.882418 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 28 19:16:57.882568 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.882453 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 28 19:16:57.882568 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.882501 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 28 19:16:57.882824 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.882809 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 28 19:16:57.882903 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.882827 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-tpf57\""
Apr 28 19:16:57.943217 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.943182 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh"
Apr 28 19:16:57.943365 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.943182 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf"
Apr 28 19:16:57.945605 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.945584 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 28 19:16:57.945605 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.945609 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-n7p66\""
Apr 28 19:16:57.945784 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.945706 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 28 19:16:57.945784 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.945718 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 28 19:16:57.945988 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.945975 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6mdm8\""
Apr 28 19:16:57.974140 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.974116 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkjj8\" (UniqueName: \"kubernetes.io/projected/4b4de6e2-0f57-4508-837c-5b18d4524864-kube-api-access-nkjj8\") pod \"ingress-canary-8cwst\" (UID: \"4b4de6e2-0f57-4508-837c-5b18d4524864\") " pod="openshift-ingress-canary/ingress-canary-8cwst"
Apr 28 19:16:57.974268 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.974150 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f525727f-5701-4a75-ae8d-ab2bea2bde16-data-volume\") pod \"insights-runtime-extractor-pd8ms\" (UID: \"f525727f-5701-4a75-ae8d-ab2bea2bde16\") " pod="openshift-insights/insights-runtime-extractor-pd8ms"
Apr 28 19:16:57.974268 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.974181 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsz6m\" (UniqueName: \"kubernetes.io/projected/0b1ddaed-b20b-4d05-9006-8d55a8bd05f8-kube-api-access-qsz6m\") pod \"dns-default-bczfz\" (UID: \"0b1ddaed-b20b-4d05-9006-8d55a8bd05f8\") " pod="openshift-dns/dns-default-bczfz"
Apr 28 19:16:57.974268 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.974250 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f525727f-5701-4a75-ae8d-ab2bea2bde16-crio-socket\") pod \"insights-runtime-extractor-pd8ms\" (UID: \"f525727f-5701-4a75-ae8d-ab2bea2bde16\") " pod="openshift-insights/insights-runtime-extractor-pd8ms"
Apr 28 19:16:57.974382 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.974281 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b1ddaed-b20b-4d05-9006-8d55a8bd05f8-metrics-tls\") pod \"dns-default-bczfz\" (UID: \"0b1ddaed-b20b-4d05-9006-8d55a8bd05f8\") " pod="openshift-dns/dns-default-bczfz"
Apr 28 19:16:57.974382 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.974298 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b4de6e2-0f57-4508-837c-5b18d4524864-cert\") pod \"ingress-canary-8cwst\" (UID: \"4b4de6e2-0f57-4508-837c-5b18d4524864\") " pod="openshift-ingress-canary/ingress-canary-8cwst"
Apr 28 19:16:57.974382 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.974325 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtj64\" (UniqueName: \"kubernetes.io/projected/f525727f-5701-4a75-ae8d-ab2bea2bde16-kube-api-access-vtj64\") pod \"insights-runtime-extractor-pd8ms\" (UID: \"f525727f-5701-4a75-ae8d-ab2bea2bde16\") " pod="openshift-insights/insights-runtime-extractor-pd8ms"
Apr 28 19:16:57.974382 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.974351 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f525727f-5701-4a75-ae8d-ab2bea2bde16-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pd8ms\" (UID: \"f525727f-5701-4a75-ae8d-ab2bea2bde16\") " pod="openshift-insights/insights-runtime-extractor-pd8ms"
Apr 28 19:16:57.974382 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.974369 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f525727f-5701-4a75-ae8d-ab2bea2bde16-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pd8ms\" (UID: \"f525727f-5701-4a75-ae8d-ab2bea2bde16\") " pod="openshift-insights/insights-runtime-extractor-pd8ms"
Apr 28 19:16:57.974561 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.974389 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1ddaed-b20b-4d05-9006-8d55a8bd05f8-config-volume\") pod \"dns-default-bczfz\" (UID: \"0b1ddaed-b20b-4d05-9006-8d55a8bd05f8\") " pod="openshift-dns/dns-default-bczfz"
Apr 28 19:16:57.974561 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:57.974444 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b1ddaed-b20b-4d05-9006-8d55a8bd05f8-tmp-dir\") pod \"dns-default-bczfz\" (UID: \"0b1ddaed-b20b-4d05-9006-8d55a8bd05f8\") " pod="openshift-dns/dns-default-bczfz"
Apr 28 19:16:58.075525 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.075473 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b1ddaed-b20b-4d05-9006-8d55a8bd05f8-tmp-dir\") pod \"dns-default-bczfz\" (UID: \"0b1ddaed-b20b-4d05-9006-8d55a8bd05f8\") " pod="openshift-dns/dns-default-bczfz"
Apr 28 19:16:58.075695 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.075531 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkjj8\" (UniqueName: \"kubernetes.io/projected/4b4de6e2-0f57-4508-837c-5b18d4524864-kube-api-access-nkjj8\") pod \"ingress-canary-8cwst\" (UID: \"4b4de6e2-0f57-4508-837c-5b18d4524864\") " pod="openshift-ingress-canary/ingress-canary-8cwst"
Apr 28 19:16:58.075695 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.075554 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f525727f-5701-4a75-ae8d-ab2bea2bde16-data-volume\") pod \"insights-runtime-extractor-pd8ms\" (UID: \"f525727f-5701-4a75-ae8d-ab2bea2bde16\") " pod="openshift-insights/insights-runtime-extractor-pd8ms"
Apr 28 19:16:58.075695 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.075575 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsz6m\" (UniqueName: \"kubernetes.io/projected/0b1ddaed-b20b-4d05-9006-8d55a8bd05f8-kube-api-access-qsz6m\") pod \"dns-default-bczfz\" (UID: \"0b1ddaed-b20b-4d05-9006-8d55a8bd05f8\") " pod="openshift-dns/dns-default-bczfz"
Apr 28 19:16:58.075695 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.075675 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f525727f-5701-4a75-ae8d-ab2bea2bde16-crio-socket\") pod \"insights-runtime-extractor-pd8ms\" (UID: \"f525727f-5701-4a75-ae8d-ab2bea2bde16\") " pod="openshift-insights/insights-runtime-extractor-pd8ms"
Apr 28 19:16:58.075900 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.075721 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b1ddaed-b20b-4d05-9006-8d55a8bd05f8-metrics-tls\") pod \"dns-default-bczfz\" (UID: \"0b1ddaed-b20b-4d05-9006-8d55a8bd05f8\") " pod="openshift-dns/dns-default-bczfz"
Apr 28 19:16:58.075900 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.075835 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b1ddaed-b20b-4d05-9006-8d55a8bd05f8-tmp-dir\") pod \"dns-default-bczfz\" (UID: \"0b1ddaed-b20b-4d05-9006-8d55a8bd05f8\") " pod="openshift-dns/dns-default-bczfz"
Apr 28 19:16:58.075900 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.075877 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b4de6e2-0f57-4508-837c-5b18d4524864-cert\") pod \"ingress-canary-8cwst\" (UID: \"4b4de6e2-0f57-4508-837c-5b18d4524864\") " pod="openshift-ingress-canary/ingress-canary-8cwst"
Apr 28 19:16:58.076048 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.075919 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtj64\" (UniqueName: \"kubernetes.io/projected/f525727f-5701-4a75-ae8d-ab2bea2bde16-kube-api-access-vtj64\") pod \"insights-runtime-extractor-pd8ms\" (UID: \"f525727f-5701-4a75-ae8d-ab2bea2bde16\") " pod="openshift-insights/insights-runtime-extractor-pd8ms"
Apr 28 19:16:58.076048 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.075934 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f525727f-5701-4a75-ae8d-ab2bea2bde16-crio-socket\") pod \"insights-runtime-extractor-pd8ms\" (UID: \"f525727f-5701-4a75-ae8d-ab2bea2bde16\") " pod="openshift-insights/insights-runtime-extractor-pd8ms"
Apr 28 19:16:58.076048 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.075935 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f525727f-5701-4a75-ae8d-ab2bea2bde16-data-volume\") pod \"insights-runtime-extractor-pd8ms\" (UID: \"f525727f-5701-4a75-ae8d-ab2bea2bde16\") " pod="openshift-insights/insights-runtime-extractor-pd8ms"
Apr 28 19:16:58.076048 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.075960 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f525727f-5701-4a75-ae8d-ab2bea2bde16-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pd8ms\" (UID: \"f525727f-5701-4a75-ae8d-ab2bea2bde16\") " pod="openshift-insights/insights-runtime-extractor-pd8ms"
Apr 28 19:16:58.076048 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.076013 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f525727f-5701-4a75-ae8d-ab2bea2bde16-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pd8ms\" (UID: \"f525727f-5701-4a75-ae8d-ab2bea2bde16\") " pod="openshift-insights/insights-runtime-extractor-pd8ms"
Apr 28 19:16:58.076273 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.076061 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1ddaed-b20b-4d05-9006-8d55a8bd05f8-config-volume\") pod \"dns-default-bczfz\" (UID: \"0b1ddaed-b20b-4d05-9006-8d55a8bd05f8\") " pod="openshift-dns/dns-default-bczfz"
Apr 28 19:16:58.076512 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.076474 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f525727f-5701-4a75-ae8d-ab2bea2bde16-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pd8ms\" (UID: \"f525727f-5701-4a75-ae8d-ab2bea2bde16\") " pod="openshift-insights/insights-runtime-extractor-pd8ms"
Apr 28 19:16:58.076596 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.076573 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1ddaed-b20b-4d05-9006-8d55a8bd05f8-config-volume\") pod \"dns-default-bczfz\" (UID: \"0b1ddaed-b20b-4d05-9006-8d55a8bd05f8\") " pod="openshift-dns/dns-default-bczfz"
Apr 28 19:16:58.079877 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.079852 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f525727f-5701-4a75-ae8d-ab2bea2bde16-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pd8ms\" (UID: \"f525727f-5701-4a75-ae8d-ab2bea2bde16\") " pod="openshift-insights/insights-runtime-extractor-pd8ms"
Apr 28 19:16:58.079965 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.079890 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b4de6e2-0f57-4508-837c-5b18d4524864-cert\") pod \"ingress-canary-8cwst\" (UID: \"4b4de6e2-0f57-4508-837c-5b18d4524864\") " pod="openshift-ingress-canary/ingress-canary-8cwst"
Apr 28 19:16:58.079965 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.079930 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b1ddaed-b20b-4d05-9006-8d55a8bd05f8-metrics-tls\") pod \"dns-default-bczfz\" (UID: \"0b1ddaed-b20b-4d05-9006-8d55a8bd05f8\") " pod="openshift-dns/dns-default-bczfz"
Apr 28 19:16:58.088036 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.087977 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtj64\" (UniqueName: \"kubernetes.io/projected/f525727f-5701-4a75-ae8d-ab2bea2bde16-kube-api-access-vtj64\") pod \"insights-runtime-extractor-pd8ms\" (UID: \"f525727f-5701-4a75-ae8d-ab2bea2bde16\") " pod="openshift-insights/insights-runtime-extractor-pd8ms"
Apr 28 19:16:58.088109 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.088091 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkjj8\" (UniqueName: \"kubernetes.io/projected/4b4de6e2-0f57-4508-837c-5b18d4524864-kube-api-access-nkjj8\") pod \"ingress-canary-8cwst\" (UID: \"4b4de6e2-0f57-4508-837c-5b18d4524864\") " pod="openshift-ingress-canary/ingress-canary-8cwst"
Apr 28 19:16:58.088441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.088425 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsz6m\" (UniqueName: \"kubernetes.io/projected/0b1ddaed-b20b-4d05-9006-8d55a8bd05f8-kube-api-access-qsz6m\") pod \"dns-default-bczfz\" (UID: \"0b1ddaed-b20b-4d05-9006-8d55a8bd05f8\") " pod="openshift-dns/dns-default-bczfz"
Apr 28 19:16:58.161089 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.161058 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bczfz"
Apr 28 19:16:58.175000 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.174978 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8cwst"
Apr 28 19:16:58.188632 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.188611 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pd8ms"
Apr 28 19:16:58.410429 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.409172 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pd8ms"]
Apr 28 19:16:58.413052 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.413026 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bczfz"]
Apr 28 19:16:58.414045 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:58.413723 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8cwst"]
Apr 28 19:16:58.415817 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:58.415790 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf525727f_5701_4a75_ae8d_ab2bea2bde16.slice/crio-f6d9f24b94d0259b1ea05dd611256a40fcf3365c44e8e80a32dc07d35c014792 WatchSource:0}: Error finding container f6d9f24b94d0259b1ea05dd611256a40fcf3365c44e8e80a32dc07d35c014792: Status 404 returned error can't find the container with id f6d9f24b94d0259b1ea05dd611256a40fcf3365c44e8e80a32dc07d35c014792
Apr 28 19:16:58.418688 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:58.418662 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b4de6e2_0f57_4508_837c_5b18d4524864.slice/crio-cd74df6f7b8f678943cd80044f30d5a965ac8e1a9e9accdd53d9dade6ccb84fa WatchSource:0}: Error finding container cd74df6f7b8f678943cd80044f30d5a965ac8e1a9e9accdd53d9dade6ccb84fa: Status 404 returned error can't find the container with id cd74df6f7b8f678943cd80044f30d5a965ac8e1a9e9accdd53d9dade6ccb84fa
Apr 28 19:16:58.419678 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:16:58.419658 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b1ddaed_b20b_4d05_9006_8d55a8bd05f8.slice/crio-b5f61a923ddbb7a2be0a94991c3e05e6f7d88ea8fcc113cdcb03c658d8b0bc95 WatchSource:0}: Error finding container b5f61a923ddbb7a2be0a94991c3e05e6f7d88ea8fcc113cdcb03c658d8b0bc95: Status 404 returned error can't find the container with id b5f61a923ddbb7a2be0a94991c3e05e6f7d88ea8fcc113cdcb03c658d8b0bc95
Apr 28 19:16:59.169107 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:59.169067 2571 generic.go:358] "Generic (PLEG): container finished" podID="d4a7a1d2-5229-47cb-b4b1-097846a273d7" containerID="8b8bf7ae1d832cd605c11eb5f0e238d169ff351bc62f8cf737248dc48c6d913a" exitCode=0
Apr 28 19:16:59.169715 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:59.169136 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cwbdf" event={"ID":"d4a7a1d2-5229-47cb-b4b1-097846a273d7","Type":"ContainerDied","Data":"8b8bf7ae1d832cd605c11eb5f0e238d169ff351bc62f8cf737248dc48c6d913a"}
Apr 28 19:16:59.171633 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:59.171580 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bczfz" event={"ID":"0b1ddaed-b20b-4d05-9006-8d55a8bd05f8","Type":"ContainerStarted","Data":"b5f61a923ddbb7a2be0a94991c3e05e6f7d88ea8fcc113cdcb03c658d8b0bc95"}
Apr 28 19:16:59.172883 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:59.172855 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8cwst" event={"ID":"4b4de6e2-0f57-4508-837c-5b18d4524864","Type":"ContainerStarted","Data":"cd74df6f7b8f678943cd80044f30d5a965ac8e1a9e9accdd53d9dade6ccb84fa"}
Apr 28 19:16:59.174568 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:59.174542 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pd8ms" event={"ID":"f525727f-5701-4a75-ae8d-ab2bea2bde16","Type":"ContainerStarted","Data":"6befd358a78cf987a67eb7bb8ec0b5c117c0fe1740c204ce5d0b750fd43f535a"}
Apr 28 19:16:59.174662 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:16:59.174574 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pd8ms" event={"ID":"f525727f-5701-4a75-ae8d-ab2bea2bde16","Type":"ContainerStarted","Data":"f6d9f24b94d0259b1ea05dd611256a40fcf3365c44e8e80a32dc07d35c014792"}
Apr 28 19:17:00.180034 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.180002 2571 generic.go:358] "Generic (PLEG): container finished" podID="d4a7a1d2-5229-47cb-b4b1-097846a273d7" containerID="828e515865aa5d519be609a452cd06a7495452628134da50fad453f67a4ff2b6" exitCode=0
Apr 28 19:17:00.180459 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.180055 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cwbdf" event={"ID":"d4a7a1d2-5229-47cb-b4b1-097846a273d7","Type":"ContainerDied","Data":"828e515865aa5d519be609a452cd06a7495452628134da50fad453f67a4ff2b6"}
Apr 28 19:17:00.736461 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.736432 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s"]
Apr 28 19:17:00.755193 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.755151 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s"]
Apr 28 19:17:00.755311 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.755276 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s"
Apr 28 19:17:00.758601 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.758575 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 28 19:17:00.758865 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.758842 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 28 19:17:00.758986 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.758848 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 28 19:17:00.759050 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.758991 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 28 19:17:00.759145 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.759126 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 28 19:17:00.759295 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.759279 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-sczf9\""
Apr 28 19:17:00.768910 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.767240 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tl8ms"]
Apr 28 19:17:00.782985 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.782957 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-zqwh5"]
Apr 28 19:17:00.783119 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.783104 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tl8ms"
Apr 28 19:17:00.785310 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.785291 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 28 19:17:00.785310 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.785300 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-x74v7\""
Apr 28 19:17:00.785471 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.785314 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 28 19:17:00.785605 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.785590 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 28 19:17:00.795245 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.795217 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5"
Apr 28 19:17:00.795827 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.795801 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-zqwh5"]
Apr 28 19:17:00.797289 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.797273 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 28 19:17:00.797593 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.797566 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-d2kh8\""
Apr 28 19:17:00.797820 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.797798 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 28 19:17:00.798159 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.798142 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 28 19:17:00.921812 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.919621 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f03f118d-0ede-40a8-a7cf-3c637824276d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5"
Apr 28 19:17:00.921812 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.919671 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/86939d9a-e349-408c-aad2-55e43a981aac-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-m7r5s\" (UID: \"86939d9a-e349-408c-aad2-55e43a981aac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s"
Apr 28 19:17:00.921812 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.919704 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6eb370a0-b2e9-42c0-acb9-92f39db33103-sys\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms"
Apr 28 19:17:00.921812 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.919789 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6eb370a0-b2e9-42c0-acb9-92f39db33103-root\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms"
Apr 28 19:17:00.923271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.922616 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f03f118d-0ede-40a8-a7cf-3c637824276d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5"
Apr 28 19:17:00.923271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.922688 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj8wn\" (UniqueName: \"kubernetes.io/projected/f03f118d-0ede-40a8-a7cf-3c637824276d-kube-api-access-jj8wn\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5"
Apr 28 19:17:00.923271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.922721 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f03f118d-0ede-40a8-a7cf-3c637824276d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5"
Apr 28 19:17:00.923271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.922747 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f03f118d-0ede-40a8-a7cf-3c637824276d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5"
Apr 28 19:17:00.923271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.922779 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6eb370a0-b2e9-42c0-acb9-92f39db33103-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms"
Apr 28 19:17:00.923271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.922823 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6eb370a0-b2e9-42c0-acb9-92f39db33103-node-exporter-wtmp\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms"
Apr 28 19:17:00.923271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.922846 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6eb370a0-b2e9-42c0-acb9-92f39db33103-metrics-client-ca\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms"
Apr 28 19:17:00.923271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.922873 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6eb370a0-b2e9-42c0-acb9-92f39db33103-node-exporter-textfile\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms"
Apr 28 19:17:00.923271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.922895 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6eb370a0-b2e9-42c0-acb9-92f39db33103-node-exporter-tls\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms"
Apr 28 19:17:00.923271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.922921 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6eb370a0-b2e9-42c0-acb9-92f39db33103-node-exporter-accelerators-collector-config\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms"
Apr 28 19:17:00.923271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.922956 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86939d9a-e349-408c-aad2-55e43a981aac-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-m7r5s\" (UID: \"86939d9a-e349-408c-aad2-55e43a981aac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s"
Apr 28 19:17:00.923271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.922979 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4hlh\" (UniqueName: \"kubernetes.io/projected/86939d9a-e349-408c-aad2-55e43a981aac-kube-api-access-b4hlh\") pod \"openshift-state-metrics-9d44df66c-m7r5s\" (UID: \"86939d9a-e349-408c-aad2-55e43a981aac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s"
Apr 28 19:17:00.923271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.923006 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f03f118d-0ede-40a8-a7cf-3c637824276d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5"
Apr 28 19:17:00.923271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.923031 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/86939d9a-e349-408c-aad2-55e43a981aac-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-m7r5s\" (UID: \"86939d9a-e349-408c-aad2-55e43a981aac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s"
Apr 28 19:17:00.923271 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:00.923062 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r42z\" (UniqueName: \"kubernetes.io/projected/6eb370a0-b2e9-42c0-acb9-92f39db33103-kube-api-access-6r42z\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms"
Apr 28 19:17:01.024369 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.023893 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/86939d9a-e349-408c-aad2-55e43a981aac-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-m7r5s\" (UID: \"86939d9a-e349-408c-aad2-55e43a981aac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s"
Apr 28 19:17:01.024369 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.023925 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6eb370a0-b2e9-42c0-acb9-92f39db33103-sys\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms"
Apr 28 19:17:01.024369 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.023953 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6eb370a0-b2e9-42c0-acb9-92f39db33103-root\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms"
Apr 28 19:17:01.024369 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024000 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6eb370a0-b2e9-42c0-acb9-92f39db33103-root\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms"
Apr 28 19:17:01.024369 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024035 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6eb370a0-b2e9-42c0-acb9-92f39db33103-sys\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms"
Apr 28 19:17:01.024369 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024112 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f03f118d-0ede-40a8-a7cf-3c637824276d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5"
Apr 28 19:17:01.024369 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024138 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jj8wn\" (UniqueName: \"kubernetes.io/projected/f03f118d-0ede-40a8-a7cf-3c637824276d-kube-api-access-jj8wn\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5"
Apr 28 19:17:01.024369 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024156 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f03f118d-0ede-40a8-a7cf-3c637824276d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5"
Apr 28 19:17:01.024369 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024172 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f03f118d-0ede-40a8-a7cf-3c637824276d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") "
pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5" Apr 28 19:17:01.024369 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024189 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6eb370a0-b2e9-42c0-acb9-92f39db33103-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms" Apr 28 19:17:01.024369 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024240 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6eb370a0-b2e9-42c0-acb9-92f39db33103-node-exporter-wtmp\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms" Apr 28 19:17:01.024369 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024259 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6eb370a0-b2e9-42c0-acb9-92f39db33103-metrics-client-ca\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms" Apr 28 19:17:01.024369 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024276 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6eb370a0-b2e9-42c0-acb9-92f39db33103-node-exporter-textfile\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms" Apr 28 19:17:01.024369 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024292 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/6eb370a0-b2e9-42c0-acb9-92f39db33103-node-exporter-tls\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms" Apr 28 19:17:01.024369 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024313 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6eb370a0-b2e9-42c0-acb9-92f39db33103-node-exporter-accelerators-collector-config\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms" Apr 28 19:17:01.024369 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024350 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86939d9a-e349-408c-aad2-55e43a981aac-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-m7r5s\" (UID: \"86939d9a-e349-408c-aad2-55e43a981aac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s" Apr 28 19:17:01.025366 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024387 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4hlh\" (UniqueName: \"kubernetes.io/projected/86939d9a-e349-408c-aad2-55e43a981aac-kube-api-access-b4hlh\") pod \"openshift-state-metrics-9d44df66c-m7r5s\" (UID: \"86939d9a-e349-408c-aad2-55e43a981aac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s" Apr 28 19:17:01.025366 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024420 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f03f118d-0ede-40a8-a7cf-3c637824276d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5" Apr 28 19:17:01.025366 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024443 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/86939d9a-e349-408c-aad2-55e43a981aac-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-m7r5s\" (UID: \"86939d9a-e349-408c-aad2-55e43a981aac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s" Apr 28 19:17:01.025366 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024470 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6r42z\" (UniqueName: \"kubernetes.io/projected/6eb370a0-b2e9-42c0-acb9-92f39db33103-kube-api-access-6r42z\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms" Apr 28 19:17:01.025366 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.024513 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f03f118d-0ede-40a8-a7cf-3c637824276d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5" Apr 28 19:17:01.025366 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.025010 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6eb370a0-b2e9-42c0-acb9-92f39db33103-node-exporter-textfile\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms" Apr 28 19:17:01.025366 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:17:01.025118 2571 secret.go:189] Couldn't get secret 
openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 28 19:17:01.025366 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:17:01.025175 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f03f118d-0ede-40a8-a7cf-3c637824276d-kube-state-metrics-tls podName:f03f118d-0ede-40a8-a7cf-3c637824276d nodeName:}" failed. No retries permitted until 2026-04-28 19:17:01.525154504 +0000 UTC m=+38.156055431 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f03f118d-0ede-40a8-a7cf-3c637824276d-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-zqwh5" (UID: "f03f118d-0ede-40a8-a7cf-3c637824276d") : secret "kube-state-metrics-tls" not found Apr 28 19:17:01.025841 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.025745 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f03f118d-0ede-40a8-a7cf-3c637824276d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5" Apr 28 19:17:01.026309 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.026213 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6eb370a0-b2e9-42c0-acb9-92f39db33103-node-exporter-wtmp\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms" Apr 28 19:17:01.026409 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:17:01.026369 2571 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 28 19:17:01.026468 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:17:01.026429 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/86939d9a-e349-408c-aad2-55e43a981aac-openshift-state-metrics-tls podName:86939d9a-e349-408c-aad2-55e43a981aac nodeName:}" failed. No retries permitted until 2026-04-28 19:17:01.526412909 +0000 UTC m=+38.157313841 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/86939d9a-e349-408c-aad2-55e43a981aac-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-m7r5s" (UID: "86939d9a-e349-408c-aad2-55e43a981aac") : secret "openshift-state-metrics-tls" not found Apr 28 19:17:01.027000 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.026736 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6eb370a0-b2e9-42c0-acb9-92f39db33103-metrics-client-ca\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms" Apr 28 19:17:01.027000 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.026945 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6eb370a0-b2e9-42c0-acb9-92f39db33103-node-exporter-accelerators-collector-config\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms" Apr 28 19:17:01.027323 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.027301 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86939d9a-e349-408c-aad2-55e43a981aac-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-m7r5s\" (UID: \"86939d9a-e349-408c-aad2-55e43a981aac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s" Apr 28 19:17:01.029545 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.029518 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6eb370a0-b2e9-42c0-acb9-92f39db33103-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms" Apr 28 19:17:01.029651 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.029595 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f03f118d-0ede-40a8-a7cf-3c637824276d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5" Apr 28 19:17:01.029755 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.029739 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/86939d9a-e349-408c-aad2-55e43a981aac-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-m7r5s\" (UID: \"86939d9a-e349-408c-aad2-55e43a981aac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s" Apr 28 19:17:01.031139 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.031111 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6eb370a0-b2e9-42c0-acb9-92f39db33103-node-exporter-tls\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms" Apr 28 19:17:01.035688 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.035636 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj8wn\" (UniqueName: \"kubernetes.io/projected/f03f118d-0ede-40a8-a7cf-3c637824276d-kube-api-access-jj8wn\") pod 
\"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5" Apr 28 19:17:01.038605 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.038542 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f03f118d-0ede-40a8-a7cf-3c637824276d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5" Apr 28 19:17:01.039021 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.038965 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f03f118d-0ede-40a8-a7cf-3c637824276d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5" Apr 28 19:17:01.039226 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.039204 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4hlh\" (UniqueName: \"kubernetes.io/projected/86939d9a-e349-408c-aad2-55e43a981aac-kube-api-access-b4hlh\") pod \"openshift-state-metrics-9d44df66c-m7r5s\" (UID: \"86939d9a-e349-408c-aad2-55e43a981aac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s" Apr 28 19:17:01.040505 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.040470 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r42z\" (UniqueName: \"kubernetes.io/projected/6eb370a0-b2e9-42c0-acb9-92f39db33103-kube-api-access-6r42z\") pod \"node-exporter-tl8ms\" (UID: \"6eb370a0-b2e9-42c0-acb9-92f39db33103\") " pod="openshift-monitoring/node-exporter-tl8ms" Apr 28 19:17:01.091600 ip-10-0-139-128 kubenswrapper[2571]: 
I0428 19:17:01.091560 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tl8ms" Apr 28 19:17:01.101524 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:17:01.101470 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eb370a0_b2e9_42c0_acb9_92f39db33103.slice/crio-5dea1e4f67885349cdf174e1f2ea9831db4ef62459c7bac49f2a7fef2492e9c6 WatchSource:0}: Error finding container 5dea1e4f67885349cdf174e1f2ea9831db4ef62459c7bac49f2a7fef2492e9c6: Status 404 returned error can't find the container with id 5dea1e4f67885349cdf174e1f2ea9831db4ef62459c7bac49f2a7fef2492e9c6 Apr 28 19:17:01.184020 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.183981 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tl8ms" event={"ID":"6eb370a0-b2e9-42c0-acb9-92f39db33103","Type":"ContainerStarted","Data":"5dea1e4f67885349cdf174e1f2ea9831db4ef62459c7bac49f2a7fef2492e9c6"} Apr 28 19:17:01.188231 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.188196 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cwbdf" event={"ID":"d4a7a1d2-5229-47cb-b4b1-097846a273d7","Type":"ContainerStarted","Data":"0c6462d4d363975181b576d7fce95da6778430a36a776d440d760932a50efe45"} Apr 28 19:17:01.191233 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.190977 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bczfz" event={"ID":"0b1ddaed-b20b-4d05-9006-8d55a8bd05f8","Type":"ContainerStarted","Data":"ff2f8a05839c8eec08d0840e9c87507be9b2353716aab600cf4a90f0bff734af"} Apr 28 19:17:01.193770 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.193638 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8cwst" 
event={"ID":"4b4de6e2-0f57-4508-837c-5b18d4524864","Type":"ContainerStarted","Data":"0ab702bc5e5e3cb4475af5204758123504c1543ade5e0958063244382c89ef33"} Apr 28 19:17:01.196836 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.196785 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pd8ms" event={"ID":"f525727f-5701-4a75-ae8d-ab2bea2bde16","Type":"ContainerStarted","Data":"a14381acc289819d32bade5ac651646b346d65036c0d6fa842de8710463352e3"} Apr 28 19:17:01.224696 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.224641 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cwbdf" podStartSLOduration=5.414856682 podStartE2EDuration="37.224626101s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:16:26.441750926 +0000 UTC m=+3.072651856" lastFinishedPulling="2026-04-28 19:16:58.251520336 +0000 UTC m=+34.882421275" observedRunningTime="2026-04-28 19:17:01.219173973 +0000 UTC m=+37.850074929" watchObservedRunningTime="2026-04-28 19:17:01.224626101 +0000 UTC m=+37.855527046" Apr 28 19:17:01.240041 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.239997 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8cwst" podStartSLOduration=1.810578684 podStartE2EDuration="4.239985817s" podCreationTimestamp="2026-04-28 19:16:57 +0000 UTC" firstStartedPulling="2026-04-28 19:16:58.420560838 +0000 UTC m=+35.051461761" lastFinishedPulling="2026-04-28 19:17:00.849967971 +0000 UTC m=+37.480868894" observedRunningTime="2026-04-28 19:17:01.239149306 +0000 UTC m=+37.870050251" watchObservedRunningTime="2026-04-28 19:17:01.239985817 +0000 UTC m=+37.870886761" Apr 28 19:17:01.529847 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.529797 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/86939d9a-e349-408c-aad2-55e43a981aac-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-m7r5s\" (UID: \"86939d9a-e349-408c-aad2-55e43a981aac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s" Apr 28 19:17:01.530034 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.529888 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f03f118d-0ede-40a8-a7cf-3c637824276d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5" Apr 28 19:17:01.533042 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.532779 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f03f118d-0ede-40a8-a7cf-3c637824276d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-zqwh5\" (UID: \"f03f118d-0ede-40a8-a7cf-3c637824276d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5" Apr 28 19:17:01.533042 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.532811 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/86939d9a-e349-408c-aad2-55e43a981aac-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-m7r5s\" (UID: \"86939d9a-e349-408c-aad2-55e43a981aac\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s" Apr 28 19:17:01.664980 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.664919 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s" Apr 28 19:17:01.703842 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.703805 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5" Apr 28 19:17:01.830518 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.830320 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s"] Apr 28 19:17:01.836077 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:17:01.836044 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86939d9a_e349_408c_aad2_55e43a981aac.slice/crio-a88f9fae0981816e54eb865bee135fa1a5231903ea3e9fb000fcdaec2fc7cafa WatchSource:0}: Error finding container a88f9fae0981816e54eb865bee135fa1a5231903ea3e9fb000fcdaec2fc7cafa: Status 404 returned error can't find the container with id a88f9fae0981816e54eb865bee135fa1a5231903ea3e9fb000fcdaec2fc7cafa Apr 28 19:17:01.864121 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:01.864088 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-zqwh5"] Apr 28 19:17:01.869105 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:17:01.869062 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf03f118d_0ede_40a8_a7cf_3c637824276d.slice/crio-7f921d9f15acdc726e54cae862763ddb0850ce53a38c79f5ae361c1de045fd5a WatchSource:0}: Error finding container 7f921d9f15acdc726e54cae862763ddb0850ce53a38c79f5ae361c1de045fd5a: Status 404 returned error can't find the container with id 7f921d9f15acdc726e54cae862763ddb0850ce53a38c79f5ae361c1de045fd5a Apr 28 19:17:02.201403 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:02.201362 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bczfz" event={"ID":"0b1ddaed-b20b-4d05-9006-8d55a8bd05f8","Type":"ContainerStarted","Data":"eee0253442508ba3c81112eb20b3c4a63818fbb6af646d5b658a40cf898a06db"} Apr 28 19:17:02.201935 ip-10-0-139-128 kubenswrapper[2571]: 
I0428 19:17:02.201533 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-bczfz" Apr 28 19:17:02.202421 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:02.202397 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5" event={"ID":"f03f118d-0ede-40a8-a7cf-3c637824276d","Type":"ContainerStarted","Data":"7f921d9f15acdc726e54cae862763ddb0850ce53a38c79f5ae361c1de045fd5a"} Apr 28 19:17:02.203676 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:02.203652 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s" event={"ID":"86939d9a-e349-408c-aad2-55e43a981aac","Type":"ContainerStarted","Data":"dd627d0488e482d90af198d1f020325f85c0ca406b51b780ebc00e6591979d01"} Apr 28 19:17:02.203676 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:02.203677 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s" event={"ID":"86939d9a-e349-408c-aad2-55e43a981aac","Type":"ContainerStarted","Data":"a88f9fae0981816e54eb865bee135fa1a5231903ea3e9fb000fcdaec2fc7cafa"} Apr 28 19:17:02.220024 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:02.219984 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bczfz" podStartSLOduration=2.79430784 podStartE2EDuration="5.219969913s" podCreationTimestamp="2026-04-28 19:16:57 +0000 UTC" firstStartedPulling="2026-04-28 19:16:58.421555804 +0000 UTC m=+35.052456732" lastFinishedPulling="2026-04-28 19:17:00.847217876 +0000 UTC m=+37.478118805" observedRunningTime="2026-04-28 19:17:02.21930677 +0000 UTC m=+38.850207717" watchObservedRunningTime="2026-04-28 19:17:02.219969913 +0000 UTC m=+38.850870892" Apr 28 19:17:03.208827 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.208783 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s" event={"ID":"86939d9a-e349-408c-aad2-55e43a981aac","Type":"ContainerStarted","Data":"5a2d398fbce86bed9544860686ce694011db65e4db4687d22ac43ee99d26d513"} Apr 28 19:17:03.210077 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.210051 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tl8ms" event={"ID":"6eb370a0-b2e9-42c0-acb9-92f39db33103","Type":"ContainerStarted","Data":"66974751326cbe5e664f5699e5a66c72c1ce7e4e4e1975d125eceddef7447ffc"} Apr 28 19:17:03.212323 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.212296 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pd8ms" event={"ID":"f525727f-5701-4a75-ae8d-ab2bea2bde16","Type":"ContainerStarted","Data":"f90f7b922843267ab7fd4e5b9b3955389fa5e277744ac3aec6fb98dd42792911"} Apr 28 19:17:03.247551 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.246051 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-pd8ms" podStartSLOduration=1.722731582 podStartE2EDuration="6.246034742s" podCreationTimestamp="2026-04-28 19:16:57 +0000 UTC" firstStartedPulling="2026-04-28 19:16:58.497791105 +0000 UTC m=+35.128692029" lastFinishedPulling="2026-04-28 19:17:03.021094251 +0000 UTC m=+39.651995189" observedRunningTime="2026-04-28 19:17:03.245150257 +0000 UTC m=+39.876051203" watchObservedRunningTime="2026-04-28 19:17:03.246034742 +0000 UTC m=+39.876935687" Apr 28 19:17:03.801199 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.801159 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n"] Apr 28 19:17:03.807192 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.806505 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.809314 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.809262 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 28 19:17:03.809655 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.809629 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 28 19:17:03.810176 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.810155 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 28 19:17:03.810372 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.810342 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-g2jpb\"" Apr 28 19:17:03.811067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.811041 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 28 19:17:03.811373 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.811354 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 28 19:17:03.811914 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.811784 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-81i9r2bqbpd6c\"" Apr 28 19:17:03.826241 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.826216 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n"] Apr 28 19:17:03.848350 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.848327 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-thanos-querier-tls\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.848470 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.848366 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45277e70-f2c6-4e33-97af-9cb5f76dee0b-metrics-client-ca\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.848470 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.848386 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-grpc-tls\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.848590 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.848461 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.848590 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.848536 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.848675 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.848610 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.848675 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.848632 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.848760 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.848676 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mddnw\" (UniqueName: \"kubernetes.io/projected/45277e70-f2c6-4e33-97af-9cb5f76dee0b-kube-api-access-mddnw\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.949025 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.948995 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: 
\"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-thanos-querier-tls\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.949201 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.949041 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45277e70-f2c6-4e33-97af-9cb5f76dee0b-metrics-client-ca\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.949201 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.949067 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-grpc-tls\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.949201 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.949106 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.949201 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.949160 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " 
pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.949992 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.949502 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.949992 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.949550 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.949992 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.949609 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mddnw\" (UniqueName: \"kubernetes.io/projected/45277e70-f2c6-4e33-97af-9cb5f76dee0b-kube-api-access-mddnw\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.950220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.950139 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45277e70-f2c6-4e33-97af-9cb5f76dee0b-metrics-client-ca\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.953870 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.953842 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.954001 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.953974 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-thanos-querier-tls\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.954099 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.954014 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.954099 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.954014 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-grpc-tls\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.954745 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.954725 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.955707 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.955681 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/45277e70-f2c6-4e33-97af-9cb5f76dee0b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:03.957136 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:03.957107 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mddnw\" (UniqueName: \"kubernetes.io/projected/45277e70-f2c6-4e33-97af-9cb5f76dee0b-kube-api-access-mddnw\") pod \"thanos-querier-67c8d86c4f-d6x8n\" (UID: \"45277e70-f2c6-4e33-97af-9cb5f76dee0b\") " pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:04.119343 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.119303 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:04.215498 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.215389 2571 generic.go:358] "Generic (PLEG): container finished" podID="6eb370a0-b2e9-42c0-acb9-92f39db33103" containerID="66974751326cbe5e664f5699e5a66c72c1ce7e4e4e1975d125eceddef7447ffc" exitCode=0 Apr 28 19:17:04.215927 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.215497 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tl8ms" event={"ID":"6eb370a0-b2e9-42c0-acb9-92f39db33103","Type":"ContainerDied","Data":"66974751326cbe5e664f5699e5a66c72c1ce7e4e4e1975d125eceddef7447ffc"} Apr 28 19:17:04.409257 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.409231 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n"] Apr 28 19:17:04.604686 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.604658 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c456b96cd-fdnz7"] Apr 28 19:17:04.607866 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.607846 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.610134 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.610098 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-ffxhh\"" Apr 28 19:17:04.610258 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.610104 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 28 19:17:04.610258 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.610236 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 28 19:17:04.610389 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.610376 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 28 19:17:04.610472 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.610453 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 28 19:17:04.610576 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.610560 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 28 19:17:04.610696 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.610682 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 28 19:17:04.610826 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.610795 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 28 19:17:04.616302 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.616281 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 28 19:17:04.617217 ip-10-0-139-128 
kubenswrapper[2571]: I0428 19:17:04.617198 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c456b96cd-fdnz7"] Apr 28 19:17:04.656798 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.656769 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-serving-cert\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.656962 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.656827 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-config\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.656962 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.656849 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-service-ca\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.656962 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.656885 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hllw5\" (UniqueName: \"kubernetes.io/projected/0eea4846-7972-4b6c-8b2a-58bce3c1e353-kube-api-access-hllw5\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.656962 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.656944 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-oauth-serving-cert\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.656962 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.656962 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-oauth-config\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.657209 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.656988 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-trusted-ca-bundle\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.758393 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.758293 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-serving-cert\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.758393 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.758357 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-config\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " 
pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.758393 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.758378 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-service-ca\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.758393 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.758393 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hllw5\" (UniqueName: \"kubernetes.io/projected/0eea4846-7972-4b6c-8b2a-58bce3c1e353-kube-api-access-hllw5\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.758724 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.758412 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-oauth-serving-cert\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.758724 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.758431 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-oauth-config\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.758724 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.758461 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-trusted-ca-bundle\") pod 
\"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.759233 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.759207 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-service-ca\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.759361 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.759294 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-config\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.759361 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.759297 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-oauth-serving-cert\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.759537 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.759522 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-trusted-ca-bundle\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.760899 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.760877 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-oauth-config\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.761409 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.761395 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-serving-cert\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.783563 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.783538 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hllw5\" (UniqueName: \"kubernetes.io/projected/0eea4846-7972-4b6c-8b2a-58bce3c1e353-kube-api-access-hllw5\") pod \"console-c456b96cd-fdnz7\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:04.919013 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:04.918971 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:05.041346 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.041278 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c456b96cd-fdnz7"] Apr 28 19:17:05.044765 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:17:05.044737 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eea4846_7972_4b6c_8b2a_58bce3c1e353.slice/crio-643a043bb262e8669ecd7b74d2b077b7aac7bed215c72c5b2ae36d5d219d1d87 WatchSource:0}: Error finding container 643a043bb262e8669ecd7b74d2b077b7aac7bed215c72c5b2ae36d5d219d1d87: Status 404 returned error can't find the container with id 643a043bb262e8669ecd7b74d2b077b7aac7bed215c72c5b2ae36d5d219d1d87 Apr 28 19:17:05.220828 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.220772 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c456b96cd-fdnz7" event={"ID":"0eea4846-7972-4b6c-8b2a-58bce3c1e353","Type":"ContainerStarted","Data":"643a043bb262e8669ecd7b74d2b077b7aac7bed215c72c5b2ae36d5d219d1d87"} Apr 28 19:17:05.223021 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.222992 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5" event={"ID":"f03f118d-0ede-40a8-a7cf-3c637824276d","Type":"ContainerStarted","Data":"3a12fc669da301750060680e262cbc70b8416d180ea76dec06526dadae412478"} Apr 28 19:17:05.223152 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.223027 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5" event={"ID":"f03f118d-0ede-40a8-a7cf-3c637824276d","Type":"ContainerStarted","Data":"f5a0a7ec38006541a126b3e057c0136b5aee17e770851faa047448200f1a8c87"} Apr 28 19:17:05.223152 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.223037 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5" event={"ID":"f03f118d-0ede-40a8-a7cf-3c637824276d","Type":"ContainerStarted","Data":"8d0fc0edd728cf07a15f7cec01005a587296ef0e415ab8e486b5e12d88fa2765"} Apr 28 19:17:05.225047 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.225015 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s" event={"ID":"86939d9a-e349-408c-aad2-55e43a981aac","Type":"ContainerStarted","Data":"43a1824deb88dd97d43244d94a67049cf95dfb30ef8188223fbf9ea49f37ccfc"} Apr 28 19:17:05.226411 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.226383 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" event={"ID":"45277e70-f2c6-4e33-97af-9cb5f76dee0b","Type":"ContainerStarted","Data":"6fbf7e858d817d0129916c6d8f2052d403be09e721c07d8bc98917a3f7660d54"} Apr 28 19:17:05.228298 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.228272 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tl8ms" event={"ID":"6eb370a0-b2e9-42c0-acb9-92f39db33103","Type":"ContainerStarted","Data":"5764537c9ecee6089894e6c13eea2bbe8a0020ea78dd308135fd9c63978cfcb9"} Apr 28 19:17:05.228298 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.228296 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tl8ms" event={"ID":"6eb370a0-b2e9-42c0-acb9-92f39db33103","Type":"ContainerStarted","Data":"307cab8b4e153d81417776cb6ff49cb58f5a3a07c6125060ccae36cb213700a6"} Apr 28 19:17:05.242386 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.242345 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-zqwh5" podStartSLOduration=2.84645259 podStartE2EDuration="5.242333076s" podCreationTimestamp="2026-04-28 19:17:00 +0000 UTC" firstStartedPulling="2026-04-28 19:17:01.871648428 +0000 UTC 
m=+38.502549364" lastFinishedPulling="2026-04-28 19:17:04.267528928 +0000 UTC m=+40.898429850" observedRunningTime="2026-04-28 19:17:05.241503467 +0000 UTC m=+41.872404409" watchObservedRunningTime="2026-04-28 19:17:05.242333076 +0000 UTC m=+41.873234062" Apr 28 19:17:05.261812 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.261768 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m7r5s" podStartSLOduration=4.088451915 podStartE2EDuration="5.261755684s" podCreationTimestamp="2026-04-28 19:17:00 +0000 UTC" firstStartedPulling="2026-04-28 19:17:03.095628084 +0000 UTC m=+39.726529022" lastFinishedPulling="2026-04-28 19:17:04.268931868 +0000 UTC m=+40.899832791" observedRunningTime="2026-04-28 19:17:05.260728343 +0000 UTC m=+41.891629288" watchObservedRunningTime="2026-04-28 19:17:05.261755684 +0000 UTC m=+41.892656630" Apr 28 19:17:05.288214 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.288169 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tl8ms" podStartSLOduration=3.372062316 podStartE2EDuration="5.288156075s" podCreationTimestamp="2026-04-28 19:17:00 +0000 UTC" firstStartedPulling="2026-04-28 19:17:01.103381588 +0000 UTC m=+37.734282512" lastFinishedPulling="2026-04-28 19:17:03.019475338 +0000 UTC m=+39.650376271" observedRunningTime="2026-04-28 19:17:05.286790327 +0000 UTC m=+41.917691322" watchObservedRunningTime="2026-04-28 19:17:05.288156075 +0000 UTC m=+41.919057020" Apr 28 19:17:05.506845 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.506811 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-wqhlw"] Apr 28 19:17:05.510448 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.510361 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wqhlw" Apr 28 19:17:05.512739 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.512704 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 28 19:17:05.512739 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.512720 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-9hxmf\"" Apr 28 19:17:05.518821 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.518698 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-wqhlw"] Apr 28 19:17:05.565612 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.565581 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10b46dea-b1bd-4fe9-b096-d027eda0809d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wqhlw\" (UID: \"10b46dea-b1bd-4fe9-b096-d027eda0809d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wqhlw" Apr 28 19:17:05.666972 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.666877 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10b46dea-b1bd-4fe9-b096-d027eda0809d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wqhlw\" (UID: \"10b46dea-b1bd-4fe9-b096-d027eda0809d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wqhlw" Apr 28 19:17:05.667160 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:17:05.667008 2571 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 28 19:17:05.667160 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:17:05.667088 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/10b46dea-b1bd-4fe9-b096-d027eda0809d-monitoring-plugin-cert podName:10b46dea-b1bd-4fe9-b096-d027eda0809d nodeName:}" failed. No retries permitted until 2026-04-28 19:17:06.167069524 +0000 UTC m=+42.797970453 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/10b46dea-b1bd-4fe9-b096-d027eda0809d-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-wqhlw" (UID: "10b46dea-b1bd-4fe9-b096-d027eda0809d") : secret "monitoring-plugin-cert" not found
Apr 28 19:17:05.982131 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.982085 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6475644cb-ndpb2"]
Apr 28 19:17:05.985688 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.985661 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:05.988146 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.988118 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 28 19:17:05.988291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.988229 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-k88vf\""
Apr 28 19:17:05.988291 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.988242 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 28 19:17:05.988407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.988398 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 28 19:17:05.988681 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.988555 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 28 19:17:05.988681 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.988572 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 28 19:17:05.997954 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.997918 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6475644cb-ndpb2"]
Apr 28 19:17:05.998497 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:05.998454 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 28 19:17:06.070011 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.069977 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3807a95d-ac8a-42a9-95bf-87514836c9be-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.070202 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.070032 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3807a95d-ac8a-42a9-95bf-87514836c9be-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.070202 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.070117 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3807a95d-ac8a-42a9-95bf-87514836c9be-telemeter-client-tls\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.070202 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.070154 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3807a95d-ac8a-42a9-95bf-87514836c9be-metrics-client-ca\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.070202 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.070186 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjlxh\" (UniqueName: \"kubernetes.io/projected/3807a95d-ac8a-42a9-95bf-87514836c9be-kube-api-access-cjlxh\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.070394 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.070215 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3807a95d-ac8a-42a9-95bf-87514836c9be-federate-client-tls\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.070394 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.070285 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3807a95d-ac8a-42a9-95bf-87514836c9be-secret-telemeter-client\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.070394 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.070327 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3807a95d-ac8a-42a9-95bf-87514836c9be-serving-certs-ca-bundle\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.172129 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.171774 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjlxh\" (UniqueName: \"kubernetes.io/projected/3807a95d-ac8a-42a9-95bf-87514836c9be-kube-api-access-cjlxh\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.172129 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.171841 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3807a95d-ac8a-42a9-95bf-87514836c9be-federate-client-tls\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.172129 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.171896 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3807a95d-ac8a-42a9-95bf-87514836c9be-secret-telemeter-client\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.172129 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.171944 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3807a95d-ac8a-42a9-95bf-87514836c9be-serving-certs-ca-bundle\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.172129 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.171991 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10b46dea-b1bd-4fe9-b096-d027eda0809d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wqhlw\" (UID: \"10b46dea-b1bd-4fe9-b096-d027eda0809d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wqhlw"
Apr 28 19:17:06.172129 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.172037 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3807a95d-ac8a-42a9-95bf-87514836c9be-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.172129 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.172087 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3807a95d-ac8a-42a9-95bf-87514836c9be-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.172129 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.172116 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3807a95d-ac8a-42a9-95bf-87514836c9be-telemeter-client-tls\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.172713 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.172141 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3807a95d-ac8a-42a9-95bf-87514836c9be-metrics-client-ca\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.173956 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.172974 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3807a95d-ac8a-42a9-95bf-87514836c9be-metrics-client-ca\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.173956 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.173417 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3807a95d-ac8a-42a9-95bf-87514836c9be-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.173956 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.173908 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3807a95d-ac8a-42a9-95bf-87514836c9be-serving-certs-ca-bundle\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.175908 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.175863 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3807a95d-ac8a-42a9-95bf-87514836c9be-secret-telemeter-client\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.176515 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.176456 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3807a95d-ac8a-42a9-95bf-87514836c9be-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.176632 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.176559 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10b46dea-b1bd-4fe9-b096-d027eda0809d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wqhlw\" (UID: \"10b46dea-b1bd-4fe9-b096-d027eda0809d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wqhlw"
Apr 28 19:17:06.177334 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.177294 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3807a95d-ac8a-42a9-95bf-87514836c9be-telemeter-client-tls\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.177872 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.177846 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3807a95d-ac8a-42a9-95bf-87514836c9be-federate-client-tls\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.181265 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.181245 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjlxh\" (UniqueName: \"kubernetes.io/projected/3807a95d-ac8a-42a9-95bf-87514836c9be-kube-api-access-cjlxh\") pod \"telemeter-client-6475644cb-ndpb2\" (UID: \"3807a95d-ac8a-42a9-95bf-87514836c9be\") " pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.297117 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.297041 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2"
Apr 28 19:17:06.422643 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:06.422605 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wqhlw"
Apr 28 19:17:07.021689 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:07.021666 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6475644cb-ndpb2"]
Apr 28 19:17:07.039344 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:07.039295 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-wqhlw"]
Apr 28 19:17:07.236591 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:07.236550 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" event={"ID":"45277e70-f2c6-4e33-97af-9cb5f76dee0b","Type":"ContainerStarted","Data":"10b7e41363be051bc7b231865860da2cf77e0db0f8ee2b2719c8dbcf1549284f"}
Apr 28 19:17:08.033684 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:17:08.033651 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3807a95d_ac8a_42a9_95bf_87514836c9be.slice/crio-b2907d5892d71eeed0f2bdd25a522fc3b5c3a86ff5851aa0ef11fdf014bef820 WatchSource:0}: Error finding container b2907d5892d71eeed0f2bdd25a522fc3b5c3a86ff5851aa0ef11fdf014bef820: Status 404 returned error can't find the container with id b2907d5892d71eeed0f2bdd25a522fc3b5c3a86ff5851aa0ef11fdf014bef820
Apr 28 19:17:08.034501 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:17:08.034177 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10b46dea_b1bd_4fe9_b096_d027eda0809d.slice/crio-f0f7252222000ecbf11c0c58065e32133618ff88b093024ca79486272b2ad586 WatchSource:0}: Error finding container f0f7252222000ecbf11c0c58065e32133618ff88b093024ca79486272b2ad586: Status 404 returned error can't find the container with id f0f7252222000ecbf11c0c58065e32133618ff88b093024ca79486272b2ad586
Apr 28 19:17:08.240814 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:08.240783 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2" event={"ID":"3807a95d-ac8a-42a9-95bf-87514836c9be","Type":"ContainerStarted","Data":"b2907d5892d71eeed0f2bdd25a522fc3b5c3a86ff5851aa0ef11fdf014bef820"}
Apr 28 19:17:08.241946 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:08.241912 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wqhlw" event={"ID":"10b46dea-b1bd-4fe9-b096-d027eda0809d","Type":"ContainerStarted","Data":"f0f7252222000ecbf11c0c58065e32133618ff88b093024ca79486272b2ad586"}
Apr 28 19:17:08.243202 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:08.243183 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c456b96cd-fdnz7" event={"ID":"0eea4846-7972-4b6c-8b2a-58bce3c1e353","Type":"ContainerStarted","Data":"9d05e82453a40189e68c02eb56a8d8af67bb2378746580820a6706be513b6360"}
Apr 28 19:17:08.245182 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:08.245159 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" event={"ID":"45277e70-f2c6-4e33-97af-9cb5f76dee0b","Type":"ContainerStarted","Data":"31a1f2cb7656f4b0a9311f079308921299422e4245a370ba2f9b7e11cb0ebfbc"}
Apr 28 19:17:08.245259 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:08.245186 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" event={"ID":"45277e70-f2c6-4e33-97af-9cb5f76dee0b","Type":"ContainerStarted","Data":"55e701333f50fd4a80400d286223adfb0ecea2d82bbca24134905e55a4685031"}
Apr 28 19:17:08.270200 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:08.270156 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c456b96cd-fdnz7" podStartSLOduration=1.232188765 podStartE2EDuration="4.270143277s" podCreationTimestamp="2026-04-28 19:17:04 +0000 UTC" firstStartedPulling="2026-04-28 19:17:05.046685817 +0000 UTC m=+41.677586740" lastFinishedPulling="2026-04-28 19:17:08.084640325 +0000 UTC m=+44.715541252" observedRunningTime="2026-04-28 19:17:08.268381025 +0000 UTC m=+44.899281970" watchObservedRunningTime="2026-04-28 19:17:08.270143277 +0000 UTC m=+44.901044224"
Apr 28 19:17:09.815577 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:09.815535 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c456b96cd-fdnz7"]
Apr 28 19:17:09.854115 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:09.854080 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-f56f4f89b-6xgsz"]
Apr 28 19:17:09.882421 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:09.882327 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f56f4f89b-6xgsz"]
Apr 28 19:17:09.882594 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:09.882470 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:09.904376 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:09.904344 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-trusted-ca-bundle\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:09.904535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:09.904381 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-oauth-config\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:09.904535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:09.904401 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-service-ca\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:09.904535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:09.904508 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv66m\" (UniqueName: \"kubernetes.io/projected/a7d7d434-3570-47ef-bdda-560a95a9a87b-kube-api-access-hv66m\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:09.904695 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:09.904566 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-oauth-serving-cert\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:09.904695 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:09.904610 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-serving-cert\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:09.904695 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:09.904657 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-config\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:10.005862 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:10.005777 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-oauth-config\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:10.005862 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:10.005820 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-service-ca\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:10.006077 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:10.005875 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hv66m\" (UniqueName: \"kubernetes.io/projected/a7d7d434-3570-47ef-bdda-560a95a9a87b-kube-api-access-hv66m\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:10.006077 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:10.005923 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-oauth-serving-cert\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:10.006077 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:10.005954 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-serving-cert\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:10.006077 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:10.005994 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-config\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:10.006077 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:10.006031 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-trusted-ca-bundle\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:10.006742 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:10.006711 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-service-ca\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:10.006877 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:10.006746 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-oauth-serving-cert\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:10.006949 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:10.006881 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-config\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:10.007003 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:10.006946 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-trusted-ca-bundle\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:10.008563 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:10.008539 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-serving-cert\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:10.008647 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:10.008570 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-oauth-config\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:10.014542 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:10.014522 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv66m\" (UniqueName: \"kubernetes.io/projected/a7d7d434-3570-47ef-bdda-560a95a9a87b-kube-api-access-hv66m\") pod \"console-f56f4f89b-6xgsz\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") " pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:10.191259 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:10.191219 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:10.507183 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:10.507153 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f56f4f89b-6xgsz"]
Apr 28 19:17:10.509983 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:17:10.509958 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7d7d434_3570_47ef_bdda_560a95a9a87b.slice/crio-cdce8df4f72c418e30abe8244420ac4ddea5089243bec0f653221dc903c14aae WatchSource:0}: Error finding container cdce8df4f72c418e30abe8244420ac4ddea5089243bec0f653221dc903c14aae: Status 404 returned error can't find the container with id cdce8df4f72c418e30abe8244420ac4ddea5089243bec0f653221dc903c14aae
Apr 28 19:17:11.258054 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:11.258019 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2" event={"ID":"3807a95d-ac8a-42a9-95bf-87514836c9be","Type":"ContainerStarted","Data":"f345df7a368af66c521e4c6c65c7dac37330a92e4140d2ae9284af0169fc2a54"}
Apr 28 19:17:11.259717 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:11.259688 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f56f4f89b-6xgsz" event={"ID":"a7d7d434-3570-47ef-bdda-560a95a9a87b","Type":"ContainerStarted","Data":"28e3a7c3c881af9cb18e6d980f048a45e7aa251d7b847ecb09328fbf8e4b3b85"}
Apr 28 19:17:11.259833 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:11.259725 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f56f4f89b-6xgsz" event={"ID":"a7d7d434-3570-47ef-bdda-560a95a9a87b","Type":"ContainerStarted","Data":"cdce8df4f72c418e30abe8244420ac4ddea5089243bec0f653221dc903c14aae"}
Apr 28 19:17:11.261318 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:11.261291 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wqhlw" event={"ID":"10b46dea-b1bd-4fe9-b096-d027eda0809d","Type":"ContainerStarted","Data":"825acbe865107e2e618a6b8304fdfed6a3e2ae6bbf6ffbee72bee695082e0830"}
Apr 28 19:17:11.261587 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:11.261521 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wqhlw"
Apr 28 19:17:11.264830 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:11.264806 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" event={"ID":"45277e70-f2c6-4e33-97af-9cb5f76dee0b","Type":"ContainerStarted","Data":"d173c0ba51f2cdbb5c69938303c1a80f6954776802e894f013671723e8d4efd3"}
Apr 28 19:17:11.264947 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:11.264836 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" event={"ID":"45277e70-f2c6-4e33-97af-9cb5f76dee0b","Type":"ContainerStarted","Data":"7d9cc5682d182462a5b07001634e4a1c70165d53b98e7b62b9eac5c67b9bc50f"}
Apr 28 19:17:11.264947 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:11.264851 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" event={"ID":"45277e70-f2c6-4e33-97af-9cb5f76dee0b","Type":"ContainerStarted","Data":"f4d0191b9c67266cd5cf3ebbe5f855d3c3c005546d2345f90f39eaf3e91a43c5"}
Apr 28 19:17:11.265193 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:11.265174 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n"
Apr 28 19:17:11.267843 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:11.267822 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wqhlw"
Apr 28 19:17:11.281509 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:11.281424 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f56f4f89b-6xgsz" podStartSLOduration=2.281409118 podStartE2EDuration="2.281409118s" podCreationTimestamp="2026-04-28 19:17:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:17:11.280329077 +0000 UTC m=+47.911230024" watchObservedRunningTime="2026-04-28 19:17:11.281409118 +0000 UTC m=+47.912310056"
Apr 28 19:17:11.306306 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:11.306259 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" podStartSLOduration=2.342861732 podStartE2EDuration="8.306247218s" podCreationTimestamp="2026-04-28 19:17:03 +0000 UTC" firstStartedPulling="2026-04-28 19:17:04.417976251 +0000 UTC m=+41.048877183" lastFinishedPulling="2026-04-28 19:17:10.381361738 +0000 UTC m=+47.012262669" observedRunningTime="2026-04-28 19:17:11.305731393 +0000 UTC m=+47.936632337" watchObservedRunningTime="2026-04-28 19:17:11.306247218 +0000 UTC m=+47.937148163"
Apr 28 19:17:11.328958 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:11.328911 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wqhlw" podStartSLOduration=3.985111996 podStartE2EDuration="6.328899445s" podCreationTimestamp="2026-04-28 19:17:05 +0000 UTC" firstStartedPulling="2026-04-28 19:17:08.03613189 +0000 UTC m=+44.667032813" lastFinishedPulling="2026-04-28 19:17:10.379919335 +0000 UTC m=+47.010820262" observedRunningTime="2026-04-28 19:17:11.328033182 +0000 UTC m=+47.958934127" watchObservedRunningTime="2026-04-28 19:17:11.328899445 +0000 UTC m=+47.959800449"
Apr 28 19:17:12.214786 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:12.214752 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bczfz"
Apr 28 19:17:12.270221 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:12.270093 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2" event={"ID":"3807a95d-ac8a-42a9-95bf-87514836c9be","Type":"ContainerStarted","Data":"1a4486b2225eb84c10c463dceb03c831a2eddd2e8e6f604d5bc4adfba2eb1af2"}
Apr 28 19:17:12.270221 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:12.270144 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2" event={"ID":"3807a95d-ac8a-42a9-95bf-87514836c9be","Type":"ContainerStarted","Data":"881cc7d69bb4483d9303c937c8344750c6f3725f572d09b19b2d490524968dcc"}
Apr 28 19:17:12.299079 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:12.299031 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6475644cb-ndpb2" podStartSLOduration=3.333127436 podStartE2EDuration="7.299015809s" podCreationTimestamp="2026-04-28 19:17:05 +0000 UTC" firstStartedPulling="2026-04-28 19:17:08.035746337 +0000 UTC m=+44.666647264" lastFinishedPulling="2026-04-28 19:17:12.001634713 +0000 UTC m=+48.632535637" observedRunningTime="2026-04-28 19:17:12.29855198 +0000 UTC m=+48.929452925" watchObservedRunningTime="2026-04-28 19:17:12.299015809 +0000 UTC m=+48.929916754"
Apr 28 19:17:13.615742 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.615705 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f56f4f89b-6xgsz"]
Apr 28 19:17:13.646750 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.646697 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-59bf4679d8-rx8t5"]
Apr 28 19:17:13.669276 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.669253 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59bf4679d8-rx8t5"]
Apr 28 19:17:13.669403 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.669362 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59bf4679d8-rx8t5"
Apr 28 19:17:13.739080 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.739052 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-service-ca\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5"
Apr 28 19:17:13.739209 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.739088 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-serving-cert\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5"
Apr 28 19:17:13.739209 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.739111 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-trusted-ca-bundle\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5"
Apr 28 19:17:13.739300 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.739203 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-config\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5"
Apr 28 19:17:13.739300 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.739279 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzmwt\" (UniqueName: \"kubernetes.io/projected/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-kube-api-access-tzmwt\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5"
Apr 28 19:17:13.739372 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.739310 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-oauth-config\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5"
Apr 28 19:17:13.739372 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.739344 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-oauth-serving-cert\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5"
Apr 28 19:17:13.839726 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.839695 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzmwt\" (UniqueName: \"kubernetes.io/projected/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-kube-api-access-tzmwt\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5"
Apr 28 19:17:13.839870 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.839732 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-oauth-config\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") "
pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:13.839936 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.839876 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-oauth-serving-cert\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:13.839985 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.839938 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-service-ca\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:13.839985 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.839968 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-serving-cert\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:13.840087 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.839992 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-trusted-ca-bundle\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:13.840087 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.840039 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-config\") pod 
\"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:13.840675 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.840647 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-oauth-serving-cert\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:13.840793 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.840692 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-service-ca\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:13.840793 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.840701 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-config\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:13.840911 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.840845 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-trusted-ca-bundle\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:13.842237 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.842221 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-oauth-config\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:13.842309 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.842259 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-serving-cert\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:13.847963 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.847942 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzmwt\" (UniqueName: \"kubernetes.io/projected/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-kube-api-access-tzmwt\") pod \"console-59bf4679d8-rx8t5\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") " pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:13.977974 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:13.977950 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:14.100467 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:14.100445 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59bf4679d8-rx8t5"] Apr 28 19:17:14.103201 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:17:14.103167 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod660e2ec9_2c7f_4ef1_9d90_f68454ab1ec1.slice/crio-213c33c2d1dd5c42ea94b1a690edc278972eed36dce4b0a22100cc396af9268e WatchSource:0}: Error finding container 213c33c2d1dd5c42ea94b1a690edc278972eed36dce4b0a22100cc396af9268e: Status 404 returned error can't find the container with id 213c33c2d1dd5c42ea94b1a690edc278972eed36dce4b0a22100cc396af9268e Apr 28 19:17:14.278455 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:14.278378 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59bf4679d8-rx8t5" event={"ID":"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1","Type":"ContainerStarted","Data":"c879be5b3e4b8cbfc908fa0c410589b4dfcb487500d5537e3576429b0679b590"} Apr 28 19:17:14.278455 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:14.278413 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59bf4679d8-rx8t5" event={"ID":"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1","Type":"ContainerStarted","Data":"213c33c2d1dd5c42ea94b1a690edc278972eed36dce4b0a22100cc396af9268e"} Apr 28 19:17:14.301544 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:14.301474 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59bf4679d8-rx8t5" podStartSLOduration=1.301459997 podStartE2EDuration="1.301459997s" podCreationTimestamp="2026-04-28 19:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:17:14.299550033 +0000 UTC 
m=+50.930450977" watchObservedRunningTime="2026-04-28 19:17:14.301459997 +0000 UTC m=+50.932360926" Apr 28 19:17:14.919319 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:14.919291 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:17.277110 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:17.277085 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-67c8d86c4f-d6x8n" Apr 28 19:17:20.191778 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:20.191741 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-f56f4f89b-6xgsz" Apr 28 19:17:23.165424 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:23.165392 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ppk4t" Apr 28 19:17:23.978653 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:23.978613 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:23.978899 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:23.978879 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:23.983583 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:23.983562 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:24.315052 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:24.314973 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59bf4679d8-rx8t5" Apr 28 19:17:29.573198 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:29.573163 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs\") pod \"network-metrics-daemon-zlvsf\" (UID: \"caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b\") " pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:17:29.575756 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:29.575729 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 28 19:17:29.585429 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:29.585410 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b-metrics-certs\") pod \"network-metrics-daemon-zlvsf\" (UID: \"caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b\") " pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:17:29.673807 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:29.673769 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pc96\" (UniqueName: \"kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96\") pod \"network-check-target-5qtkh\" (UID: \"e46a06a4-894f-4f3d-a446-b501af6e42eb\") " pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:17:29.676331 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:29.676311 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 28 19:17:29.687129 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:29.687109 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 28 19:17:29.698110 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:29.698080 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pc96\" (UniqueName: \"kubernetes.io/projected/e46a06a4-894f-4f3d-a446-b501af6e42eb-kube-api-access-6pc96\") pod 
\"network-check-target-5qtkh\" (UID: \"e46a06a4-894f-4f3d-a446-b501af6e42eb\") " pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:17:29.756624 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:29.756598 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-n7p66\"" Apr 28 19:17:29.759047 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:29.759022 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6mdm8\"" Apr 28 19:17:29.763551 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:29.763537 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:17:29.767207 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:29.767190 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlvsf" Apr 28 19:17:29.893756 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:29.893592 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5qtkh"] Apr 28 19:17:29.896398 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:17:29.896372 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode46a06a4_894f_4f3d_a446_b501af6e42eb.slice/crio-84fe636292726d2414f9184da59e5eb1bf8de112f40f4229968aad787585b75f WatchSource:0}: Error finding container 84fe636292726d2414f9184da59e5eb1bf8de112f40f4229968aad787585b75f: Status 404 returned error can't find the container with id 84fe636292726d2414f9184da59e5eb1bf8de112f40f4229968aad787585b75f Apr 28 19:17:29.907749 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:29.907725 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zlvsf"] Apr 28 19:17:29.910827 
ip-10-0-139-128 kubenswrapper[2571]: W0428 19:17:29.910803 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaa0f2a2_4dc9_4c44_b8b9_96f4cc8a695b.slice/crio-b7f6ac3f3eff390f923d1c78043335f645e379798d933084f407a9662aca31a3 WatchSource:0}: Error finding container b7f6ac3f3eff390f923d1c78043335f645e379798d933084f407a9662aca31a3: Status 404 returned error can't find the container with id b7f6ac3f3eff390f923d1c78043335f645e379798d933084f407a9662aca31a3 Apr 28 19:17:30.332740 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:30.332691 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5qtkh" event={"ID":"e46a06a4-894f-4f3d-a446-b501af6e42eb","Type":"ContainerStarted","Data":"84fe636292726d2414f9184da59e5eb1bf8de112f40f4229968aad787585b75f"} Apr 28 19:17:30.334023 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:30.333992 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zlvsf" event={"ID":"caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b","Type":"ContainerStarted","Data":"b7f6ac3f3eff390f923d1c78043335f645e379798d933084f407a9662aca31a3"} Apr 28 19:17:31.339310 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:31.339253 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zlvsf" event={"ID":"caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b","Type":"ContainerStarted","Data":"0733d74440535a48ae489963cc9f1a4b4bb49891e195d126a9b9e7f1c0fc1e7c"} Apr 28 19:17:31.339310 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:31.339305 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zlvsf" event={"ID":"caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b","Type":"ContainerStarted","Data":"c10de7447e1c5d5e15985a5662ca9ae8c08d03c0360b4391846961567ef48cd7"} Apr 28 19:17:31.356295 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:31.356224 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zlvsf" podStartSLOduration=66.262719533 podStartE2EDuration="1m7.356206401s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:17:29.912612444 +0000 UTC m=+66.543513371" lastFinishedPulling="2026-04-28 19:17:31.006099315 +0000 UTC m=+67.637000239" observedRunningTime="2026-04-28 19:17:31.35503298 +0000 UTC m=+67.985933925" watchObservedRunningTime="2026-04-28 19:17:31.356206401 +0000 UTC m=+67.987107347" Apr 28 19:17:33.347216 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:33.347122 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5qtkh" event={"ID":"e46a06a4-894f-4f3d-a446-b501af6e42eb","Type":"ContainerStarted","Data":"e6af50aed0563b7e4e6429dbec69328c47407d05d83f0331bd9ef373d91c851d"} Apr 28 19:17:33.347676 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:33.347255 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5qtkh" Apr 28 19:17:33.366176 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:33.365390 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5qtkh" podStartSLOduration=66.242784836 podStartE2EDuration="1m9.365372444s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:17:29.898357736 +0000 UTC m=+66.529258660" lastFinishedPulling="2026-04-28 19:17:33.020945343 +0000 UTC m=+69.651846268" observedRunningTime="2026-04-28 19:17:33.36332826 +0000 UTC m=+69.994229205" watchObservedRunningTime="2026-04-28 19:17:33.365372444 +0000 UTC m=+69.996273396" Apr 28 19:17:35.275457 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.275396 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c456b96cd-fdnz7" 
podUID="0eea4846-7972-4b6c-8b2a-58bce3c1e353" containerName="console" containerID="cri-o://9d05e82453a40189e68c02eb56a8d8af67bb2378746580820a6706be513b6360" gracePeriod=15 Apr 28 19:17:35.543149 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.543127 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c456b96cd-fdnz7_0eea4846-7972-4b6c-8b2a-58bce3c1e353/console/0.log" Apr 28 19:17:35.543267 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.543200 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c456b96cd-fdnz7" Apr 28 19:17:35.626168 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.626137 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-config\") pod \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " Apr 28 19:17:35.626168 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.626177 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-service-ca\") pod \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " Apr 28 19:17:35.626347 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.626205 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-serving-cert\") pod \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " Apr 28 19:17:35.626347 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.626229 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hllw5\" (UniqueName: 
\"kubernetes.io/projected/0eea4846-7972-4b6c-8b2a-58bce3c1e353-kube-api-access-hllw5\") pod \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " Apr 28 19:17:35.626347 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.626250 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-trusted-ca-bundle\") pod \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " Apr 28 19:17:35.626347 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.626307 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-oauth-serving-cert\") pod \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " Apr 28 19:17:35.626347 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.626345 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-oauth-config\") pod \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\" (UID: \"0eea4846-7972-4b6c-8b2a-58bce3c1e353\") " Apr 28 19:17:35.626762 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.626666 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-config" (OuterVolumeSpecName: "console-config") pod "0eea4846-7972-4b6c-8b2a-58bce3c1e353" (UID: "0eea4846-7972-4b6c-8b2a-58bce3c1e353"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:17:35.626872 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.626767 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0eea4846-7972-4b6c-8b2a-58bce3c1e353" (UID: "0eea4846-7972-4b6c-8b2a-58bce3c1e353"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:17:35.626872 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.626776 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0eea4846-7972-4b6c-8b2a-58bce3c1e353" (UID: "0eea4846-7972-4b6c-8b2a-58bce3c1e353"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:17:35.626954 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.626868 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-service-ca" (OuterVolumeSpecName: "service-ca") pod "0eea4846-7972-4b6c-8b2a-58bce3c1e353" (UID: "0eea4846-7972-4b6c-8b2a-58bce3c1e353"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:17:35.628548 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.628522 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eea4846-7972-4b6c-8b2a-58bce3c1e353-kube-api-access-hllw5" (OuterVolumeSpecName: "kube-api-access-hllw5") pod "0eea4846-7972-4b6c-8b2a-58bce3c1e353" (UID: "0eea4846-7972-4b6c-8b2a-58bce3c1e353"). InnerVolumeSpecName "kube-api-access-hllw5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:17:35.628623 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.628538 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0eea4846-7972-4b6c-8b2a-58bce3c1e353" (UID: "0eea4846-7972-4b6c-8b2a-58bce3c1e353"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:17:35.628661 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.628627 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0eea4846-7972-4b6c-8b2a-58bce3c1e353" (UID: "0eea4846-7972-4b6c-8b2a-58bce3c1e353"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:17:35.727044 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.727013 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-oauth-serving-cert\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:17:35.727044 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.727042 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-oauth-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:17:35.727203 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.727052 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:17:35.727203 ip-10-0-139-128 
kubenswrapper[2571]: I0428 19:17:35.727062 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-service-ca\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:17:35.727203 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.727071 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0eea4846-7972-4b6c-8b2a-58bce3c1e353-console-serving-cert\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:17:35.727203 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.727080 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hllw5\" (UniqueName: \"kubernetes.io/projected/0eea4846-7972-4b6c-8b2a-58bce3c1e353-kube-api-access-hllw5\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:17:35.727203 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:35.727088 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eea4846-7972-4b6c-8b2a-58bce3c1e353-trusted-ca-bundle\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:17:36.356775 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:36.356700 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c456b96cd-fdnz7_0eea4846-7972-4b6c-8b2a-58bce3c1e353/console/0.log"
Apr 28 19:17:36.356775 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:36.356739 2571 generic.go:358] "Generic (PLEG): container finished" podID="0eea4846-7972-4b6c-8b2a-58bce3c1e353" containerID="9d05e82453a40189e68c02eb56a8d8af67bb2378746580820a6706be513b6360" exitCode=2
Apr 28 19:17:36.357220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:36.356805 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c456b96cd-fdnz7"
Apr 28 19:17:36.357220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:36.356807 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c456b96cd-fdnz7" event={"ID":"0eea4846-7972-4b6c-8b2a-58bce3c1e353","Type":"ContainerDied","Data":"9d05e82453a40189e68c02eb56a8d8af67bb2378746580820a6706be513b6360"}
Apr 28 19:17:36.357220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:36.356926 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c456b96cd-fdnz7" event={"ID":"0eea4846-7972-4b6c-8b2a-58bce3c1e353","Type":"ContainerDied","Data":"643a043bb262e8669ecd7b74d2b077b7aac7bed215c72c5b2ae36d5d219d1d87"}
Apr 28 19:17:36.357220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:36.356956 2571 scope.go:117] "RemoveContainer" containerID="9d05e82453a40189e68c02eb56a8d8af67bb2378746580820a6706be513b6360"
Apr 28 19:17:36.365105 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:36.365086 2571 scope.go:117] "RemoveContainer" containerID="9d05e82453a40189e68c02eb56a8d8af67bb2378746580820a6706be513b6360"
Apr 28 19:17:36.365338 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:17:36.365321 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d05e82453a40189e68c02eb56a8d8af67bb2378746580820a6706be513b6360\": container with ID starting with 9d05e82453a40189e68c02eb56a8d8af67bb2378746580820a6706be513b6360 not found: ID does not exist" containerID="9d05e82453a40189e68c02eb56a8d8af67bb2378746580820a6706be513b6360"
Apr 28 19:17:36.365394 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:36.365346 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d05e82453a40189e68c02eb56a8d8af67bb2378746580820a6706be513b6360"} err="failed to get container status \"9d05e82453a40189e68c02eb56a8d8af67bb2378746580820a6706be513b6360\": rpc error: code = NotFound desc = could not find container \"9d05e82453a40189e68c02eb56a8d8af67bb2378746580820a6706be513b6360\": container with ID starting with 9d05e82453a40189e68c02eb56a8d8af67bb2378746580820a6706be513b6360 not found: ID does not exist"
Apr 28 19:17:36.377652 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:36.377626 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c456b96cd-fdnz7"]
Apr 28 19:17:36.386605 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:36.386581 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c456b96cd-fdnz7"]
Apr 28 19:17:37.946865 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:37.946823 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eea4846-7972-4b6c-8b2a-58bce3c1e353" path="/var/lib/kubelet/pods/0eea4846-7972-4b6c-8b2a-58bce3c1e353/volumes"
Apr 28 19:17:38.634239 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:38.634178 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-f56f4f89b-6xgsz" podUID="a7d7d434-3570-47ef-bdda-560a95a9a87b" containerName="console" containerID="cri-o://28e3a7c3c881af9cb18e6d980f048a45e7aa251d7b847ecb09328fbf8e4b3b85" gracePeriod=15
Apr 28 19:17:38.911748 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:38.911724 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f56f4f89b-6xgsz_a7d7d434-3570-47ef-bdda-560a95a9a87b/console/0.log"
Apr 28 19:17:38.911877 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:38.911797 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:39.055544 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.055510 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-config\") pod \"a7d7d434-3570-47ef-bdda-560a95a9a87b\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") "
Apr 28 19:17:39.056028 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.055558 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-oauth-config\") pod \"a7d7d434-3570-47ef-bdda-560a95a9a87b\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") "
Apr 28 19:17:39.056028 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.055594 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-serving-cert\") pod \"a7d7d434-3570-47ef-bdda-560a95a9a87b\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") "
Apr 28 19:17:39.056028 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.055627 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv66m\" (UniqueName: \"kubernetes.io/projected/a7d7d434-3570-47ef-bdda-560a95a9a87b-kube-api-access-hv66m\") pod \"a7d7d434-3570-47ef-bdda-560a95a9a87b\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") "
Apr 28 19:17:39.056028 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.055645 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-service-ca\") pod \"a7d7d434-3570-47ef-bdda-560a95a9a87b\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") "
Apr 28 19:17:39.056028 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.055665 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-oauth-serving-cert\") pod \"a7d7d434-3570-47ef-bdda-560a95a9a87b\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") "
Apr 28 19:17:39.056028 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.055714 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-trusted-ca-bundle\") pod \"a7d7d434-3570-47ef-bdda-560a95a9a87b\" (UID: \"a7d7d434-3570-47ef-bdda-560a95a9a87b\") "
Apr 28 19:17:39.056321 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.056045 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-config" (OuterVolumeSpecName: "console-config") pod "a7d7d434-3570-47ef-bdda-560a95a9a87b" (UID: "a7d7d434-3570-47ef-bdda-560a95a9a87b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:17:39.056321 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.056106 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-service-ca" (OuterVolumeSpecName: "service-ca") pod "a7d7d434-3570-47ef-bdda-560a95a9a87b" (UID: "a7d7d434-3570-47ef-bdda-560a95a9a87b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:17:39.056321 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.056120 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a7d7d434-3570-47ef-bdda-560a95a9a87b" (UID: "a7d7d434-3570-47ef-bdda-560a95a9a87b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:17:39.056465 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.056386 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a7d7d434-3570-47ef-bdda-560a95a9a87b" (UID: "a7d7d434-3570-47ef-bdda-560a95a9a87b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:17:39.057861 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.057831 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a7d7d434-3570-47ef-bdda-560a95a9a87b" (UID: "a7d7d434-3570-47ef-bdda-560a95a9a87b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:17:39.057977 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.057875 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d7d434-3570-47ef-bdda-560a95a9a87b-kube-api-access-hv66m" (OuterVolumeSpecName: "kube-api-access-hv66m") pod "a7d7d434-3570-47ef-bdda-560a95a9a87b" (UID: "a7d7d434-3570-47ef-bdda-560a95a9a87b"). InnerVolumeSpecName "kube-api-access-hv66m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:17:39.057977 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.057914 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a7d7d434-3570-47ef-bdda-560a95a9a87b" (UID: "a7d7d434-3570-47ef-bdda-560a95a9a87b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:17:39.156689 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.156661 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-trusted-ca-bundle\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:17:39.156689 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.156689 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:17:39.156861 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.156700 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-oauth-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:17:39.156861 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.156709 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7d7d434-3570-47ef-bdda-560a95a9a87b-console-serving-cert\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:17:39.156861 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.156718 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hv66m\" (UniqueName: \"kubernetes.io/projected/a7d7d434-3570-47ef-bdda-560a95a9a87b-kube-api-access-hv66m\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:17:39.156861 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.156727 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-service-ca\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:17:39.156861 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.156736 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7d7d434-3570-47ef-bdda-560a95a9a87b-oauth-serving-cert\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:17:39.367557 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.367425 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f56f4f89b-6xgsz_a7d7d434-3570-47ef-bdda-560a95a9a87b/console/0.log"
Apr 28 19:17:39.367557 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.367465 2571 generic.go:358] "Generic (PLEG): container finished" podID="a7d7d434-3570-47ef-bdda-560a95a9a87b" containerID="28e3a7c3c881af9cb18e6d980f048a45e7aa251d7b847ecb09328fbf8e4b3b85" exitCode=2
Apr 28 19:17:39.367557 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.367522 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f56f4f89b-6xgsz" event={"ID":"a7d7d434-3570-47ef-bdda-560a95a9a87b","Type":"ContainerDied","Data":"28e3a7c3c881af9cb18e6d980f048a45e7aa251d7b847ecb09328fbf8e4b3b85"}
Apr 28 19:17:39.367557 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.367545 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f56f4f89b-6xgsz"
Apr 28 19:17:39.367557 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.367561 2571 scope.go:117] "RemoveContainer" containerID="28e3a7c3c881af9cb18e6d980f048a45e7aa251d7b847ecb09328fbf8e4b3b85"
Apr 28 19:17:39.367881 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.367551 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f56f4f89b-6xgsz" event={"ID":"a7d7d434-3570-47ef-bdda-560a95a9a87b","Type":"ContainerDied","Data":"cdce8df4f72c418e30abe8244420ac4ddea5089243bec0f653221dc903c14aae"}
Apr 28 19:17:39.376996 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.376978 2571 scope.go:117] "RemoveContainer" containerID="28e3a7c3c881af9cb18e6d980f048a45e7aa251d7b847ecb09328fbf8e4b3b85"
Apr 28 19:17:39.377244 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:17:39.377223 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28e3a7c3c881af9cb18e6d980f048a45e7aa251d7b847ecb09328fbf8e4b3b85\": container with ID starting with 28e3a7c3c881af9cb18e6d980f048a45e7aa251d7b847ecb09328fbf8e4b3b85 not found: ID does not exist" containerID="28e3a7c3c881af9cb18e6d980f048a45e7aa251d7b847ecb09328fbf8e4b3b85"
Apr 28 19:17:39.377312 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.377253 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28e3a7c3c881af9cb18e6d980f048a45e7aa251d7b847ecb09328fbf8e4b3b85"} err="failed to get container status \"28e3a7c3c881af9cb18e6d980f048a45e7aa251d7b847ecb09328fbf8e4b3b85\": rpc error: code = NotFound desc = could not find container \"28e3a7c3c881af9cb18e6d980f048a45e7aa251d7b847ecb09328fbf8e4b3b85\": container with ID starting with 28e3a7c3c881af9cb18e6d980f048a45e7aa251d7b847ecb09328fbf8e4b3b85 not found: ID does not exist"
Apr 28 19:17:39.390212 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.390191 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f56f4f89b-6xgsz"]
Apr 28 19:17:39.394394 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.394374 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f56f4f89b-6xgsz"]
Apr 28 19:17:39.947599 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:17:39.947567 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7d7d434-3570-47ef-bdda-560a95a9a87b" path="/var/lib/kubelet/pods/a7d7d434-3570-47ef-bdda-560a95a9a87b/volumes"
Apr 28 19:18:02.777739 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.777708 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 28 19:18:02.778226 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.778023 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0eea4846-7972-4b6c-8b2a-58bce3c1e353" containerName="console"
Apr 28 19:18:02.778226 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.778035 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eea4846-7972-4b6c-8b2a-58bce3c1e353" containerName="console"
Apr 28 19:18:02.778226 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.778044 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7d7d434-3570-47ef-bdda-560a95a9a87b" containerName="console"
Apr 28 19:18:02.778226 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.778053 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d7d434-3570-47ef-bdda-560a95a9a87b" containerName="console"
Apr 28 19:18:02.778226 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.778102 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0eea4846-7972-4b6c-8b2a-58bce3c1e353" containerName="console"
Apr 28 19:18:02.778226 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.778109 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7d7d434-3570-47ef-bdda-560a95a9a87b" containerName="console"
Apr 28 19:18:02.813363 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.813333 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 28 19:18:02.813524 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.813476 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.815892 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.815867 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 28 19:18:02.815892 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.815880 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 28 19:18:02.816075 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.815883 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 28 19:18:02.816220 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.816207 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 28 19:18:02.816495 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.816453 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 28 19:18:02.816718 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.816700 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-2gnvw\""
Apr 28 19:18:02.816948 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.816930 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 28 19:18:02.816948 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.816937 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 28 19:18:02.816948 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.816953 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 28 19:18:02.819351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.819328 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-config-volume\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.819464 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.819366 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24e133e7-af81-4b03-9995-6c8082eeaf83-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.819464 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.819400 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24e133e7-af81-4b03-9995-6c8082eeaf83-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.819464 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.819434 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.819644 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.819469 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.819644 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.819536 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/24e133e7-af81-4b03-9995-6c8082eeaf83-config-out\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.819644 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.819597 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.819644 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.819633 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.819842 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.819658 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-web-config\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.819842 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.819728 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/24e133e7-af81-4b03-9995-6c8082eeaf83-tls-assets\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.819842 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.819757 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.819842 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.819788 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/24e133e7-af81-4b03-9995-6c8082eeaf83-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.820046 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.819852 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctcxp\" (UniqueName: \"kubernetes.io/projected/24e133e7-af81-4b03-9995-6c8082eeaf83-kube-api-access-ctcxp\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.821639 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.821618 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 28 19:18:02.921061 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.921026 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/24e133e7-af81-4b03-9995-6c8082eeaf83-config-out\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.921255 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.921071 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.921255 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.921094 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.921255 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.921114 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-web-config\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.921255 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.921142 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/24e133e7-af81-4b03-9995-6c8082eeaf83-tls-assets\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.921255 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.921166 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.921255 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.921193 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/24e133e7-af81-4b03-9995-6c8082eeaf83-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.921255 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.921225 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctcxp\" (UniqueName: \"kubernetes.io/projected/24e133e7-af81-4b03-9995-6c8082eeaf83-kube-api-access-ctcxp\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.921629 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.921265 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-config-volume\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.921629 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.921286 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24e133e7-af81-4b03-9995-6c8082eeaf83-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.921629 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.921315 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24e133e7-af81-4b03-9995-6c8082eeaf83-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.921629 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.921343 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.921629 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.921373 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.921860 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.921641 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/24e133e7-af81-4b03-9995-6c8082eeaf83-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.922818 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.922740 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24e133e7-af81-4b03-9995-6c8082eeaf83-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.922930 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.922901 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24e133e7-af81-4b03-9995-6c8082eeaf83-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.924117 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.924075 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/24e133e7-af81-4b03-9995-6c8082eeaf83-config-out\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.924219 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.924153 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.924219 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.924205 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.924633 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.924607 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.924796 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.924772 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.925375 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.925344 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-config-volume\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.925535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.925515 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/24e133e7-af81-4b03-9995-6c8082eeaf83-tls-assets\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.925585 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.925558 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-web-config\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.926135 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.926113 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/24e133e7-af81-4b03-9995-6c8082eeaf83-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:02.932594 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:02.932575 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctcxp\" (UniqueName: \"kubernetes.io/projected/24e133e7-af81-4b03-9995-6c8082eeaf83-kube-api-access-ctcxp\") pod \"alertmanager-main-0\" (UID: \"24e133e7-af81-4b03-9995-6c8082eeaf83\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:03.123044 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:03.122968 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:18:03.251046 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:03.250849 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 28 19:18:03.253783 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:18:03.253753 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24e133e7_af81_4b03_9995_6c8082eeaf83.slice/crio-5a846d4dcd6d5177e520838e318d5445415047f94ae6838d004de11a9036b02d WatchSource:0}: Error finding container 5a846d4dcd6d5177e520838e318d5445415047f94ae6838d004de11a9036b02d: Status 404 returned error can't find the container with id 5a846d4dcd6d5177e520838e318d5445415047f94ae6838d004de11a9036b02d
Apr 28 19:18:03.440450 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:03.440410 2571 generic.go:358] "Generic (PLEG): container finished" podID="24e133e7-af81-4b03-9995-6c8082eeaf83" containerID="850f2366af1982cecb3efc3946ddc97d4ab65aa89fce146b2dbef0f72706e0c3" exitCode=0
Apr 28 19:18:03.440621 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:03.440509 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"24e133e7-af81-4b03-9995-6c8082eeaf83","Type":"ContainerDied","Data":"850f2366af1982cecb3efc3946ddc97d4ab65aa89fce146b2dbef0f72706e0c3"}
Apr 28 19:18:03.440621 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:03.440539 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"24e133e7-af81-4b03-9995-6c8082eeaf83","Type":"ContainerStarted","Data":"5a846d4dcd6d5177e520838e318d5445415047f94ae6838d004de11a9036b02d"}
Apr 28 19:18:04.352990 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:04.352954 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5qtkh"
Apr 28
19:18:05.449275 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:05.449183 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"24e133e7-af81-4b03-9995-6c8082eeaf83","Type":"ContainerStarted","Data":"8dd7addbbf2f46890e4edcfc8807bd1250e3d5d54929086d490fa63fe39d3b5b"} Apr 28 19:18:05.449275 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:05.449226 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"24e133e7-af81-4b03-9995-6c8082eeaf83","Type":"ContainerStarted","Data":"f95aa915818449824c698968a70cf04fa542392e7bc63e8bd5f95c0ba815131b"} Apr 28 19:18:05.449275 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:05.449241 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"24e133e7-af81-4b03-9995-6c8082eeaf83","Type":"ContainerStarted","Data":"86b28e4a397ffbea5ec17f8b527a3881fb01c370a2a2da58248ab9feccb03b64"} Apr 28 19:18:05.449275 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:05.449256 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"24e133e7-af81-4b03-9995-6c8082eeaf83","Type":"ContainerStarted","Data":"ac8d418ba1213413032048d8d1575e27fca2b9de14bb95863df017326257098a"} Apr 28 19:18:05.449275 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:05.449268 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"24e133e7-af81-4b03-9995-6c8082eeaf83","Type":"ContainerStarted","Data":"c43c60b94552ab162f9eddd8c6df00a9a93d8b038ba4b54a44e7c14a575f66fe"} Apr 28 19:18:05.449275 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:05.449278 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"24e133e7-af81-4b03-9995-6c8082eeaf83","Type":"ContainerStarted","Data":"6f00995f7fba3de27043f1c1490ffb1954506dfbbc9097a01820e836cddd1d35"} Apr 28 19:18:05.483963 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:05.483904 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.9901275059999999 podStartE2EDuration="3.483887583s" podCreationTimestamp="2026-04-28 19:18:02 +0000 UTC" firstStartedPulling="2026-04-28 19:18:03.44167838 +0000 UTC m=+100.072579306" lastFinishedPulling="2026-04-28 19:18:04.935438457 +0000 UTC m=+101.566339383" observedRunningTime="2026-04-28 19:18:05.482344097 +0000 UTC m=+102.113245042" watchObservedRunningTime="2026-04-28 19:18:05.483887583 +0000 UTC m=+102.114788527" Apr 28 19:18:16.494447 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.494414 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-f6cf4b8cd-pjsq8"] Apr 28 19:18:16.498022 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.497996 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.510744 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.510719 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f6cf4b8cd-pjsq8"] Apr 28 19:18:16.539900 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.539871 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-config\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.540024 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.539934 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-oauth-config\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.540024 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.539991 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-oauth-serving-cert\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.540129 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.540026 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbxp8\" (UniqueName: \"kubernetes.io/projected/e5c32464-ec1f-40c0-90df-b98120f5f58a-kube-api-access-cbxp8\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 
19:18:16.540129 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.540050 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-trusted-ca-bundle\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.540129 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.540069 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-service-ca\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.540129 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.540116 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-serving-cert\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.641329 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.641292 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-oauth-config\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.641329 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.641332 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-oauth-serving-cert\") pod 
\"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.641590 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.641366 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbxp8\" (UniqueName: \"kubernetes.io/projected/e5c32464-ec1f-40c0-90df-b98120f5f58a-kube-api-access-cbxp8\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.641590 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.641392 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-trusted-ca-bundle\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.641590 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.641418 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-service-ca\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.641590 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.641456 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-serving-cert\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.641809 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.641590 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-config\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.642242 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.642214 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-oauth-serving-cert\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.642346 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.642277 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-service-ca\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.642409 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.642338 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-config\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.643466 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.643442 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-trusted-ca-bundle\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.643924 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.643896 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-serving-cert\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.644464 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.644447 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-oauth-config\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.650113 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.650095 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbxp8\" (UniqueName: \"kubernetes.io/projected/e5c32464-ec1f-40c0-90df-b98120f5f58a-kube-api-access-cbxp8\") pod \"console-f6cf4b8cd-pjsq8\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.808655 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.808577 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:16.929648 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:16.929592 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f6cf4b8cd-pjsq8"] Apr 28 19:18:16.931848 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:18:16.931821 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5c32464_ec1f_40c0_90df_b98120f5f58a.slice/crio-674fcf53cfb18f7916530f6be5351f202b73b8a877f0c060765d033c7e1a8f45 WatchSource:0}: Error finding container 674fcf53cfb18f7916530f6be5351f202b73b8a877f0c060765d033c7e1a8f45: Status 404 returned error can't find the container with id 674fcf53cfb18f7916530f6be5351f202b73b8a877f0c060765d033c7e1a8f45 Apr 28 19:18:17.501880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:17.501844 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f6cf4b8cd-pjsq8" event={"ID":"e5c32464-ec1f-40c0-90df-b98120f5f58a","Type":"ContainerStarted","Data":"454beecbfd823252768aa9e9409bc2f24a32e17ff3fe7ac0ab786a14c39135ce"} Apr 28 19:18:17.501880 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:17.501882 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f6cf4b8cd-pjsq8" event={"ID":"e5c32464-ec1f-40c0-90df-b98120f5f58a","Type":"ContainerStarted","Data":"674fcf53cfb18f7916530f6be5351f202b73b8a877f0c060765d033c7e1a8f45"} Apr 28 19:18:26.809713 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:26.809681 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:26.810080 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:26.809724 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:26.814449 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:26.814423 2571 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:26.837369 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:26.837324 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f6cf4b8cd-pjsq8" podStartSLOduration=10.837311469 podStartE2EDuration="10.837311469s" podCreationTimestamp="2026-04-28 19:18:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:18:17.523918103 +0000 UTC m=+114.154819047" watchObservedRunningTime="2026-04-28 19:18:26.837311469 +0000 UTC m=+123.468212413" Apr 28 19:18:27.535709 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:27.535679 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:18:27.595501 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:27.595449 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59bf4679d8-rx8t5"] Apr 28 19:18:32.417119 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:32.417081 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-jfmqj"] Apr 28 19:18:32.419950 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:32.419926 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-jfmqj" Apr 28 19:18:32.422144 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:32.422123 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 28 19:18:32.434398 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:32.434371 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jfmqj"] Apr 28 19:18:32.584787 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:32.584745 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb-original-pull-secret\") pod \"global-pull-secret-syncer-jfmqj\" (UID: \"7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb\") " pod="kube-system/global-pull-secret-syncer-jfmqj" Apr 28 19:18:32.584787 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:32.584789 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb-dbus\") pod \"global-pull-secret-syncer-jfmqj\" (UID: \"7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb\") " pod="kube-system/global-pull-secret-syncer-jfmqj" Apr 28 19:18:32.585003 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:32.584831 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb-kubelet-config\") pod \"global-pull-secret-syncer-jfmqj\" (UID: \"7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb\") " pod="kube-system/global-pull-secret-syncer-jfmqj" Apr 28 19:18:32.686042 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:32.685929 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb-dbus\") pod \"global-pull-secret-syncer-jfmqj\" (UID: \"7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb\") " pod="kube-system/global-pull-secret-syncer-jfmqj" Apr 28 19:18:32.686188 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:32.686084 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb-kubelet-config\") pod \"global-pull-secret-syncer-jfmqj\" (UID: \"7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb\") " pod="kube-system/global-pull-secret-syncer-jfmqj" Apr 28 19:18:32.686188 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:32.686135 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb-dbus\") pod \"global-pull-secret-syncer-jfmqj\" (UID: \"7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb\") " pod="kube-system/global-pull-secret-syncer-jfmqj" Apr 28 19:18:32.686268 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:32.686195 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb-kubelet-config\") pod \"global-pull-secret-syncer-jfmqj\" (UID: \"7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb\") " pod="kube-system/global-pull-secret-syncer-jfmqj" Apr 28 19:18:32.686268 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:32.686219 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb-original-pull-secret\") pod \"global-pull-secret-syncer-jfmqj\" (UID: \"7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb\") " pod="kube-system/global-pull-secret-syncer-jfmqj" Apr 28 19:18:32.688519 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:32.688493 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb-original-pull-secret\") pod \"global-pull-secret-syncer-jfmqj\" (UID: \"7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb\") " pod="kube-system/global-pull-secret-syncer-jfmqj" Apr 28 19:18:32.728694 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:32.728665 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jfmqj" Apr 28 19:18:32.848579 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:32.848541 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jfmqj"] Apr 28 19:18:32.851570 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:18:32.851547 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ea34c75_1b0f_4ec6_a805_8cf83f9d32fb.slice/crio-46abbf6a8e37c20df2f4887e1244596929e0e985efa3e2470d78a865c941fa38 WatchSource:0}: Error finding container 46abbf6a8e37c20df2f4887e1244596929e0e985efa3e2470d78a865c941fa38: Status 404 returned error can't find the container with id 46abbf6a8e37c20df2f4887e1244596929e0e985efa3e2470d78a865c941fa38 Apr 28 19:18:33.549213 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:33.549168 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jfmqj" event={"ID":"7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb","Type":"ContainerStarted","Data":"46abbf6a8e37c20df2f4887e1244596929e0e985efa3e2470d78a865c941fa38"} Apr 28 19:18:37.561929 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:37.561892 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jfmqj" event={"ID":"7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb","Type":"ContainerStarted","Data":"d9470467a07112f6110d3ea514106dbd54d057e50371bdb864b79ca537c777db"} Apr 28 19:18:37.578949 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:37.578879 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-jfmqj" podStartSLOduration=1.606141815 podStartE2EDuration="5.578861881s" podCreationTimestamp="2026-04-28 19:18:32 +0000 UTC" firstStartedPulling="2026-04-28 19:18:32.853328072 +0000 UTC m=+129.484228995" lastFinishedPulling="2026-04-28 19:18:36.826048133 +0000 UTC m=+133.456949061" observedRunningTime="2026-04-28 19:18:37.578320838 +0000 UTC m=+134.209221783" watchObservedRunningTime="2026-04-28 19:18:37.578861881 +0000 UTC m=+134.209762818" Apr 28 19:18:52.614445 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.614388 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-59bf4679d8-rx8t5" podUID="660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1" containerName="console" containerID="cri-o://c879be5b3e4b8cbfc908fa0c410589b4dfcb487500d5537e3576429b0679b590" gracePeriod=15 Apr 28 19:18:52.683411 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.683375 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2"] Apr 28 19:18:52.688217 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.688199 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2" Apr 28 19:18:52.693071 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.693044 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 28 19:18:52.693530 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.693508 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-95mtd\"" Apr 28 19:18:52.693634 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.693616 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 28 19:18:52.696799 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.696779 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2"] Apr 28 19:18:52.743185 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.743156 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsgls\" (UniqueName: \"kubernetes.io/projected/72778ec3-92aa-441f-836c-c404d2f75d8b-kube-api-access-vsgls\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2\" (UID: \"72778ec3-92aa-441f-836c-c404d2f75d8b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2" Apr 28 19:18:52.743306 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.743207 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72778ec3-92aa-441f-836c-c404d2f75d8b-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2\" (UID: \"72778ec3-92aa-441f-836c-c404d2f75d8b\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2" Apr 28 19:18:52.743306 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.743227 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72778ec3-92aa-441f-836c-c404d2f75d8b-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2\" (UID: \"72778ec3-92aa-441f-836c-c404d2f75d8b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2" Apr 28 19:18:52.844306 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.844272 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72778ec3-92aa-441f-836c-c404d2f75d8b-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2\" (UID: \"72778ec3-92aa-441f-836c-c404d2f75d8b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2" Apr 28 19:18:52.844468 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.844314 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72778ec3-92aa-441f-836c-c404d2f75d8b-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2\" (UID: \"72778ec3-92aa-441f-836c-c404d2f75d8b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2" Apr 28 19:18:52.844468 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.844378 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsgls\" (UniqueName: \"kubernetes.io/projected/72778ec3-92aa-441f-836c-c404d2f75d8b-kube-api-access-vsgls\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2\" (UID: \"72778ec3-92aa-441f-836c-c404d2f75d8b\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2"
Apr 28 19:18:52.844749 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.844724 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72778ec3-92aa-441f-836c-c404d2f75d8b-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2\" (UID: \"72778ec3-92aa-441f-836c-c404d2f75d8b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2"
Apr 28 19:18:52.844819 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.844750 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72778ec3-92aa-441f-836c-c404d2f75d8b-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2\" (UID: \"72778ec3-92aa-441f-836c-c404d2f75d8b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2"
Apr 28 19:18:52.855943 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.855915 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsgls\" (UniqueName: \"kubernetes.io/projected/72778ec3-92aa-441f-836c-c404d2f75d8b-kube-api-access-vsgls\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2\" (UID: \"72778ec3-92aa-441f-836c-c404d2f75d8b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2"
Apr 28 19:18:52.868709 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.868662 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59bf4679d8-rx8t5_660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1/console/0.log"
Apr 28 19:18:52.868796 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.868718 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59bf4679d8-rx8t5"
Apr 28 19:18:52.945204 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.945174 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-serving-cert\") pod \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") "
Apr 28 19:18:52.945368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.945219 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzmwt\" (UniqueName: \"kubernetes.io/projected/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-kube-api-access-tzmwt\") pod \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") "
Apr 28 19:18:52.945368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.945243 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-service-ca\") pod \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") "
Apr 28 19:18:52.945368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.945353 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-oauth-config\") pod \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") "
Apr 28 19:18:52.945568 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.945396 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-trusted-ca-bundle\") pod \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") "
Apr 28 19:18:52.945568 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.945423 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-oauth-serving-cert\") pod \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") "
Apr 28 19:18:52.945568 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.945454 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-config\") pod \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\" (UID: \"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1\") "
Apr 28 19:18:52.945716 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.945611 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-service-ca" (OuterVolumeSpecName: "service-ca") pod "660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1" (UID: "660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:18:52.945977 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.945779 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-service-ca\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:18:52.945977 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.945792 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1" (UID: "660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:18:52.945977 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.945925 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1" (UID: "660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:18:52.945977 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.945965 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-config" (OuterVolumeSpecName: "console-config") pod "660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1" (UID: "660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:18:52.947588 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.947558 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1" (UID: "660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:18:52.947681 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.947628 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-kube-api-access-tzmwt" (OuterVolumeSpecName: "kube-api-access-tzmwt") pod "660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1" (UID: "660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1"). InnerVolumeSpecName "kube-api-access-tzmwt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:18:52.947681 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.947672 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1" (UID: "660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:18:52.997433 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:52.997401 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2"
Apr 28 19:18:53.046752 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.046715 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-trusted-ca-bundle\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:18:53.046752 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.046752 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-oauth-serving-cert\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:18:53.046943 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.046770 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:18:53.046943 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.046785 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-serving-cert\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:18:53.046943 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.046802 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tzmwt\" (UniqueName: \"kubernetes.io/projected/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-kube-api-access-tzmwt\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:18:53.046943 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.046816 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1-console-oauth-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:18:53.116355 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.116326 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2"]
Apr 28 19:18:53.119195 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:18:53.119139 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72778ec3_92aa_441f_836c_c404d2f75d8b.slice/crio-79b11579a6ee267041030d4694da1f906fe8cf5b3f6dc111a25a16d6d09ecc76 WatchSource:0}: Error finding container 79b11579a6ee267041030d4694da1f906fe8cf5b3f6dc111a25a16d6d09ecc76: Status 404 returned error can't find the container with id 79b11579a6ee267041030d4694da1f906fe8cf5b3f6dc111a25a16d6d09ecc76
Apr 28 19:18:53.607956 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.607918 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2" event={"ID":"72778ec3-92aa-441f-836c-c404d2f75d8b","Type":"ContainerStarted","Data":"79b11579a6ee267041030d4694da1f906fe8cf5b3f6dc111a25a16d6d09ecc76"}
Apr 28 19:18:53.609110 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.609090 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59bf4679d8-rx8t5_660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1/console/0.log"
Apr 28 19:18:53.609206 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.609126 2571 generic.go:358] "Generic (PLEG): container finished" podID="660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1" containerID="c879be5b3e4b8cbfc908fa0c410589b4dfcb487500d5537e3576429b0679b590" exitCode=2
Apr 28 19:18:53.609206 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.609163 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59bf4679d8-rx8t5" event={"ID":"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1","Type":"ContainerDied","Data":"c879be5b3e4b8cbfc908fa0c410589b4dfcb487500d5537e3576429b0679b590"}
Apr 28 19:18:53.609206 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.609194 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59bf4679d8-rx8t5" event={"ID":"660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1","Type":"ContainerDied","Data":"213c33c2d1dd5c42ea94b1a690edc278972eed36dce4b0a22100cc396af9268e"}
Apr 28 19:18:53.609319 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.609208 2571 scope.go:117] "RemoveContainer" containerID="c879be5b3e4b8cbfc908fa0c410589b4dfcb487500d5537e3576429b0679b590"
Apr 28 19:18:53.609319 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.609213 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59bf4679d8-rx8t5"
Apr 28 19:18:53.618303 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.617956 2571 scope.go:117] "RemoveContainer" containerID="c879be5b3e4b8cbfc908fa0c410589b4dfcb487500d5537e3576429b0679b590"
Apr 28 19:18:53.618303 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:18:53.618280 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c879be5b3e4b8cbfc908fa0c410589b4dfcb487500d5537e3576429b0679b590\": container with ID starting with c879be5b3e4b8cbfc908fa0c410589b4dfcb487500d5537e3576429b0679b590 not found: ID does not exist" containerID="c879be5b3e4b8cbfc908fa0c410589b4dfcb487500d5537e3576429b0679b590"
Apr 28 19:18:53.618600 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.618305 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c879be5b3e4b8cbfc908fa0c410589b4dfcb487500d5537e3576429b0679b590"} err="failed to get container status \"c879be5b3e4b8cbfc908fa0c410589b4dfcb487500d5537e3576429b0679b590\": rpc error: code = NotFound desc = could not find container \"c879be5b3e4b8cbfc908fa0c410589b4dfcb487500d5537e3576429b0679b590\": container with ID starting with c879be5b3e4b8cbfc908fa0c410589b4dfcb487500d5537e3576429b0679b590 not found: ID does not exist"
Apr 28 19:18:53.640605 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.640576 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59bf4679d8-rx8t5"]
Apr 28 19:18:53.650342 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.650318 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-59bf4679d8-rx8t5"]
Apr 28 19:18:53.946145 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:18:53.946104 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1" path="/var/lib/kubelet/pods/660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1/volumes"
Apr 28 19:19:00.632040 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:00.632004 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2" event={"ID":"72778ec3-92aa-441f-836c-c404d2f75d8b","Type":"ContainerStarted","Data":"1c4deac42cf703b21a4734d4f6ea27a33518e8b9a9d33e6f9986480066ba31f2"}
Apr 28 19:19:01.637098 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:01.637066 2571 generic.go:358] "Generic (PLEG): container finished" podID="72778ec3-92aa-441f-836c-c404d2f75d8b" containerID="1c4deac42cf703b21a4734d4f6ea27a33518e8b9a9d33e6f9986480066ba31f2" exitCode=0
Apr 28 19:19:01.637468 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:01.637154 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2" event={"ID":"72778ec3-92aa-441f-836c-c404d2f75d8b","Type":"ContainerDied","Data":"1c4deac42cf703b21a4734d4f6ea27a33518e8b9a9d33e6f9986480066ba31f2"}
Apr 28 19:19:02.279370 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.279339 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b49f59d9-v2fb8"]
Apr 28 19:19:02.279677 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.279664 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1" containerName="console"
Apr 28 19:19:02.279721 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.279679 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1" containerName="console"
Apr 28 19:19:02.279752 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.279738 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="660e2ec9-2c7f-4ef1-9d90-f68454ab1ec1" containerName="console"
Apr 28 19:19:02.282510 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.282494 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b49f59d9-v2fb8"
Apr 28 19:19:02.284899 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.284868 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 28 19:19:02.285049 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.284959 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 28 19:19:02.285683 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.285664 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 28 19:19:02.285803 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.285716 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 28 19:19:02.285876 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.285809 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-cf7z2\""
Apr 28 19:19:02.287118 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.287095 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54"]
Apr 28 19:19:02.290273 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.290253 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54"
Apr 28 19:19:02.292730 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.292713 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 28 19:19:02.294881 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.294859 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b49f59d9-v2fb8"]
Apr 28 19:19:02.299658 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.299639 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54"]
Apr 28 19:19:02.321145 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.321114 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/06876314-10fe-4aa2-988c-88da33e0cf63-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-79b49f59d9-v2fb8\" (UID: \"06876314-10fe-4aa2-988c-88da33e0cf63\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b49f59d9-v2fb8"
Apr 28 19:19:02.321256 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.321212 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2kfz\" (UniqueName: \"kubernetes.io/projected/06876314-10fe-4aa2-988c-88da33e0cf63-kube-api-access-t2kfz\") pod \"managed-serviceaccount-addon-agent-79b49f59d9-v2fb8\" (UID: \"06876314-10fe-4aa2-988c-88da33e0cf63\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b49f59d9-v2fb8"
Apr 28 19:19:02.325430 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.325409 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"]
Apr 28 19:19:02.328796 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.328782 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.332189 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.332170 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 28 19:19:02.332296 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.332210 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 28 19:19:02.332296 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.332214 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 28 19:19:02.332773 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.332758 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 28 19:19:02.344440 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.344340 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"]
Apr 28 19:19:02.422114 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.422081 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/18c18cbe-5358-4ab6-a3b3-b30e9cdae286-tmp\") pod \"klusterlet-addon-workmgr-5c85945db-6hs54\" (UID: \"18c18cbe-5358-4ab6-a3b3-b30e9cdae286\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54"
Apr 28 19:19:02.422114 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.422119 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-767h6\" (UniqueName: \"kubernetes.io/projected/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-kube-api-access-767h6\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.422312 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.422151 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.422312 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.422175 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.422312 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.422197 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/06876314-10fe-4aa2-988c-88da33e0cf63-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-79b49f59d9-v2fb8\" (UID: \"06876314-10fe-4aa2-988c-88da33e0cf63\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b49f59d9-v2fb8"
Apr 28 19:19:02.422312 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.422226 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-hub\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.422312 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.422260 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-ca\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.422537 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.422316 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2kfz\" (UniqueName: \"kubernetes.io/projected/06876314-10fe-4aa2-988c-88da33e0cf63-kube-api-access-t2kfz\") pod \"managed-serviceaccount-addon-agent-79b49f59d9-v2fb8\" (UID: \"06876314-10fe-4aa2-988c-88da33e0cf63\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b49f59d9-v2fb8"
Apr 28 19:19:02.422537 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.422344 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/18c18cbe-5358-4ab6-a3b3-b30e9cdae286-klusterlet-config\") pod \"klusterlet-addon-workmgr-5c85945db-6hs54\" (UID: \"18c18cbe-5358-4ab6-a3b3-b30e9cdae286\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54"
Apr 28 19:19:02.422537 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.422365 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgjrc\" (UniqueName: \"kubernetes.io/projected/18c18cbe-5358-4ab6-a3b3-b30e9cdae286-kube-api-access-bgjrc\") pod \"klusterlet-addon-workmgr-5c85945db-6hs54\" (UID: \"18c18cbe-5358-4ab6-a3b3-b30e9cdae286\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54"
Apr 28 19:19:02.422537 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.422391 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.424762 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.424744 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/06876314-10fe-4aa2-988c-88da33e0cf63-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-79b49f59d9-v2fb8\" (UID: \"06876314-10fe-4aa2-988c-88da33e0cf63\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b49f59d9-v2fb8"
Apr 28 19:19:02.433390 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.433366 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2kfz\" (UniqueName: \"kubernetes.io/projected/06876314-10fe-4aa2-988c-88da33e0cf63-kube-api-access-t2kfz\") pod \"managed-serviceaccount-addon-agent-79b49f59d9-v2fb8\" (UID: \"06876314-10fe-4aa2-988c-88da33e0cf63\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b49f59d9-v2fb8"
Apr 28 19:19:02.523184 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.523149 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-ca\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.523359 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.523196 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/18c18cbe-5358-4ab6-a3b3-b30e9cdae286-klusterlet-config\") pod \"klusterlet-addon-workmgr-5c85945db-6hs54\" (UID: \"18c18cbe-5358-4ab6-a3b3-b30e9cdae286\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54"
Apr 28 19:19:02.523359 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.523216 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgjrc\" (UniqueName: \"kubernetes.io/projected/18c18cbe-5358-4ab6-a3b3-b30e9cdae286-kube-api-access-bgjrc\") pod \"klusterlet-addon-workmgr-5c85945db-6hs54\" (UID: \"18c18cbe-5358-4ab6-a3b3-b30e9cdae286\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54"
Apr 28 19:19:02.523359 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.523238 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.523359 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.523268 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/18c18cbe-5358-4ab6-a3b3-b30e9cdae286-tmp\") pod \"klusterlet-addon-workmgr-5c85945db-6hs54\" (UID: \"18c18cbe-5358-4ab6-a3b3-b30e9cdae286\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54"
Apr 28 19:19:02.523359 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.523284 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-767h6\" (UniqueName: \"kubernetes.io/projected/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-kube-api-access-767h6\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.523359 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.523320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.523359 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.523358 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.523725 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.523392 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-hub\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.523786 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.523752 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/18c18cbe-5358-4ab6-a3b3-b30e9cdae286-tmp\") pod \"klusterlet-addon-workmgr-5c85945db-6hs54\" (UID: \"18c18cbe-5358-4ab6-a3b3-b30e9cdae286\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54"
Apr 28 19:19:02.524254 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.524226 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.525881 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.525849 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.525991 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.525946 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/18c18cbe-5358-4ab6-a3b3-b30e9cdae286-klusterlet-config\") pod \"klusterlet-addon-workmgr-5c85945db-6hs54\" (UID: \"18c18cbe-5358-4ab6-a3b3-b30e9cdae286\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54"
Apr 28 19:19:02.526091 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.526066 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.526158 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.526138 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-hub\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.526228 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.526205 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-ca\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.533064 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.533004 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-767h6\" (UniqueName: \"kubernetes.io/projected/1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1-kube-api-access-767h6\") pod \"cluster-proxy-proxy-agent-78f9fd9949-87kvl\" (UID: \"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.533336 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.533307 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgjrc\" (UniqueName: \"kubernetes.io/projected/18c18cbe-5358-4ab6-a3b3-b30e9cdae286-kube-api-access-bgjrc\") pod \"klusterlet-addon-workmgr-5c85945db-6hs54\" (UID: \"18c18cbe-5358-4ab6-a3b3-b30e9cdae286\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54"
Apr 28 19:19:02.604283 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.604252 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b49f59d9-v2fb8"
Apr 28 19:19:02.612103 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.612075 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54"
Apr 28 19:19:02.637166 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.637120 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"
Apr 28 19:19:02.768777 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.768746 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b49f59d9-v2fb8"]
Apr 28 19:19:02.771362 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:19:02.771331 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06876314_10fe_4aa2_988c_88da33e0cf63.slice/crio-a40ef29c3db85507481aae4a94cb63af48b54854711b6593b9c11e5cf313d13e WatchSource:0}: Error finding container a40ef29c3db85507481aae4a94cb63af48b54854711b6593b9c11e5cf313d13e: Status 404 returned error can't find the container with id a40ef29c3db85507481aae4a94cb63af48b54854711b6593b9c11e5cf313d13e
Apr 28 19:19:02.802967 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.802937 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54"]
Apr 28 19:19:02.805917 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:19:02.805886 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18c18cbe_5358_4ab6_a3b3_b30e9cdae286.slice/crio-eb2d558bec3a5a1f4998a7d41a6cf73c8826961d2c01f23ac325f1b855897bab WatchSource:0}: Error finding container eb2d558bec3a5a1f4998a7d41a6cf73c8826961d2c01f23ac325f1b855897bab: Status 404 returned error can't find the container with id eb2d558bec3a5a1f4998a7d41a6cf73c8826961d2c01f23ac325f1b855897bab
Apr 28 19:19:02.822536 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:02.822510 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl"]
Apr 28 19:19:02.824672 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:19:02.824632 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c7c4f48_5ec3_4c12_9c9b_8d253678b9c1.slice/crio-fb786db1cdd9e13c4ae00d7282c5d71b7b8079c3840aa9b387f74bb3ff8ec388 WatchSource:0}: Error finding container fb786db1cdd9e13c4ae00d7282c5d71b7b8079c3840aa9b387f74bb3ff8ec388: Status 404 returned error can't find the container with id fb786db1cdd9e13c4ae00d7282c5d71b7b8079c3840aa9b387f74bb3ff8ec388
Apr 28 19:19:03.645292 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:03.645254 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl" event={"ID":"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1","Type":"ContainerStarted","Data":"fb786db1cdd9e13c4ae00d7282c5d71b7b8079c3840aa9b387f74bb3ff8ec388"}
Apr 28 19:19:03.646540 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:03.646510 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b49f59d9-v2fb8" event={"ID":"06876314-10fe-4aa2-988c-88da33e0cf63","Type":"ContainerStarted","Data":"a40ef29c3db85507481aae4a94cb63af48b54854711b6593b9c11e5cf313d13e"}
Apr 28 19:19:03.647655 ip-10-0-139-128
kubenswrapper[2571]: I0428 19:19:03.647617 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54" event={"ID":"18c18cbe-5358-4ab6-a3b3-b30e9cdae286","Type":"ContainerStarted","Data":"eb2d558bec3a5a1f4998a7d41a6cf73c8826961d2c01f23ac325f1b855897bab"} Apr 28 19:19:04.658445 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:04.658406 2571 generic.go:358] "Generic (PLEG): container finished" podID="72778ec3-92aa-441f-836c-c404d2f75d8b" containerID="e8edd1cf626741e54941e78b9fdf62c9b63393e5fb7c0fc34cd179b92e3fa1b6" exitCode=0 Apr 28 19:19:04.658931 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:04.658459 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2" event={"ID":"72778ec3-92aa-441f-836c-c404d2f75d8b","Type":"ContainerDied","Data":"e8edd1cf626741e54941e78b9fdf62c9b63393e5fb7c0fc34cd179b92e3fa1b6"} Apr 28 19:19:09.678312 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:09.678268 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl" event={"ID":"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1","Type":"ContainerStarted","Data":"caa006cdb249883526f6412f599619e3954d51d34f2608f2647004fc4d45bc6c"} Apr 28 19:19:09.679991 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:09.679945 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b49f59d9-v2fb8" event={"ID":"06876314-10fe-4aa2-988c-88da33e0cf63","Type":"ContainerStarted","Data":"cf4d87cbc4bec542764bed9f6efac3ad62793df1d45dd8d1250e4ffa0116036f"} Apr 28 19:19:09.681516 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:09.681465 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54" 
event={"ID":"18c18cbe-5358-4ab6-a3b3-b30e9cdae286","Type":"ContainerStarted","Data":"682e807f2c99cf46b6343d31b2f14068e1c1347b09303b678b1c67a83a47c637"} Apr 28 19:19:09.681691 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:09.681672 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54" Apr 28 19:19:09.683764 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:09.683741 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54" Apr 28 19:19:09.699356 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:09.699310 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b49f59d9-v2fb8" podStartSLOduration=1.5193081529999999 podStartE2EDuration="7.699296576s" podCreationTimestamp="2026-04-28 19:19:02 +0000 UTC" firstStartedPulling="2026-04-28 19:19:02.774006242 +0000 UTC m=+159.404907166" lastFinishedPulling="2026-04-28 19:19:08.953994651 +0000 UTC m=+165.584895589" observedRunningTime="2026-04-28 19:19:09.697915521 +0000 UTC m=+166.328816469" watchObservedRunningTime="2026-04-28 19:19:09.699296576 +0000 UTC m=+166.330197520" Apr 28 19:19:09.713755 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:09.713707 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c85945db-6hs54" podStartSLOduration=1.551442113 podStartE2EDuration="7.7136899s" podCreationTimestamp="2026-04-28 19:19:02 +0000 UTC" firstStartedPulling="2026-04-28 19:19:02.808247578 +0000 UTC m=+159.439148502" lastFinishedPulling="2026-04-28 19:19:08.970495349 +0000 UTC m=+165.601396289" observedRunningTime="2026-04-28 19:19:09.713290802 +0000 UTC m=+166.344191747" watchObservedRunningTime="2026-04-28 19:19:09.7136899 +0000 UTC 
m=+166.344590846" Apr 28 19:19:14.704727 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:14.704689 2571 generic.go:358] "Generic (PLEG): container finished" podID="72778ec3-92aa-441f-836c-c404d2f75d8b" containerID="d8c37a861b57fcd082e8e4c82f0123bad4d236675b97874ddb6500b697fa2488" exitCode=0 Apr 28 19:19:14.705195 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:14.704777 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2" event={"ID":"72778ec3-92aa-441f-836c-c404d2f75d8b","Type":"ContainerDied","Data":"d8c37a861b57fcd082e8e4c82f0123bad4d236675b97874ddb6500b697fa2488"} Apr 28 19:19:14.706585 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:14.706562 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl" event={"ID":"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1","Type":"ContainerStarted","Data":"d0069995b0518b37ce455ad46f4d1ed56c5a312aada4c3a67a061b5ca77bf67e"} Apr 28 19:19:14.706696 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:14.706590 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl" event={"ID":"1c7c4f48-5ec3-4c12-9c9b-8d253678b9c1","Type":"ContainerStarted","Data":"e50a4a9001bee3f6e0c1478d3af7de6144ca2a8285ffb780d1fc2bb0dc318f38"} Apr 28 19:19:14.745571 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:14.745515 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f9fd9949-87kvl" podStartSLOduration=1.94539425 podStartE2EDuration="12.745500053s" podCreationTimestamp="2026-04-28 19:19:02 +0000 UTC" firstStartedPulling="2026-04-28 19:19:02.826549966 +0000 UTC m=+159.457450906" lastFinishedPulling="2026-04-28 19:19:13.626655785 +0000 UTC m=+170.257556709" observedRunningTime="2026-04-28 19:19:14.743668529 +0000 UTC 
m=+171.374569473" watchObservedRunningTime="2026-04-28 19:19:14.745500053 +0000 UTC m=+171.376400993" Apr 28 19:19:15.831951 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:15.831927 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2" Apr 28 19:19:15.948390 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:15.948361 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72778ec3-92aa-441f-836c-c404d2f75d8b-util\") pod \"72778ec3-92aa-441f-836c-c404d2f75d8b\" (UID: \"72778ec3-92aa-441f-836c-c404d2f75d8b\") " Apr 28 19:19:15.948546 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:15.948408 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsgls\" (UniqueName: \"kubernetes.io/projected/72778ec3-92aa-441f-836c-c404d2f75d8b-kube-api-access-vsgls\") pod \"72778ec3-92aa-441f-836c-c404d2f75d8b\" (UID: \"72778ec3-92aa-441f-836c-c404d2f75d8b\") " Apr 28 19:19:15.948546 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:15.948448 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72778ec3-92aa-441f-836c-c404d2f75d8b-bundle\") pod \"72778ec3-92aa-441f-836c-c404d2f75d8b\" (UID: \"72778ec3-92aa-441f-836c-c404d2f75d8b\") " Apr 28 19:19:15.949151 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:15.949121 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72778ec3-92aa-441f-836c-c404d2f75d8b-bundle" (OuterVolumeSpecName: "bundle") pod "72778ec3-92aa-441f-836c-c404d2f75d8b" (UID: "72778ec3-92aa-441f-836c-c404d2f75d8b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:19:15.950660 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:15.950632 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72778ec3-92aa-441f-836c-c404d2f75d8b-kube-api-access-vsgls" (OuterVolumeSpecName: "kube-api-access-vsgls") pod "72778ec3-92aa-441f-836c-c404d2f75d8b" (UID: "72778ec3-92aa-441f-836c-c404d2f75d8b"). InnerVolumeSpecName "kube-api-access-vsgls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:19:15.953531 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:15.953512 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72778ec3-92aa-441f-836c-c404d2f75d8b-util" (OuterVolumeSpecName: "util") pod "72778ec3-92aa-441f-836c-c404d2f75d8b" (UID: "72778ec3-92aa-441f-836c-c404d2f75d8b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:19:16.049322 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:16.049243 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/72778ec3-92aa-441f-836c-c404d2f75d8b-bundle\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:19:16.049322 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:16.049273 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/72778ec3-92aa-441f-836c-c404d2f75d8b-util\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:19:16.049322 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:16.049283 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vsgls\" (UniqueName: \"kubernetes.io/projected/72778ec3-92aa-441f-836c-c404d2f75d8b-kube-api-access-vsgls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:19:16.714057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:16.714028 2571 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2" Apr 28 19:19:16.714057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:16.714036 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cw2mq2" event={"ID":"72778ec3-92aa-441f-836c-c404d2f75d8b","Type":"ContainerDied","Data":"79b11579a6ee267041030d4694da1f906fe8cf5b3f6dc111a25a16d6d09ecc76"} Apr 28 19:19:16.714251 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:19:16.714067 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79b11579a6ee267041030d4694da1f906fe8cf5b3f6dc111a25a16d6d09ecc76" Apr 28 19:21:23.832015 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:21:23.831985 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:21:23.832913 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:21:23.832790 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:21:23.838166 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:21:23.838149 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 28 19:22:00.158190 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.158159 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-v2jtv"] Apr 28 19:22:00.160648 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.158660 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72778ec3-92aa-441f-836c-c404d2f75d8b" containerName="util" Apr 28 19:22:00.160648 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.158675 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="72778ec3-92aa-441f-836c-c404d2f75d8b" containerName="util" Apr 28 
19:22:00.160648 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.158684 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72778ec3-92aa-441f-836c-c404d2f75d8b" containerName="pull" Apr 28 19:22:00.160648 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.158689 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="72778ec3-92aa-441f-836c-c404d2f75d8b" containerName="pull" Apr 28 19:22:00.160648 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.158697 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72778ec3-92aa-441f-836c-c404d2f75d8b" containerName="extract" Apr 28 19:22:00.160648 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.158702 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="72778ec3-92aa-441f-836c-c404d2f75d8b" containerName="extract" Apr 28 19:22:00.160648 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.158758 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="72778ec3-92aa-441f-836c-c404d2f75d8b" containerName="extract" Apr 28 19:22:00.161427 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.161408 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-v2jtv" Apr 28 19:22:00.163666 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.163646 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 28 19:22:00.163799 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.163648 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 28 19:22:00.164136 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.164118 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 28 19:22:00.164199 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.164121 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-hdspq\"" Apr 28 19:22:00.168871 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.168852 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-v2jtv"] Apr 28 19:22:00.300225 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.300189 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzt5d\" (UniqueName: \"kubernetes.io/projected/9739fd02-2f4c-4bcc-ac96-e5b981305b49-kube-api-access-tzt5d\") pod \"s3-init-v2jtv\" (UID: \"9739fd02-2f4c-4bcc-ac96-e5b981305b49\") " pod="kserve/s3-init-v2jtv" Apr 28 19:22:00.401009 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.400977 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzt5d\" (UniqueName: \"kubernetes.io/projected/9739fd02-2f4c-4bcc-ac96-e5b981305b49-kube-api-access-tzt5d\") pod \"s3-init-v2jtv\" (UID: \"9739fd02-2f4c-4bcc-ac96-e5b981305b49\") " pod="kserve/s3-init-v2jtv" Apr 28 19:22:00.409643 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.409593 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzt5d\" 
(UniqueName: \"kubernetes.io/projected/9739fd02-2f4c-4bcc-ac96-e5b981305b49-kube-api-access-tzt5d\") pod \"s3-init-v2jtv\" (UID: \"9739fd02-2f4c-4bcc-ac96-e5b981305b49\") " pod="kserve/s3-init-v2jtv" Apr 28 19:22:00.471179 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.471144 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-v2jtv" Apr 28 19:22:00.590403 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.590248 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-v2jtv"] Apr 28 19:22:00.593155 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:22:00.593126 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9739fd02_2f4c_4bcc_ac96_e5b981305b49.slice/crio-194c1f6eb79b7b6bd5adf8cd7d4f9427042d8d0dd57656bc88cdd99722412a27 WatchSource:0}: Error finding container 194c1f6eb79b7b6bd5adf8cd7d4f9427042d8d0dd57656bc88cdd99722412a27: Status 404 returned error can't find the container with id 194c1f6eb79b7b6bd5adf8cd7d4f9427042d8d0dd57656bc88cdd99722412a27 Apr 28 19:22:00.594965 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:00.594949 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:22:01.190762 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:01.190700 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-v2jtv" event={"ID":"9739fd02-2f4c-4bcc-ac96-e5b981305b49","Type":"ContainerStarted","Data":"194c1f6eb79b7b6bd5adf8cd7d4f9427042d8d0dd57656bc88cdd99722412a27"} Apr 28 19:22:06.205961 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:06.205923 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-v2jtv" event={"ID":"9739fd02-2f4c-4bcc-ac96-e5b981305b49","Type":"ContainerStarted","Data":"171f76f35d9b53cc2417b5d6d6fae27756213e58e6ce2593215284102393e214"} Apr 28 19:22:06.230640 ip-10-0-139-128 kubenswrapper[2571]: I0428 
19:22:06.230589 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-v2jtv" podStartSLOduration=1.635550582 podStartE2EDuration="6.230573566s" podCreationTimestamp="2026-04-28 19:22:00 +0000 UTC" firstStartedPulling="2026-04-28 19:22:00.595074815 +0000 UTC m=+337.225975738" lastFinishedPulling="2026-04-28 19:22:05.190097798 +0000 UTC m=+341.820998722" observedRunningTime="2026-04-28 19:22:06.22895427 +0000 UTC m=+342.859855214" watchObservedRunningTime="2026-04-28 19:22:06.230573566 +0000 UTC m=+342.861474510" Apr 28 19:22:09.218179 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:09.218146 2571 generic.go:358] "Generic (PLEG): container finished" podID="9739fd02-2f4c-4bcc-ac96-e5b981305b49" containerID="171f76f35d9b53cc2417b5d6d6fae27756213e58e6ce2593215284102393e214" exitCode=0 Apr 28 19:22:09.218593 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:09.218218 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-v2jtv" event={"ID":"9739fd02-2f4c-4bcc-ac96-e5b981305b49","Type":"ContainerDied","Data":"171f76f35d9b53cc2417b5d6d6fae27756213e58e6ce2593215284102393e214"} Apr 28 19:22:10.343136 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:10.343115 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-v2jtv" Apr 28 19:22:10.387781 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:10.387753 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzt5d\" (UniqueName: \"kubernetes.io/projected/9739fd02-2f4c-4bcc-ac96-e5b981305b49-kube-api-access-tzt5d\") pod \"9739fd02-2f4c-4bcc-ac96-e5b981305b49\" (UID: \"9739fd02-2f4c-4bcc-ac96-e5b981305b49\") " Apr 28 19:22:10.389823 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:10.389797 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9739fd02-2f4c-4bcc-ac96-e5b981305b49-kube-api-access-tzt5d" (OuterVolumeSpecName: "kube-api-access-tzt5d") pod "9739fd02-2f4c-4bcc-ac96-e5b981305b49" (UID: "9739fd02-2f4c-4bcc-ac96-e5b981305b49"). InnerVolumeSpecName "kube-api-access-tzt5d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:22:10.489088 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:10.489010 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tzt5d\" (UniqueName: \"kubernetes.io/projected/9739fd02-2f4c-4bcc-ac96-e5b981305b49-kube-api-access-tzt5d\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:22:11.225609 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:11.225571 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-v2jtv" event={"ID":"9739fd02-2f4c-4bcc-ac96-e5b981305b49","Type":"ContainerDied","Data":"194c1f6eb79b7b6bd5adf8cd7d4f9427042d8d0dd57656bc88cdd99722412a27"} Apr 28 19:22:11.225609 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:11.225612 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="194c1f6eb79b7b6bd5adf8cd7d4f9427042d8d0dd57656bc88cdd99722412a27" Apr 28 19:22:11.225816 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:11.225583 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-v2jtv" Apr 28 19:22:33.145643 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:33.145610 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f6cf4b8cd-pjsq8"] Apr 28 19:22:47.823141 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:47.823102 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-7hrw9"] Apr 28 19:22:47.823535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:47.823424 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9739fd02-2f4c-4bcc-ac96-e5b981305b49" containerName="s3-init" Apr 28 19:22:47.823535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:47.823435 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="9739fd02-2f4c-4bcc-ac96-e5b981305b49" containerName="s3-init" Apr 28 19:22:47.823535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:47.823501 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="9739fd02-2f4c-4bcc-ac96-e5b981305b49" containerName="s3-init" Apr 28 19:22:47.825422 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:47.825404 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-7hrw9" Apr 28 19:22:47.827687 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:47.827666 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 28 19:22:47.828316 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:47.828295 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 28 19:22:47.828375 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:47.828317 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-hdspq\"" Apr 28 19:22:47.828375 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:47.828321 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 28 19:22:47.833545 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:47.833526 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-7hrw9"] Apr 28 19:22:47.889605 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:47.889571 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6296\" (UniqueName: \"kubernetes.io/projected/d3655183-0aab-4af9-8351-e667cb4da8b0-kube-api-access-b6296\") pod \"s3-tls-init-custom-7hrw9\" (UID: \"d3655183-0aab-4af9-8351-e667cb4da8b0\") " pod="kserve/s3-tls-init-custom-7hrw9" Apr 28 19:22:47.990134 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:47.990107 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6296\" (UniqueName: \"kubernetes.io/projected/d3655183-0aab-4af9-8351-e667cb4da8b0-kube-api-access-b6296\") pod \"s3-tls-init-custom-7hrw9\" (UID: \"d3655183-0aab-4af9-8351-e667cb4da8b0\") " pod="kserve/s3-tls-init-custom-7hrw9" Apr 28 19:22:47.998759 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:47.998724 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6296\" (UniqueName: \"kubernetes.io/projected/d3655183-0aab-4af9-8351-e667cb4da8b0-kube-api-access-b6296\") pod \"s3-tls-init-custom-7hrw9\" (UID: \"d3655183-0aab-4af9-8351-e667cb4da8b0\") " pod="kserve/s3-tls-init-custom-7hrw9" Apr 28 19:22:48.135471 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:48.135384 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-7hrw9" Apr 28 19:22:48.261321 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:48.261290 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-7hrw9"] Apr 28 19:22:48.264256 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:22:48.264229 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3655183_0aab_4af9_8351_e667cb4da8b0.slice/crio-1e94f3a16a77d422dd83860539eb17f0232d5ebfeed94c20924fa3aca4c22364 WatchSource:0}: Error finding container 1e94f3a16a77d422dd83860539eb17f0232d5ebfeed94c20924fa3aca4c22364: Status 404 returned error can't find the container with id 1e94f3a16a77d422dd83860539eb17f0232d5ebfeed94c20924fa3aca4c22364 Apr 28 19:22:48.329578 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:48.329549 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-7hrw9" event={"ID":"d3655183-0aab-4af9-8351-e667cb4da8b0","Type":"ContainerStarted","Data":"f5f0429560e578c9d12662dd1ebfee5ebc2eec1b9021f58d63783fa07b3cf846"} Apr 28 19:22:48.329687 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:48.329587 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-7hrw9" event={"ID":"d3655183-0aab-4af9-8351-e667cb4da8b0","Type":"ContainerStarted","Data":"1e94f3a16a77d422dd83860539eb17f0232d5ebfeed94c20924fa3aca4c22364"} Apr 28 19:22:48.345066 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:48.345020 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-7hrw9" podStartSLOduration=1.345004681 podStartE2EDuration="1.345004681s" podCreationTimestamp="2026-04-28 19:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:22:48.344523282 +0000 UTC m=+384.975424226" watchObservedRunningTime="2026-04-28 19:22:48.345004681 +0000 UTC m=+384.975905623" Apr 28 19:22:53.347195 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:53.347161 2571 generic.go:358] "Generic (PLEG): container finished" podID="d3655183-0aab-4af9-8351-e667cb4da8b0" containerID="f5f0429560e578c9d12662dd1ebfee5ebc2eec1b9021f58d63783fa07b3cf846" exitCode=0 Apr 28 19:22:53.347679 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:53.347223 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-7hrw9" event={"ID":"d3655183-0aab-4af9-8351-e667cb4da8b0","Type":"ContainerDied","Data":"f5f0429560e578c9d12662dd1ebfee5ebc2eec1b9021f58d63783fa07b3cf846"} Apr 28 19:22:54.470843 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:54.470821 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-7hrw9" Apr 28 19:22:54.547647 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:54.547615 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6296\" (UniqueName: \"kubernetes.io/projected/d3655183-0aab-4af9-8351-e667cb4da8b0-kube-api-access-b6296\") pod \"d3655183-0aab-4af9-8351-e667cb4da8b0\" (UID: \"d3655183-0aab-4af9-8351-e667cb4da8b0\") " Apr 28 19:22:54.549669 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:54.549645 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3655183-0aab-4af9-8351-e667cb4da8b0-kube-api-access-b6296" (OuterVolumeSpecName: "kube-api-access-b6296") pod "d3655183-0aab-4af9-8351-e667cb4da8b0" (UID: "d3655183-0aab-4af9-8351-e667cb4da8b0"). InnerVolumeSpecName "kube-api-access-b6296". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:22:54.649076 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:54.648974 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b6296\" (UniqueName: \"kubernetes.io/projected/d3655183-0aab-4af9-8351-e667cb4da8b0-kube-api-access-b6296\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:22:55.354027 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:55.353992 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-7hrw9" event={"ID":"d3655183-0aab-4af9-8351-e667cb4da8b0","Type":"ContainerDied","Data":"1e94f3a16a77d422dd83860539eb17f0232d5ebfeed94c20924fa3aca4c22364"} Apr 28 19:22:55.354027 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:55.354029 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e94f3a16a77d422dd83860539eb17f0232d5ebfeed94c20924fa3aca4c22364" Apr 28 19:22:55.354216 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:55.354003 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-7hrw9" Apr 28 19:22:58.165955 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.165912 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-f6cf4b8cd-pjsq8" podUID="e5c32464-ec1f-40c0-90df-b98120f5f58a" containerName="console" containerID="cri-o://454beecbfd823252768aa9e9409bc2f24a32e17ff3fe7ac0ab786a14c39135ce" gracePeriod=15 Apr 28 19:22:58.364572 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.364545 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f6cf4b8cd-pjsq8_e5c32464-ec1f-40c0-90df-b98120f5f58a/console/0.log" Apr 28 19:22:58.364713 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.364583 2571 generic.go:358] "Generic (PLEG): container finished" podID="e5c32464-ec1f-40c0-90df-b98120f5f58a" containerID="454beecbfd823252768aa9e9409bc2f24a32e17ff3fe7ac0ab786a14c39135ce" exitCode=2 Apr 28 19:22:58.364713 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.364635 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f6cf4b8cd-pjsq8" event={"ID":"e5c32464-ec1f-40c0-90df-b98120f5f58a","Type":"ContainerDied","Data":"454beecbfd823252768aa9e9409bc2f24a32e17ff3fe7ac0ab786a14c39135ce"} Apr 28 19:22:58.401393 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.401372 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f6cf4b8cd-pjsq8_e5c32464-ec1f-40c0-90df-b98120f5f58a/console/0.log" Apr 28 19:22:58.401543 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.401433 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:22:58.578806 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.578706 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-oauth-config\") pod \"e5c32464-ec1f-40c0-90df-b98120f5f58a\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " Apr 28 19:22:58.578806 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.578769 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-config\") pod \"e5c32464-ec1f-40c0-90df-b98120f5f58a\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " Apr 28 19:22:58.578806 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.578790 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-service-ca\") pod \"e5c32464-ec1f-40c0-90df-b98120f5f58a\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " Apr 28 19:22:58.579065 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.578812 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-serving-cert\") pod \"e5c32464-ec1f-40c0-90df-b98120f5f58a\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " Apr 28 19:22:58.579065 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.578878 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbxp8\" (UniqueName: \"kubernetes.io/projected/e5c32464-ec1f-40c0-90df-b98120f5f58a-kube-api-access-cbxp8\") pod \"e5c32464-ec1f-40c0-90df-b98120f5f58a\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " Apr 28 19:22:58.579065 
ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.578904 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-trusted-ca-bundle\") pod \"e5c32464-ec1f-40c0-90df-b98120f5f58a\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " Apr 28 19:22:58.579065 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.578941 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-oauth-serving-cert\") pod \"e5c32464-ec1f-40c0-90df-b98120f5f58a\" (UID: \"e5c32464-ec1f-40c0-90df-b98120f5f58a\") " Apr 28 19:22:58.579255 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.579213 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-service-ca" (OuterVolumeSpecName: "service-ca") pod "e5c32464-ec1f-40c0-90df-b98120f5f58a" (UID: "e5c32464-ec1f-40c0-90df-b98120f5f58a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:22:58.579368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.579335 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e5c32464-ec1f-40c0-90df-b98120f5f58a" (UID: "e5c32464-ec1f-40c0-90df-b98120f5f58a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:22:58.579440 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.579406 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-config" (OuterVolumeSpecName: "console-config") pod "e5c32464-ec1f-40c0-90df-b98120f5f58a" (UID: "e5c32464-ec1f-40c0-90df-b98120f5f58a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:22:58.579863 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.579832 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e5c32464-ec1f-40c0-90df-b98120f5f58a" (UID: "e5c32464-ec1f-40c0-90df-b98120f5f58a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:22:58.581168 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.581138 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e5c32464-ec1f-40c0-90df-b98120f5f58a" (UID: "e5c32464-ec1f-40c0-90df-b98120f5f58a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:22:58.581258 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.581162 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5c32464-ec1f-40c0-90df-b98120f5f58a-kube-api-access-cbxp8" (OuterVolumeSpecName: "kube-api-access-cbxp8") pod "e5c32464-ec1f-40c0-90df-b98120f5f58a" (UID: "e5c32464-ec1f-40c0-90df-b98120f5f58a"). InnerVolumeSpecName "kube-api-access-cbxp8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:22:58.581258 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.581194 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e5c32464-ec1f-40c0-90df-b98120f5f58a" (UID: "e5c32464-ec1f-40c0-90df-b98120f5f58a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:22:58.679633 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.679602 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cbxp8\" (UniqueName: \"kubernetes.io/projected/e5c32464-ec1f-40c0-90df-b98120f5f58a-kube-api-access-cbxp8\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:22:58.679633 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.679627 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-trusted-ca-bundle\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:22:58.679633 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.679637 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-oauth-serving-cert\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:22:58.679835 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.679645 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-oauth-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:22:58.679835 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.679655 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:22:58.679835 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.679664 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5c32464-ec1f-40c0-90df-b98120f5f58a-service-ca\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:22:58.679835 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:58.679672 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5c32464-ec1f-40c0-90df-b98120f5f58a-console-serving-cert\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:22:59.369087 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:59.369063 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f6cf4b8cd-pjsq8_e5c32464-ec1f-40c0-90df-b98120f5f58a/console/0.log" Apr 28 19:22:59.369504 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:59.369160 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f6cf4b8cd-pjsq8" event={"ID":"e5c32464-ec1f-40c0-90df-b98120f5f58a","Type":"ContainerDied","Data":"674fcf53cfb18f7916530f6be5351f202b73b8a877f0c060765d033c7e1a8f45"} Apr 28 19:22:59.369504 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:59.369173 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f6cf4b8cd-pjsq8" Apr 28 19:22:59.369504 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:59.369194 2571 scope.go:117] "RemoveContainer" containerID="454beecbfd823252768aa9e9409bc2f24a32e17ff3fe7ac0ab786a14c39135ce" Apr 28 19:22:59.391662 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:59.391635 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f6cf4b8cd-pjsq8"] Apr 28 19:22:59.395120 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:59.395097 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f6cf4b8cd-pjsq8"] Apr 28 19:22:59.947184 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:22:59.947147 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5c32464-ec1f-40c0-90df-b98120f5f58a" path="/var/lib/kubelet/pods/e5c32464-ec1f-40c0-90df-b98120f5f58a/volumes" Apr 28 19:23:18.522644 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.522609 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52"] Apr 28 19:23:18.523094 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.522961 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3655183-0aab-4af9-8351-e667cb4da8b0" containerName="s3-tls-init-custom" Apr 28 19:23:18.523094 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.522973 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3655183-0aab-4af9-8351-e667cb4da8b0" containerName="s3-tls-init-custom" Apr 28 19:23:18.523094 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.522990 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5c32464-ec1f-40c0-90df-b98120f5f58a" containerName="console" Apr 28 19:23:18.523094 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.522995 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c32464-ec1f-40c0-90df-b98120f5f58a" 
containerName="console" Apr 28 19:23:18.523094 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.523051 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5c32464-ec1f-40c0-90df-b98120f5f58a" containerName="console" Apr 28 19:23:18.523094 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.523063 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3655183-0aab-4af9-8351-e667cb4da8b0" containerName="s3-tls-init-custom" Apr 28 19:23:18.526259 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.526241 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:23:18.529592 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.529569 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\"" Apr 28 19:23:18.529883 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.529862 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 28 19:23:18.529986 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.529917 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-predictor-serving-cert\"" Apr 28 19:23:18.529986 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.529941 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 28 19:23:18.530103 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.530006 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-44jch\"" Apr 28 19:23:18.545849 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.545826 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52"] Apr 28 19:23:18.636945 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.636902 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvftd\" (UniqueName: \"kubernetes.io/projected/38e23382-eb2e-4fec-b84b-7cc873df2741-kube-api-access-tvftd\") pod \"isvc-sklearn-batcher-predictor-57fcff47c9-zrm52\" (UID: \"38e23382-eb2e-4fec-b84b-7cc873df2741\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:23:18.637123 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.637021 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38e23382-eb2e-4fec-b84b-7cc873df2741-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-57fcff47c9-zrm52\" (UID: \"38e23382-eb2e-4fec-b84b-7cc873df2741\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:23:18.637123 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.637053 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38e23382-eb2e-4fec-b84b-7cc873df2741-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-57fcff47c9-zrm52\" (UID: \"38e23382-eb2e-4fec-b84b-7cc873df2741\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:23:18.637264 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.637120 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38e23382-eb2e-4fec-b84b-7cc873df2741-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-57fcff47c9-zrm52\" (UID: \"38e23382-eb2e-4fec-b84b-7cc873df2741\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:23:18.737934 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.737884 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvftd\" (UniqueName: \"kubernetes.io/projected/38e23382-eb2e-4fec-b84b-7cc873df2741-kube-api-access-tvftd\") pod \"isvc-sklearn-batcher-predictor-57fcff47c9-zrm52\" (UID: \"38e23382-eb2e-4fec-b84b-7cc873df2741\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:23:18.738138 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.737966 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38e23382-eb2e-4fec-b84b-7cc873df2741-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-57fcff47c9-zrm52\" (UID: \"38e23382-eb2e-4fec-b84b-7cc873df2741\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:23:18.738138 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.737986 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38e23382-eb2e-4fec-b84b-7cc873df2741-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-57fcff47c9-zrm52\" (UID: \"38e23382-eb2e-4fec-b84b-7cc873df2741\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:23:18.738138 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.738020 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38e23382-eb2e-4fec-b84b-7cc873df2741-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-57fcff47c9-zrm52\" (UID: \"38e23382-eb2e-4fec-b84b-7cc873df2741\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 
19:23:18.738401 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.738379 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38e23382-eb2e-4fec-b84b-7cc873df2741-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-57fcff47c9-zrm52\" (UID: \"38e23382-eb2e-4fec-b84b-7cc873df2741\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:23:18.738651 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.738635 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38e23382-eb2e-4fec-b84b-7cc873df2741-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-57fcff47c9-zrm52\" (UID: \"38e23382-eb2e-4fec-b84b-7cc873df2741\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:23:18.740357 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.740335 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38e23382-eb2e-4fec-b84b-7cc873df2741-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-57fcff47c9-zrm52\" (UID: \"38e23382-eb2e-4fec-b84b-7cc873df2741\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:23:18.746047 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.746014 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvftd\" (UniqueName: \"kubernetes.io/projected/38e23382-eb2e-4fec-b84b-7cc873df2741-kube-api-access-tvftd\") pod \"isvc-sklearn-batcher-predictor-57fcff47c9-zrm52\" (UID: \"38e23382-eb2e-4fec-b84b-7cc873df2741\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:23:18.836552 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.836441 2571 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:23:18.959430 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:18.959405 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52"] Apr 28 19:23:18.962038 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:23:18.962009 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38e23382_eb2e_4fec_b84b_7cc873df2741.slice/crio-5f2a3db30af6718a90a4f9da07fa045a73148d85391ec6fa0f6879cc862c0eec WatchSource:0}: Error finding container 5f2a3db30af6718a90a4f9da07fa045a73148d85391ec6fa0f6879cc862c0eec: Status 404 returned error can't find the container with id 5f2a3db30af6718a90a4f9da07fa045a73148d85391ec6fa0f6879cc862c0eec Apr 28 19:23:19.430256 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:19.430219 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" event={"ID":"38e23382-eb2e-4fec-b84b-7cc873df2741","Type":"ContainerStarted","Data":"5f2a3db30af6718a90a4f9da07fa045a73148d85391ec6fa0f6879cc862c0eec"} Apr 28 19:23:24.449734 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:24.449692 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" event={"ID":"38e23382-eb2e-4fec-b84b-7cc873df2741","Type":"ContainerStarted","Data":"750e04bcfc0e8ebb040b514bc25e1dbe04b2ab13bf8b91421d62cb903175e0b0"} Apr 28 19:23:27.460470 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:27.460436 2571 generic.go:358] "Generic (PLEG): container finished" podID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerID="750e04bcfc0e8ebb040b514bc25e1dbe04b2ab13bf8b91421d62cb903175e0b0" exitCode=0 Apr 28 19:23:27.460844 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:27.460515 2571 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" event={"ID":"38e23382-eb2e-4fec-b84b-7cc873df2741","Type":"ContainerDied","Data":"750e04bcfc0e8ebb040b514bc25e1dbe04b2ab13bf8b91421d62cb903175e0b0"} Apr 28 19:23:40.515141 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:40.515059 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" event={"ID":"38e23382-eb2e-4fec-b84b-7cc873df2741","Type":"ContainerStarted","Data":"b591db3bffb302cfc4675f486491e88f5f49298990078cdcb0e212a88b6fdfbb"} Apr 28 19:23:43.527625 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:43.527585 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" event={"ID":"38e23382-eb2e-4fec-b84b-7cc873df2741","Type":"ContainerStarted","Data":"16252b8977d11b869896a63ab3d209f74c1bcba1ff9fe732f85d263df9a224a4"} Apr 28 19:23:46.539259 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:46.539195 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" event={"ID":"38e23382-eb2e-4fec-b84b-7cc873df2741","Type":"ContainerStarted","Data":"6e09d901d5fd2d824c47627e8315a2f26bdea21f66946c07bdbed81ecb7aefe0"} Apr 28 19:23:46.539659 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:46.539431 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:23:46.561708 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:46.561656 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podStartSLOduration=1.7821628980000002 podStartE2EDuration="28.561642598s" podCreationTimestamp="2026-04-28 19:23:18 +0000 UTC" firstStartedPulling="2026-04-28 19:23:18.96385848 +0000 UTC m=+415.594759407" 
lastFinishedPulling="2026-04-28 19:23:45.743338185 +0000 UTC m=+442.374239107" observedRunningTime="2026-04-28 19:23:46.560157072 +0000 UTC m=+443.191058018" watchObservedRunningTime="2026-04-28 19:23:46.561642598 +0000 UTC m=+443.192543543" Apr 28 19:23:47.546128 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:47.546092 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:23:47.546557 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:47.546145 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:23:47.546638 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:47.546543 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 28 19:23:47.547266 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:47.547221 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:23:47.550332 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:47.550315 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:23:48.546281 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:48.546238 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.25:8080: connect: connection refused" Apr 28 19:23:48.546713 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:48.546507 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:23:49.549464 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:49.549425 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 28 19:23:49.549954 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:49.549832 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:23:59.549500 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:59.549437 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 28 19:23:59.549910 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:23:59.549880 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:24:09.549599 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:24:09.549551 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 28 19:24:09.550096 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:24:09.550020 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:24:19.550180 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:24:19.550136 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 28 19:24:19.550788 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:24:19.550690 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:24:29.550363 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:24:29.550314 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 28 19:24:29.550865 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:24:29.550839 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="agent" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 28 19:24:39.549598 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:24:39.549555 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 28 19:24:39.550008 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:24:39.549989 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:24:49.550187 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:24:49.550146 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:24:49.550633 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:24:49.550301 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:25:04.131234 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.131201 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52"] Apr 28 19:25:04.131663 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.131578 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kserve-container" containerID="cri-o://b591db3bffb302cfc4675f486491e88f5f49298990078cdcb0e212a88b6fdfbb" gracePeriod=30 Apr 28 19:25:04.131663 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.131604 2571 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="agent" containerID="cri-o://6e09d901d5fd2d824c47627e8315a2f26bdea21f66946c07bdbed81ecb7aefe0" gracePeriod=30 Apr 28 19:25:04.131757 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.131605 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kube-rbac-proxy" containerID="cri-o://16252b8977d11b869896a63ab3d209f74c1bcba1ff9fe732f85d263df9a224a4" gracePeriod=30 Apr 28 19:25:04.437152 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.437123 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz"] Apr 28 19:25:04.440657 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.440639 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:04.444169 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.444147 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\"" Apr 28 19:25:04.444272 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.444189 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-predictor-serving-cert\"" Apr 28 19:25:04.458831 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.458808 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz"] Apr 28 19:25:04.515554 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.515510 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p8fk\" (UniqueName: \"kubernetes.io/projected/4577fbb5-3556-446e-a088-0e3d5239f4ce-kube-api-access-5p8fk\") pod \"isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz\" (UID: \"4577fbb5-3556-446e-a088-0e3d5239f4ce\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:04.515723 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.515591 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4577fbb5-3556-446e-a088-0e3d5239f4ce-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz\" (UID: \"4577fbb5-3556-446e-a088-0e3d5239f4ce\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:04.515723 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.515619 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4577fbb5-3556-446e-a088-0e3d5239f4ce-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz\" (UID: \"4577fbb5-3556-446e-a088-0e3d5239f4ce\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:04.515723 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.515700 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4577fbb5-3556-446e-a088-0e3d5239f4ce-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz\" (UID: \"4577fbb5-3556-446e-a088-0e3d5239f4ce\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:04.616562 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.616514 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4577fbb5-3556-446e-a088-0e3d5239f4ce-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz\" (UID: \"4577fbb5-3556-446e-a088-0e3d5239f4ce\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:04.616736 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.616582 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p8fk\" (UniqueName: \"kubernetes.io/projected/4577fbb5-3556-446e-a088-0e3d5239f4ce-kube-api-access-5p8fk\") pod \"isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz\" (UID: \"4577fbb5-3556-446e-a088-0e3d5239f4ce\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:04.616736 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.616624 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4577fbb5-3556-446e-a088-0e3d5239f4ce-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz\" (UID: \"4577fbb5-3556-446e-a088-0e3d5239f4ce\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:04.616736 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.616650 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4577fbb5-3556-446e-a088-0e3d5239f4ce-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz\" (UID: \"4577fbb5-3556-446e-a088-0e3d5239f4ce\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:04.616999 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.616972 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4577fbb5-3556-446e-a088-0e3d5239f4ce-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz\" (UID: \"4577fbb5-3556-446e-a088-0e3d5239f4ce\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:04.617199 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.617179 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4577fbb5-3556-446e-a088-0e3d5239f4ce-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz\" (UID: \"4577fbb5-3556-446e-a088-0e3d5239f4ce\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:04.619051 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.619032 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4577fbb5-3556-446e-a088-0e3d5239f4ce-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz\" (UID: \"4577fbb5-3556-446e-a088-0e3d5239f4ce\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:04.624931 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.624907 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p8fk\" (UniqueName: \"kubernetes.io/projected/4577fbb5-3556-446e-a088-0e3d5239f4ce-kube-api-access-5p8fk\") pod \"isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz\" (UID: \"4577fbb5-3556-446e-a088-0e3d5239f4ce\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:04.750187 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.750105 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:04.782234 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.782198 2571 generic.go:358] "Generic (PLEG): container finished" podID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerID="16252b8977d11b869896a63ab3d209f74c1bcba1ff9fe732f85d263df9a224a4" exitCode=2 Apr 28 19:25:04.782394 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.782266 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" event={"ID":"38e23382-eb2e-4fec-b84b-7cc873df2741","Type":"ContainerDied","Data":"16252b8977d11b869896a63ab3d209f74c1bcba1ff9fe732f85d263df9a224a4"} Apr 28 19:25:04.873725 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:04.873610 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz"] Apr 28 19:25:04.876052 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:25:04.876025 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4577fbb5_3556_446e_a088_0e3d5239f4ce.slice/crio-5d305f0d811b4106ddbf361f82182a63e0fe5ac146c1f0633552189f71962508 WatchSource:0}: Error finding container 5d305f0d811b4106ddbf361f82182a63e0fe5ac146c1f0633552189f71962508: Status 404 returned error can't find the container with id 5d305f0d811b4106ddbf361f82182a63e0fe5ac146c1f0633552189f71962508 Apr 28 19:25:05.788803 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:05.788763 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" event={"ID":"4577fbb5-3556-446e-a088-0e3d5239f4ce","Type":"ContainerStarted","Data":"9e11991440c2ee0ac45a958f3161172f872166c5d053b475d138a1cb0641abda"} Apr 28 19:25:05.788803 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:05.788806 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" event={"ID":"4577fbb5-3556-446e-a088-0e3d5239f4ce","Type":"ContainerStarted","Data":"5d305f0d811b4106ddbf361f82182a63e0fe5ac146c1f0633552189f71962508"} Apr 28 19:25:07.545036 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:07.544998 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection refused" Apr 28 19:25:08.800427 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:08.800396 2571 generic.go:358] "Generic (PLEG): container finished" podID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerID="b591db3bffb302cfc4675f486491e88f5f49298990078cdcb0e212a88b6fdfbb" exitCode=0 Apr 28 19:25:08.800863 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:08.800499 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" event={"ID":"38e23382-eb2e-4fec-b84b-7cc873df2741","Type":"ContainerDied","Data":"b591db3bffb302cfc4675f486491e88f5f49298990078cdcb0e212a88b6fdfbb"} Apr 28 19:25:08.801797 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:08.801777 2571 generic.go:358] "Generic (PLEG): container finished" podID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerID="9e11991440c2ee0ac45a958f3161172f872166c5d053b475d138a1cb0641abda" exitCode=0 Apr 28 19:25:08.801879 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:08.801825 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" event={"ID":"4577fbb5-3556-446e-a088-0e3d5239f4ce","Type":"ContainerDied","Data":"9e11991440c2ee0ac45a958f3161172f872166c5d053b475d138a1cb0641abda"} Apr 28 19:25:09.549810 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:09.549710 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 28 19:25:09.550020 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:09.549990 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:25:09.807453 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:09.807367 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" event={"ID":"4577fbb5-3556-446e-a088-0e3d5239f4ce","Type":"ContainerStarted","Data":"5e013f05abeebe0ea19899264b9d132aa79ea47ad36872e9fdbce69cde01d6f0"} Apr 28 19:25:09.807453 ip-10-0-139-128 
kubenswrapper[2571]: I0428 19:25:09.807405 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" event={"ID":"4577fbb5-3556-446e-a088-0e3d5239f4ce","Type":"ContainerStarted","Data":"33eb769a2eb45f55859c066a363f25431062bfba2e65849fc828e5b710ce0669"} Apr 28 19:25:09.807453 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:09.807415 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" event={"ID":"4577fbb5-3556-446e-a088-0e3d5239f4ce","Type":"ContainerStarted","Data":"47acd6252e4391ccfe5a3af66f4c1e6953d7ddade580febf9feb15fb9c01456d"} Apr 28 19:25:09.808040 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:09.807735 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:09.808040 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:09.807868 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:09.808040 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:09.807898 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:09.809418 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:09.809391 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused" Apr 28 19:25:09.810086 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:09.810063 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:25:09.829386 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:09.829346 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podStartSLOduration=5.829335068 podStartE2EDuration="5.829335068s" podCreationTimestamp="2026-04-28 19:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:25:09.828058051 +0000 UTC m=+526.458958997" watchObservedRunningTime="2026-04-28 19:25:09.829335068 +0000 UTC m=+526.460236012" Apr 28 19:25:10.810745 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:10.810701 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused" Apr 28 19:25:10.811183 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:10.811157 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:25:12.544423 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:12.544378 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection refused" Apr 28 
19:25:15.815077 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:15.815048 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:25:15.815670 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:15.815636 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused" Apr 28 19:25:15.816023 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:15.816001 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:25:17.544332 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:17.544292 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection refused" Apr 28 19:25:17.544693 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:17.544433 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:25:19.550420 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:19.550362 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 28 
19:25:19.550865 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:19.550729 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:25:22.544296 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:22.544254 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection refused" Apr 28 19:25:25.816068 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:25.816024 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused" Apr 28 19:25:25.816567 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:25.816452 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:25:27.544761 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:27.544719 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection refused" Apr 28 19:25:29.549800 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:29.549757 
2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 28 19:25:29.550266 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:29.549911 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:25:29.550266 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:29.550099 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:25:29.550266 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:29.550216 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:25:32.544490 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:32.544446 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection refused" Apr 28 19:25:34.278909 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.278886 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:25:34.368108 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.368065 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvftd\" (UniqueName: \"kubernetes.io/projected/38e23382-eb2e-4fec-b84b-7cc873df2741-kube-api-access-tvftd\") pod \"38e23382-eb2e-4fec-b84b-7cc873df2741\" (UID: \"38e23382-eb2e-4fec-b84b-7cc873df2741\") " Apr 28 19:25:34.368298 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.368132 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38e23382-eb2e-4fec-b84b-7cc873df2741-proxy-tls\") pod \"38e23382-eb2e-4fec-b84b-7cc873df2741\" (UID: \"38e23382-eb2e-4fec-b84b-7cc873df2741\") " Apr 28 19:25:34.368298 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.368156 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38e23382-eb2e-4fec-b84b-7cc873df2741-kserve-provision-location\") pod \"38e23382-eb2e-4fec-b84b-7cc873df2741\" (UID: \"38e23382-eb2e-4fec-b84b-7cc873df2741\") " Apr 28 19:25:34.368298 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.368238 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38e23382-eb2e-4fec-b84b-7cc873df2741-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"38e23382-eb2e-4fec-b84b-7cc873df2741\" (UID: \"38e23382-eb2e-4fec-b84b-7cc873df2741\") " Apr 28 19:25:34.368631 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.368600 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38e23382-eb2e-4fec-b84b-7cc873df2741-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"38e23382-eb2e-4fec-b84b-7cc873df2741" (UID: "38e23382-eb2e-4fec-b84b-7cc873df2741"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:25:34.368745 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.368635 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38e23382-eb2e-4fec-b84b-7cc873df2741-isvc-sklearn-batcher-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-kube-rbac-proxy-sar-config") pod "38e23382-eb2e-4fec-b84b-7cc873df2741" (UID: "38e23382-eb2e-4fec-b84b-7cc873df2741"). InnerVolumeSpecName "isvc-sklearn-batcher-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:25:34.370383 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.370358 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38e23382-eb2e-4fec-b84b-7cc873df2741-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "38e23382-eb2e-4fec-b84b-7cc873df2741" (UID: "38e23382-eb2e-4fec-b84b-7cc873df2741"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:25:34.370473 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.370457 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38e23382-eb2e-4fec-b84b-7cc873df2741-kube-api-access-tvftd" (OuterVolumeSpecName: "kube-api-access-tvftd") pod "38e23382-eb2e-4fec-b84b-7cc873df2741" (UID: "38e23382-eb2e-4fec-b84b-7cc873df2741"). InnerVolumeSpecName "kube-api-access-tvftd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:25:34.468870 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.468841 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38e23382-eb2e-4fec-b84b-7cc873df2741-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:25:34.468870 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.468867 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38e23382-eb2e-4fec-b84b-7cc873df2741-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:25:34.469127 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.468878 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38e23382-eb2e-4fec-b84b-7cc873df2741-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:25:34.469127 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.468889 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tvftd\" (UniqueName: \"kubernetes.io/projected/38e23382-eb2e-4fec-b84b-7cc873df2741-kube-api-access-tvftd\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:25:34.887521 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.887417 2571 generic.go:358] "Generic (PLEG): container finished" podID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerID="6e09d901d5fd2d824c47627e8315a2f26bdea21f66946c07bdbed81ecb7aefe0" exitCode=0 Apr 28 19:25:34.887521 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.887506 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" 
event={"ID":"38e23382-eb2e-4fec-b84b-7cc873df2741","Type":"ContainerDied","Data":"6e09d901d5fd2d824c47627e8315a2f26bdea21f66946c07bdbed81ecb7aefe0"} Apr 28 19:25:34.887697 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.887533 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" Apr 28 19:25:34.887697 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.887554 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52" event={"ID":"38e23382-eb2e-4fec-b84b-7cc873df2741","Type":"ContainerDied","Data":"5f2a3db30af6718a90a4f9da07fa045a73148d85391ec6fa0f6879cc862c0eec"} Apr 28 19:25:34.887697 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.887575 2571 scope.go:117] "RemoveContainer" containerID="6e09d901d5fd2d824c47627e8315a2f26bdea21f66946c07bdbed81ecb7aefe0" Apr 28 19:25:34.895982 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.895964 2571 scope.go:117] "RemoveContainer" containerID="16252b8977d11b869896a63ab3d209f74c1bcba1ff9fe732f85d263df9a224a4" Apr 28 19:25:34.902997 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.902977 2571 scope.go:117] "RemoveContainer" containerID="b591db3bffb302cfc4675f486491e88f5f49298990078cdcb0e212a88b6fdfbb" Apr 28 19:25:34.910325 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.910302 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52"] Apr 28 19:25:34.910569 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.910551 2571 scope.go:117] "RemoveContainer" containerID="750e04bcfc0e8ebb040b514bc25e1dbe04b2ab13bf8b91421d62cb903175e0b0" Apr 28 19:25:34.914123 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.914101 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-57fcff47c9-zrm52"] Apr 28 19:25:34.917931 ip-10-0-139-128 
kubenswrapper[2571]: I0428 19:25:34.917914 2571 scope.go:117] "RemoveContainer" containerID="6e09d901d5fd2d824c47627e8315a2f26bdea21f66946c07bdbed81ecb7aefe0" Apr 28 19:25:34.918184 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:25:34.918162 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e09d901d5fd2d824c47627e8315a2f26bdea21f66946c07bdbed81ecb7aefe0\": container with ID starting with 6e09d901d5fd2d824c47627e8315a2f26bdea21f66946c07bdbed81ecb7aefe0 not found: ID does not exist" containerID="6e09d901d5fd2d824c47627e8315a2f26bdea21f66946c07bdbed81ecb7aefe0" Apr 28 19:25:34.918235 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.918194 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e09d901d5fd2d824c47627e8315a2f26bdea21f66946c07bdbed81ecb7aefe0"} err="failed to get container status \"6e09d901d5fd2d824c47627e8315a2f26bdea21f66946c07bdbed81ecb7aefe0\": rpc error: code = NotFound desc = could not find container \"6e09d901d5fd2d824c47627e8315a2f26bdea21f66946c07bdbed81ecb7aefe0\": container with ID starting with 6e09d901d5fd2d824c47627e8315a2f26bdea21f66946c07bdbed81ecb7aefe0 not found: ID does not exist" Apr 28 19:25:34.918235 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.918211 2571 scope.go:117] "RemoveContainer" containerID="16252b8977d11b869896a63ab3d209f74c1bcba1ff9fe732f85d263df9a224a4" Apr 28 19:25:34.918458 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:25:34.918441 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16252b8977d11b869896a63ab3d209f74c1bcba1ff9fe732f85d263df9a224a4\": container with ID starting with 16252b8977d11b869896a63ab3d209f74c1bcba1ff9fe732f85d263df9a224a4 not found: ID does not exist" containerID="16252b8977d11b869896a63ab3d209f74c1bcba1ff9fe732f85d263df9a224a4" Apr 28 19:25:34.918549 ip-10-0-139-128 kubenswrapper[2571]: 
I0428 19:25:34.918491 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16252b8977d11b869896a63ab3d209f74c1bcba1ff9fe732f85d263df9a224a4"} err="failed to get container status \"16252b8977d11b869896a63ab3d209f74c1bcba1ff9fe732f85d263df9a224a4\": rpc error: code = NotFound desc = could not find container \"16252b8977d11b869896a63ab3d209f74c1bcba1ff9fe732f85d263df9a224a4\": container with ID starting with 16252b8977d11b869896a63ab3d209f74c1bcba1ff9fe732f85d263df9a224a4 not found: ID does not exist" Apr 28 19:25:34.918549 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.918515 2571 scope.go:117] "RemoveContainer" containerID="b591db3bffb302cfc4675f486491e88f5f49298990078cdcb0e212a88b6fdfbb" Apr 28 19:25:34.918762 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:25:34.918740 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b591db3bffb302cfc4675f486491e88f5f49298990078cdcb0e212a88b6fdfbb\": container with ID starting with b591db3bffb302cfc4675f486491e88f5f49298990078cdcb0e212a88b6fdfbb not found: ID does not exist" containerID="b591db3bffb302cfc4675f486491e88f5f49298990078cdcb0e212a88b6fdfbb" Apr 28 19:25:34.918854 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.918765 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b591db3bffb302cfc4675f486491e88f5f49298990078cdcb0e212a88b6fdfbb"} err="failed to get container status \"b591db3bffb302cfc4675f486491e88f5f49298990078cdcb0e212a88b6fdfbb\": rpc error: code = NotFound desc = could not find container \"b591db3bffb302cfc4675f486491e88f5f49298990078cdcb0e212a88b6fdfbb\": container with ID starting with b591db3bffb302cfc4675f486491e88f5f49298990078cdcb0e212a88b6fdfbb not found: ID does not exist" Apr 28 19:25:34.918854 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.918779 2571 scope.go:117] "RemoveContainer" 
containerID="750e04bcfc0e8ebb040b514bc25e1dbe04b2ab13bf8b91421d62cb903175e0b0" Apr 28 19:25:34.919007 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:25:34.918990 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"750e04bcfc0e8ebb040b514bc25e1dbe04b2ab13bf8b91421d62cb903175e0b0\": container with ID starting with 750e04bcfc0e8ebb040b514bc25e1dbe04b2ab13bf8b91421d62cb903175e0b0 not found: ID does not exist" containerID="750e04bcfc0e8ebb040b514bc25e1dbe04b2ab13bf8b91421d62cb903175e0b0" Apr 28 19:25:34.919047 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:34.919021 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"750e04bcfc0e8ebb040b514bc25e1dbe04b2ab13bf8b91421d62cb903175e0b0"} err="failed to get container status \"750e04bcfc0e8ebb040b514bc25e1dbe04b2ab13bf8b91421d62cb903175e0b0\": rpc error: code = NotFound desc = could not find container \"750e04bcfc0e8ebb040b514bc25e1dbe04b2ab13bf8b91421d62cb903175e0b0\": container with ID starting with 750e04bcfc0e8ebb040b514bc25e1dbe04b2ab13bf8b91421d62cb903175e0b0 not found: ID does not exist" Apr 28 19:25:35.815861 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:35.815813 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused" Apr 28 19:25:35.816301 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:35.816210 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:25:35.947088 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:35.947056 2571 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" path="/var/lib/kubelet/pods/38e23382-eb2e-4fec-b84b-7cc873df2741/volumes" Apr 28 19:25:45.815773 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:45.815731 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused" Apr 28 19:25:45.816221 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:45.816194 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:25:55.816011 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:55.815955 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused" Apr 28 19:25:55.816503 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:25:55.816452 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:26:05.816209 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:05.816166 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.26:5000: connect: connection refused" Apr 28 19:26:05.816651 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:05.816618 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:26:15.816111 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:15.816082 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:26:15.816577 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:15.816405 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:26:23.857451 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:23.857424 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:26:23.859016 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:23.858987 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:26:29.341001 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.340962 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j"] Apr 28 19:26:29.341387 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.341311 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kube-rbac-proxy" Apr 28 19:26:29.341387 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.341322 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" 
containerName="kube-rbac-proxy" Apr 28 19:26:29.341387 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.341336 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="agent" Apr 28 19:26:29.341387 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.341342 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="agent" Apr 28 19:26:29.341387 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.341351 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="storage-initializer" Apr 28 19:26:29.341387 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.341357 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="storage-initializer" Apr 28 19:26:29.341387 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.341369 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kserve-container" Apr 28 19:26:29.341387 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.341374 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kserve-container" Apr 28 19:26:29.341738 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.341425 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kube-rbac-proxy" Apr 28 19:26:29.341738 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.341433 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="kserve-container" Apr 28 19:26:29.341738 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.341442 2571 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="38e23382-eb2e-4fec-b84b-7cc873df2741" containerName="agent" Apr 28 19:26:29.344495 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.344465 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" Apr 28 19:26:29.346968 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.346945 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-kube-rbac-proxy-sar-config\"" Apr 28 19:26:29.347591 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.347567 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-predictor-serving-cert\"" Apr 28 19:26:29.356180 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.356158 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j"] Apr 28 19:26:29.424663 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.424632 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fece92c2-1be2-45ea-840e-11294a319313-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-x8c6j\" (UID: \"fece92c2-1be2-45ea-840e-11294a319313\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" Apr 28 19:26:29.424827 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.424675 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9qkf\" (UniqueName: \"kubernetes.io/projected/fece92c2-1be2-45ea-840e-11294a319313-kube-api-access-n9qkf\") pod \"message-dumper-predictor-c7d86bcbd-x8c6j\" (UID: \"fece92c2-1be2-45ea-840e-11294a319313\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" Apr 28 19:26:29.424827 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.424754 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fece92c2-1be2-45ea-840e-11294a319313-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-x8c6j\" (UID: \"fece92c2-1be2-45ea-840e-11294a319313\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" Apr 28 19:26:29.429447 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.429424 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz"] Apr 28 19:26:29.429840 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.429787 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kserve-container" containerID="cri-o://47acd6252e4391ccfe5a3af66f4c1e6953d7ddade580febf9feb15fb9c01456d" gracePeriod=30 Apr 28 19:26:29.429840 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.429817 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kube-rbac-proxy" containerID="cri-o://33eb769a2eb45f55859c066a363f25431062bfba2e65849fc828e5b710ce0669" gracePeriod=30 Apr 28 19:26:29.430028 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.429796 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="agent" containerID="cri-o://5e013f05abeebe0ea19899264b9d132aa79ea47ad36872e9fdbce69cde01d6f0" gracePeriod=30 Apr 28 19:26:29.525201 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.525165 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fece92c2-1be2-45ea-840e-11294a319313-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-x8c6j\" (UID: \"fece92c2-1be2-45ea-840e-11294a319313\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" Apr 28 19:26:29.525392 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.525256 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fece92c2-1be2-45ea-840e-11294a319313-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-x8c6j\" (UID: \"fece92c2-1be2-45ea-840e-11294a319313\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" Apr 28 19:26:29.525392 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.525293 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9qkf\" (UniqueName: \"kubernetes.io/projected/fece92c2-1be2-45ea-840e-11294a319313-kube-api-access-n9qkf\") pod \"message-dumper-predictor-c7d86bcbd-x8c6j\" (UID: \"fece92c2-1be2-45ea-840e-11294a319313\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" Apr 28 19:26:29.525546 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:26:29.525410 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/message-dumper-predictor-serving-cert: secret "message-dumper-predictor-serving-cert" not found Apr 28 19:26:29.525546 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:26:29.525511 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fece92c2-1be2-45ea-840e-11294a319313-proxy-tls podName:fece92c2-1be2-45ea-840e-11294a319313 nodeName:}" failed. No retries permitted until 2026-04-28 19:26:30.025469065 +0000 UTC m=+606.656369989 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/fece92c2-1be2-45ea-840e-11294a319313-proxy-tls") pod "message-dumper-predictor-c7d86bcbd-x8c6j" (UID: "fece92c2-1be2-45ea-840e-11294a319313") : secret "message-dumper-predictor-serving-cert" not found Apr 28 19:26:29.525888 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.525862 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fece92c2-1be2-45ea-840e-11294a319313-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-x8c6j\" (UID: \"fece92c2-1be2-45ea-840e-11294a319313\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" Apr 28 19:26:29.534082 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:29.534058 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9qkf\" (UniqueName: \"kubernetes.io/projected/fece92c2-1be2-45ea-840e-11294a319313-kube-api-access-n9qkf\") pod \"message-dumper-predictor-c7d86bcbd-x8c6j\" (UID: \"fece92c2-1be2-45ea-840e-11294a319313\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" Apr 28 19:26:30.029748 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:30.029701 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fece92c2-1be2-45ea-840e-11294a319313-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-x8c6j\" (UID: \"fece92c2-1be2-45ea-840e-11294a319313\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" Apr 28 19:26:30.032135 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:30.032110 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fece92c2-1be2-45ea-840e-11294a319313-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-x8c6j\" (UID: 
\"fece92c2-1be2-45ea-840e-11294a319313\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" Apr 28 19:26:30.059436 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:30.059409 2571 generic.go:358] "Generic (PLEG): container finished" podID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerID="33eb769a2eb45f55859c066a363f25431062bfba2e65849fc828e5b710ce0669" exitCode=2 Apr 28 19:26:30.059590 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:30.059506 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" event={"ID":"4577fbb5-3556-446e-a088-0e3d5239f4ce","Type":"ContainerDied","Data":"33eb769a2eb45f55859c066a363f25431062bfba2e65849fc828e5b710ce0669"} Apr 28 19:26:30.254884 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:30.254848 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" Apr 28 19:26:30.378316 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:30.378292 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j"] Apr 28 19:26:30.380469 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:26:30.380442 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfece92c2_1be2_45ea_840e_11294a319313.slice/crio-de524af153f35f95cc4136339426ab9e3e4b493a07aac1d8300605abc12c0cd4 WatchSource:0}: Error finding container de524af153f35f95cc4136339426ab9e3e4b493a07aac1d8300605abc12c0cd4: Status 404 returned error can't find the container with id de524af153f35f95cc4136339426ab9e3e4b493a07aac1d8300605abc12c0cd4 Apr 28 19:26:30.811527 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:30.811455 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" 
containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused" Apr 28 19:26:31.064091 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:31.064014 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" event={"ID":"fece92c2-1be2-45ea-840e-11294a319313","Type":"ContainerStarted","Data":"de524af153f35f95cc4136339426ab9e3e4b493a07aac1d8300605abc12c0cd4"} Apr 28 19:26:32.068134 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:32.068095 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" event={"ID":"fece92c2-1be2-45ea-840e-11294a319313","Type":"ContainerStarted","Data":"3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2"} Apr 28 19:26:32.068134 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:32.068133 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" event={"ID":"fece92c2-1be2-45ea-840e-11294a319313","Type":"ContainerStarted","Data":"9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff"} Apr 28 19:26:32.068573 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:32.068228 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" Apr 28 19:26:32.089910 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:32.089851 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" podStartSLOduration=1.8647749999999998 podStartE2EDuration="3.08982986s" podCreationTimestamp="2026-04-28 19:26:29 +0000 UTC" firstStartedPulling="2026-04-28 19:26:30.382086175 +0000 UTC m=+607.012987102" lastFinishedPulling="2026-04-28 19:26:31.607141039 +0000 UTC m=+608.238041962" observedRunningTime="2026-04-28 
19:26:32.087669572 +0000 UTC m=+608.718570517" watchObservedRunningTime="2026-04-28 19:26:32.08982986 +0000 UTC m=+608.720730806" Apr 28 19:26:33.071839 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:33.071803 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" Apr 28 19:26:33.073591 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:33.073571 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" Apr 28 19:26:34.076340 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:34.076308 2571 generic.go:358] "Generic (PLEG): container finished" podID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerID="47acd6252e4391ccfe5a3af66f4c1e6953d7ddade580febf9feb15fb9c01456d" exitCode=0 Apr 28 19:26:34.076826 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:34.076389 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" event={"ID":"4577fbb5-3556-446e-a088-0e3d5239f4ce","Type":"ContainerDied","Data":"47acd6252e4391ccfe5a3af66f4c1e6953d7ddade580febf9feb15fb9c01456d"} Apr 28 19:26:35.811560 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:35.811514 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused" Apr 28 19:26:35.815837 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:35.815806 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: 
connect: connection refused" Apr 28 19:26:35.816201 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:35.816180 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:26:40.085796 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:40.085722 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" Apr 28 19:26:40.811092 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:40.811054 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused" Apr 28 19:26:40.811275 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:40.811193 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" Apr 28 19:26:45.811224 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:45.811182 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused" Apr 28 19:26:45.815546 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:45.815511 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused" Apr 28 19:26:45.815831 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:45.815806 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:26:49.886118 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:49.886083 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"] Apr 28 19:26:49.889684 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:49.889667 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" Apr 28 19:26:49.891851 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:49.891830 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-predictor-serving-cert\"" Apr 28 19:26:49.891953 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:49.891831 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-kube-rbac-proxy-sar-config\"" Apr 28 19:26:49.899600 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:49.899573 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"] Apr 28 19:26:50.007424 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:50.007388 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5rh9\" (UniqueName: \"kubernetes.io/projected/dde95831-d877-4b72-a1c1-c6affe564d4a-kube-api-access-r5rh9\") pod \"isvc-logger-predictor-7d4db54646-dlmkf\" (UID: \"dde95831-d877-4b72-a1c1-c6affe564d4a\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" Apr 28 
19:26:50.007424 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:50.007432 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dde95831-d877-4b72-a1c1-c6affe564d4a-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-7d4db54646-dlmkf\" (UID: \"dde95831-d877-4b72-a1c1-c6affe564d4a\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" Apr 28 19:26:50.007657 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:50.007530 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dde95831-d877-4b72-a1c1-c6affe564d4a-kserve-provision-location\") pod \"isvc-logger-predictor-7d4db54646-dlmkf\" (UID: \"dde95831-d877-4b72-a1c1-c6affe564d4a\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" Apr 28 19:26:50.007657 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:50.007562 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dde95831-d877-4b72-a1c1-c6affe564d4a-proxy-tls\") pod \"isvc-logger-predictor-7d4db54646-dlmkf\" (UID: \"dde95831-d877-4b72-a1c1-c6affe564d4a\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" Apr 28 19:26:50.108698 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:50.108662 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5rh9\" (UniqueName: \"kubernetes.io/projected/dde95831-d877-4b72-a1c1-c6affe564d4a-kube-api-access-r5rh9\") pod \"isvc-logger-predictor-7d4db54646-dlmkf\" (UID: \"dde95831-d877-4b72-a1c1-c6affe564d4a\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" Apr 28 19:26:50.108698 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:50.108698 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dde95831-d877-4b72-a1c1-c6affe564d4a-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-7d4db54646-dlmkf\" (UID: \"dde95831-d877-4b72-a1c1-c6affe564d4a\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"
Apr 28 19:26:50.108952 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:50.108737 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dde95831-d877-4b72-a1c1-c6affe564d4a-kserve-provision-location\") pod \"isvc-logger-predictor-7d4db54646-dlmkf\" (UID: \"dde95831-d877-4b72-a1c1-c6affe564d4a\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"
Apr 28 19:26:50.108952 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:50.108760 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dde95831-d877-4b72-a1c1-c6affe564d4a-proxy-tls\") pod \"isvc-logger-predictor-7d4db54646-dlmkf\" (UID: \"dde95831-d877-4b72-a1c1-c6affe564d4a\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"
Apr 28 19:26:50.109210 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:50.109186 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dde95831-d877-4b72-a1c1-c6affe564d4a-kserve-provision-location\") pod \"isvc-logger-predictor-7d4db54646-dlmkf\" (UID: \"dde95831-d877-4b72-a1c1-c6affe564d4a\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"
Apr 28 19:26:50.109425 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:50.109396 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dde95831-d877-4b72-a1c1-c6affe564d4a-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-7d4db54646-dlmkf\" (UID: \"dde95831-d877-4b72-a1c1-c6affe564d4a\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"
Apr 28 19:26:50.111123 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:50.111103 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dde95831-d877-4b72-a1c1-c6affe564d4a-proxy-tls\") pod \"isvc-logger-predictor-7d4db54646-dlmkf\" (UID: \"dde95831-d877-4b72-a1c1-c6affe564d4a\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"
Apr 28 19:26:50.115991 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:50.115970 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5rh9\" (UniqueName: \"kubernetes.io/projected/dde95831-d877-4b72-a1c1-c6affe564d4a-kube-api-access-r5rh9\") pod \"isvc-logger-predictor-7d4db54646-dlmkf\" (UID: \"dde95831-d877-4b72-a1c1-c6affe564d4a\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"
Apr 28 19:26:50.206966 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:50.206932 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"
Apr 28 19:26:50.331284 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:50.331258 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"]
Apr 28 19:26:50.333935 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:26:50.333906 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddde95831_d877_4b72_a1c1_c6affe564d4a.slice/crio-44fe3d9e5bfbe879b6412ba87f75d8804d9b581e72b41a179084fffe28b1e571 WatchSource:0}: Error finding container 44fe3d9e5bfbe879b6412ba87f75d8804d9b581e72b41a179084fffe28b1e571: Status 404 returned error can't find the container with id 44fe3d9e5bfbe879b6412ba87f75d8804d9b581e72b41a179084fffe28b1e571
Apr 28 19:26:50.811378 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:50.811335 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused"
Apr 28 19:26:51.130079 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:51.129997 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" event={"ID":"dde95831-d877-4b72-a1c1-c6affe564d4a","Type":"ContainerStarted","Data":"5f2b8b2b760d0c3f062eca3348d07aeb73c5adf6381b24da0a07618ccddacc8e"}
Apr 28 19:26:51.130079 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:51.130031 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" event={"ID":"dde95831-d877-4b72-a1c1-c6affe564d4a","Type":"ContainerStarted","Data":"44fe3d9e5bfbe879b6412ba87f75d8804d9b581e72b41a179084fffe28b1e571"}
Apr 28 19:26:54.141175 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:54.141143 2571 generic.go:358] "Generic (PLEG): container finished" podID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerID="5f2b8b2b760d0c3f062eca3348d07aeb73c5adf6381b24da0a07618ccddacc8e" exitCode=0
Apr 28 19:26:54.141543 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:54.141203 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" event={"ID":"dde95831-d877-4b72-a1c1-c6affe564d4a","Type":"ContainerDied","Data":"5f2b8b2b760d0c3f062eca3348d07aeb73c5adf6381b24da0a07618ccddacc8e"}
Apr 28 19:26:55.146193 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:55.146152 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" event={"ID":"dde95831-d877-4b72-a1c1-c6affe564d4a","Type":"ContainerStarted","Data":"31fd1ee458f236f9595395b98084c06ebdc3042ec3d476c080959da45e5fc245"}
Apr 28 19:26:55.146598 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:55.146202 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" event={"ID":"dde95831-d877-4b72-a1c1-c6affe564d4a","Type":"ContainerStarted","Data":"ca5df778f702e080ab057c20f10beb858ddedc64b40f30b0722082302483427f"}
Apr 28 19:26:55.146598 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:55.146215 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" event={"ID":"dde95831-d877-4b72-a1c1-c6affe564d4a","Type":"ContainerStarted","Data":"4b8c8ff81a4a6cfab93481163ef0c304a43e909493fdf968e8025bdda10077ce"}
Apr 28 19:26:55.146687 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:55.146609 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"
Apr 28 19:26:55.146756 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:55.146736 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"
Apr 28 19:26:55.148089 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:55.148063 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 28 19:26:55.166344 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:55.166299 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podStartSLOduration=6.166285604 podStartE2EDuration="6.166285604s" podCreationTimestamp="2026-04-28 19:26:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:26:55.1648652 +0000 UTC m=+631.795766145" watchObservedRunningTime="2026-04-28 19:26:55.166285604 +0000 UTC m=+631.797186548"
Apr 28 19:26:55.810969 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:55.810929 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused"
Apr 28 19:26:55.816343 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:55.816317 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused"
Apr 28 19:26:55.816464 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:55.816451 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz"
Apr 28 19:26:55.816641 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:55.816611 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:26:55.816764 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:55.816736 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz"
Apr 28 19:26:56.149967 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:56.149875 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"
Apr 28 19:26:56.150337 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:56.149975 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 28 19:26:56.150951 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:56.150922 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:26:57.152583 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:57.152534 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 28 19:26:57.153014 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:57.152992 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:26:59.610852 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:59.610830 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz"
Apr 28 19:26:59.691462 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:59.691379 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4577fbb5-3556-446e-a088-0e3d5239f4ce-kserve-provision-location\") pod \"4577fbb5-3556-446e-a088-0e3d5239f4ce\" (UID: \"4577fbb5-3556-446e-a088-0e3d5239f4ce\") "
Apr 28 19:26:59.691624 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:59.691503 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4577fbb5-3556-446e-a088-0e3d5239f4ce-proxy-tls\") pod \"4577fbb5-3556-446e-a088-0e3d5239f4ce\" (UID: \"4577fbb5-3556-446e-a088-0e3d5239f4ce\") "
Apr 28 19:26:59.691624 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:59.691551 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4577fbb5-3556-446e-a088-0e3d5239f4ce-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"4577fbb5-3556-446e-a088-0e3d5239f4ce\" (UID: \"4577fbb5-3556-446e-a088-0e3d5239f4ce\") "
Apr 28 19:26:59.691731 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:59.691631 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p8fk\" (UniqueName: \"kubernetes.io/projected/4577fbb5-3556-446e-a088-0e3d5239f4ce-kube-api-access-5p8fk\") pod \"4577fbb5-3556-446e-a088-0e3d5239f4ce\" (UID: \"4577fbb5-3556-446e-a088-0e3d5239f4ce\") "
Apr 28 19:26:59.691801 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:59.691736 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4577fbb5-3556-446e-a088-0e3d5239f4ce-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4577fbb5-3556-446e-a088-0e3d5239f4ce" (UID: "4577fbb5-3556-446e-a088-0e3d5239f4ce"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:26:59.691955 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:59.691936 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4577fbb5-3556-446e-a088-0e3d5239f4ce-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:26:59.692016 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:59.691968 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4577fbb5-3556-446e-a088-0e3d5239f4ce-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config") pod "4577fbb5-3556-446e-a088-0e3d5239f4ce" (UID: "4577fbb5-3556-446e-a088-0e3d5239f4ce"). InnerVolumeSpecName "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:26:59.693710 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:59.693686 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4577fbb5-3556-446e-a088-0e3d5239f4ce-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4577fbb5-3556-446e-a088-0e3d5239f4ce" (UID: "4577fbb5-3556-446e-a088-0e3d5239f4ce"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:26:59.693767 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:59.693707 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4577fbb5-3556-446e-a088-0e3d5239f4ce-kube-api-access-5p8fk" (OuterVolumeSpecName: "kube-api-access-5p8fk") pod "4577fbb5-3556-446e-a088-0e3d5239f4ce" (UID: "4577fbb5-3556-446e-a088-0e3d5239f4ce"). InnerVolumeSpecName "kube-api-access-5p8fk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:26:59.792593 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:59.792543 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4577fbb5-3556-446e-a088-0e3d5239f4ce-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:26:59.792593 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:59.792587 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4577fbb5-3556-446e-a088-0e3d5239f4ce-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:26:59.792593 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:26:59.792598 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5p8fk\" (UniqueName: \"kubernetes.io/projected/4577fbb5-3556-446e-a088-0e3d5239f4ce-kube-api-access-5p8fk\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:27:00.163888 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.163774 2571 generic.go:358] "Generic (PLEG): container finished" podID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerID="5e013f05abeebe0ea19899264b9d132aa79ea47ad36872e9fdbce69cde01d6f0" exitCode=0
Apr 28 19:27:00.163888 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.163825 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" event={"ID":"4577fbb5-3556-446e-a088-0e3d5239f4ce","Type":"ContainerDied","Data":"5e013f05abeebe0ea19899264b9d132aa79ea47ad36872e9fdbce69cde01d6f0"}
Apr 28 19:27:00.163888 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.163872 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz" event={"ID":"4577fbb5-3556-446e-a088-0e3d5239f4ce","Type":"ContainerDied","Data":"5d305f0d811b4106ddbf361f82182a63e0fe5ac146c1f0633552189f71962508"}
Apr 28 19:27:00.163888 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.163885 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz"
Apr 28 19:27:00.187441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.163889 2571 scope.go:117] "RemoveContainer" containerID="5e013f05abeebe0ea19899264b9d132aa79ea47ad36872e9fdbce69cde01d6f0"
Apr 28 19:27:00.187441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.173846 2571 scope.go:117] "RemoveContainer" containerID="33eb769a2eb45f55859c066a363f25431062bfba2e65849fc828e5b710ce0669"
Apr 28 19:27:00.187441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.180152 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz"]
Apr 28 19:27:00.187441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.181952 2571 scope.go:117] "RemoveContainer" containerID="47acd6252e4391ccfe5a3af66f4c1e6953d7ddade580febf9feb15fb9c01456d"
Apr 28 19:27:00.187441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.183954 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7c59ff5d-hd5lz"]
Apr 28 19:27:00.191054 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.191039 2571 scope.go:117] "RemoveContainer" containerID="9e11991440c2ee0ac45a958f3161172f872166c5d053b475d138a1cb0641abda"
Apr 28 19:27:00.198342 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.198328 2571 scope.go:117] "RemoveContainer" containerID="5e013f05abeebe0ea19899264b9d132aa79ea47ad36872e9fdbce69cde01d6f0"
Apr 28 19:27:00.198701 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:27:00.198676 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e013f05abeebe0ea19899264b9d132aa79ea47ad36872e9fdbce69cde01d6f0\": container with ID starting with 5e013f05abeebe0ea19899264b9d132aa79ea47ad36872e9fdbce69cde01d6f0 not found: ID does not exist" containerID="5e013f05abeebe0ea19899264b9d132aa79ea47ad36872e9fdbce69cde01d6f0"
Apr 28 19:27:00.198774 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.198710 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e013f05abeebe0ea19899264b9d132aa79ea47ad36872e9fdbce69cde01d6f0"} err="failed to get container status \"5e013f05abeebe0ea19899264b9d132aa79ea47ad36872e9fdbce69cde01d6f0\": rpc error: code = NotFound desc = could not find container \"5e013f05abeebe0ea19899264b9d132aa79ea47ad36872e9fdbce69cde01d6f0\": container with ID starting with 5e013f05abeebe0ea19899264b9d132aa79ea47ad36872e9fdbce69cde01d6f0 not found: ID does not exist"
Apr 28 19:27:00.198774 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.198729 2571 scope.go:117] "RemoveContainer" containerID="33eb769a2eb45f55859c066a363f25431062bfba2e65849fc828e5b710ce0669"
Apr 28 19:27:00.198956 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:27:00.198940 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33eb769a2eb45f55859c066a363f25431062bfba2e65849fc828e5b710ce0669\": container with ID starting with 33eb769a2eb45f55859c066a363f25431062bfba2e65849fc828e5b710ce0669 not found: ID does not exist" containerID="33eb769a2eb45f55859c066a363f25431062bfba2e65849fc828e5b710ce0669"
Apr 28 19:27:00.199006 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.198960 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33eb769a2eb45f55859c066a363f25431062bfba2e65849fc828e5b710ce0669"} err="failed to get container status \"33eb769a2eb45f55859c066a363f25431062bfba2e65849fc828e5b710ce0669\": rpc error: code = NotFound desc = could not find container \"33eb769a2eb45f55859c066a363f25431062bfba2e65849fc828e5b710ce0669\": container with ID starting with 33eb769a2eb45f55859c066a363f25431062bfba2e65849fc828e5b710ce0669 not found: ID does not exist"
Apr 28 19:27:00.199006 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.198973 2571 scope.go:117] "RemoveContainer" containerID="47acd6252e4391ccfe5a3af66f4c1e6953d7ddade580febf9feb15fb9c01456d"
Apr 28 19:27:00.199214 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:27:00.199195 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47acd6252e4391ccfe5a3af66f4c1e6953d7ddade580febf9feb15fb9c01456d\": container with ID starting with 47acd6252e4391ccfe5a3af66f4c1e6953d7ddade580febf9feb15fb9c01456d not found: ID does not exist" containerID="47acd6252e4391ccfe5a3af66f4c1e6953d7ddade580febf9feb15fb9c01456d"
Apr 28 19:27:00.199260 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.199219 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47acd6252e4391ccfe5a3af66f4c1e6953d7ddade580febf9feb15fb9c01456d"} err="failed to get container status \"47acd6252e4391ccfe5a3af66f4c1e6953d7ddade580febf9feb15fb9c01456d\": rpc error: code = NotFound desc = could not find container \"47acd6252e4391ccfe5a3af66f4c1e6953d7ddade580febf9feb15fb9c01456d\": container with ID starting with 47acd6252e4391ccfe5a3af66f4c1e6953d7ddade580febf9feb15fb9c01456d not found: ID does not exist"
Apr 28 19:27:00.199260 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.199234 2571 scope.go:117] "RemoveContainer" containerID="9e11991440c2ee0ac45a958f3161172f872166c5d053b475d138a1cb0641abda"
Apr 28 19:27:00.199466 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:27:00.199448 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e11991440c2ee0ac45a958f3161172f872166c5d053b475d138a1cb0641abda\": container with ID starting with 9e11991440c2ee0ac45a958f3161172f872166c5d053b475d138a1cb0641abda not found: ID does not exist" containerID="9e11991440c2ee0ac45a958f3161172f872166c5d053b475d138a1cb0641abda"
Apr 28 19:27:00.199530 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:00.199472 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e11991440c2ee0ac45a958f3161172f872166c5d053b475d138a1cb0641abda"} err="failed to get container status \"9e11991440c2ee0ac45a958f3161172f872166c5d053b475d138a1cb0641abda\": rpc error: code = NotFound desc = could not find container \"9e11991440c2ee0ac45a958f3161172f872166c5d053b475d138a1cb0641abda\": container with ID starting with 9e11991440c2ee0ac45a958f3161172f872166c5d053b475d138a1cb0641abda not found: ID does not exist"
Apr 28 19:27:01.946545 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:01.946513 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" path="/var/lib/kubelet/pods/4577fbb5-3556-446e-a088-0e3d5239f4ce/volumes"
Apr 28 19:27:02.156555 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:02.156521 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"
Apr 28 19:27:02.157085 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:02.157050 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 28 19:27:02.157536 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:02.157508 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:27:12.157020 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:12.156975 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 28 19:27:12.157518 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:12.157401 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:27:22.157781 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:22.157736 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 28 19:27:22.158296 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:22.158269 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:27:32.157002 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:32.156951 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 28 19:27:32.157429 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:32.157339 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:27:42.157208 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:42.157161 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 28 19:27:42.157722 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:42.157692 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:27:52.156970 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:52.156930 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 28 19:27:52.157521 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:27:52.157298 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:28:02.158246 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:02.158206 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"
Apr 28 19:28:02.158678 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:02.158410 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"
Apr 28 19:28:13.924530 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:13.924498 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-c7d86bcbd-x8c6j_fece92c2-1be2-45ea-840e-11294a319313/kserve-container/0.log"
Apr 28 19:28:14.116302 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.116262 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j"]
Apr 28 19:28:14.116650 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.116617 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" podUID="fece92c2-1be2-45ea-840e-11294a319313" containerName="kserve-container" containerID="cri-o://9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff" gracePeriod=30
Apr 28 19:28:14.116793 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.116675 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" podUID="fece92c2-1be2-45ea-840e-11294a319313" containerName="kube-rbac-proxy" containerID="cri-o://3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2" gracePeriod=30
Apr 28 19:28:14.356982 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.356956 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j"
Apr 28 19:28:14.390204 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.390170 2571 generic.go:358] "Generic (PLEG): container finished" podID="fece92c2-1be2-45ea-840e-11294a319313" containerID="3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2" exitCode=2
Apr 28 19:28:14.390204 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.390200 2571 generic.go:358] "Generic (PLEG): container finished" podID="fece92c2-1be2-45ea-840e-11294a319313" containerID="9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff" exitCode=2
Apr 28 19:28:14.390433 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.390242 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j"
Apr 28 19:28:14.390433 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.390246 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" event={"ID":"fece92c2-1be2-45ea-840e-11294a319313","Type":"ContainerDied","Data":"3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2"}
Apr 28 19:28:14.390433 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.390288 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" event={"ID":"fece92c2-1be2-45ea-840e-11294a319313","Type":"ContainerDied","Data":"9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff"}
Apr 28 19:28:14.390433 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.390303 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j" event={"ID":"fece92c2-1be2-45ea-840e-11294a319313","Type":"ContainerDied","Data":"de524af153f35f95cc4136339426ab9e3e4b493a07aac1d8300605abc12c0cd4"}
Apr 28 19:28:14.390433 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.390321 2571 scope.go:117] "RemoveContainer" containerID="3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2"
Apr 28 19:28:14.398278 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.398255 2571 scope.go:117] "RemoveContainer" containerID="9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff"
Apr 28 19:28:14.405710 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.405692 2571 scope.go:117] "RemoveContainer" containerID="3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2"
Apr 28 19:28:14.406019 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:28:14.405990 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2\": container with ID starting with 3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2 not found: ID does not exist" containerID="3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2"
Apr 28 19:28:14.406095 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.406020 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2"} err="failed to get container status \"3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2\": rpc error: code = NotFound desc = could not find container \"3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2\": container with ID starting with 3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2 not found: ID does not exist"
Apr 28 19:28:14.406095 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.406040 2571 scope.go:117] "RemoveContainer" containerID="9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff"
Apr 28 19:28:14.406278 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:28:14.406264 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff\": container with ID starting with 9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff not found: ID does not exist" containerID="9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff"
Apr 28 19:28:14.406322 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.406281 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff"} err="failed to get container status \"9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff\": rpc error: code = NotFound desc = could not find container \"9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff\": container with ID starting with 9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff not found: ID does not exist"
Apr 28 19:28:14.406322 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.406293 2571 scope.go:117] "RemoveContainer" containerID="3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2"
Apr 28 19:28:14.406627 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.406594 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2"} err="failed to get container status \"3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2\": rpc error: code = NotFound desc = could not find container \"3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2\": container with ID starting with 3fd8aba4d7379afd31a885d461d94132ca4b23a3f87b88d4e136cb2372422cc2 not found: ID does not exist"
Apr 28 19:28:14.406710 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.406628 2571 scope.go:117] "RemoveContainer" containerID="9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff"
Apr 28 19:28:14.406876 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.406857 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff"} err="failed to get container status \"9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff\": rpc error: code = NotFound desc = could not find container \"9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff\": container with ID starting with 9b4692225e3fdc21e91c102fe981c7895889be3023b9e853965f1b820d7c99ff not found: ID does not exist"
Apr 28 19:28:14.431689 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.431660 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9qkf\" (UniqueName: \"kubernetes.io/projected/fece92c2-1be2-45ea-840e-11294a319313-kube-api-access-n9qkf\") pod \"fece92c2-1be2-45ea-840e-11294a319313\" (UID: \"fece92c2-1be2-45ea-840e-11294a319313\") "
Apr 28 19:28:14.431826 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.431753 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fece92c2-1be2-45ea-840e-11294a319313-message-dumper-kube-rbac-proxy-sar-config\") pod \"fece92c2-1be2-45ea-840e-11294a319313\" (UID: \"fece92c2-1be2-45ea-840e-11294a319313\") "
Apr 28 19:28:14.431826 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.431792 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fece92c2-1be2-45ea-840e-11294a319313-proxy-tls\") pod \"fece92c2-1be2-45ea-840e-11294a319313\" (UID: \"fece92c2-1be2-45ea-840e-11294a319313\") "
Apr 28 19:28:14.432216 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.432170 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fece92c2-1be2-45ea-840e-11294a319313-message-dumper-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-kube-rbac-proxy-sar-config") pod "fece92c2-1be2-45ea-840e-11294a319313" (UID: "fece92c2-1be2-45ea-840e-11294a319313"). InnerVolumeSpecName "message-dumper-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:28:14.433861 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.433836 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fece92c2-1be2-45ea-840e-11294a319313-kube-api-access-n9qkf" (OuterVolumeSpecName: "kube-api-access-n9qkf") pod "fece92c2-1be2-45ea-840e-11294a319313" (UID: "fece92c2-1be2-45ea-840e-11294a319313"). InnerVolumeSpecName "kube-api-access-n9qkf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:28:14.434042 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.434016 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fece92c2-1be2-45ea-840e-11294a319313-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fece92c2-1be2-45ea-840e-11294a319313" (UID: "fece92c2-1be2-45ea-840e-11294a319313"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:28:14.533105 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.533025 2571 reconciler_common.go:299] "Volume detached for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fece92c2-1be2-45ea-840e-11294a319313-message-dumper-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:28:14.533105 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.533054 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fece92c2-1be2-45ea-840e-11294a319313-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:28:14.533105 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.533065 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n9qkf\" (UniqueName: \"kubernetes.io/projected/fece92c2-1be2-45ea-840e-11294a319313-kube-api-access-n9qkf\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:28:14.613910 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.613873 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"] Apr 28 19:28:14.614318 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.614265 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kserve-container" containerID="cri-o://4b8c8ff81a4a6cfab93481163ef0c304a43e909493fdf968e8025bdda10077ce" gracePeriod=30 Apr 28 19:28:14.614318 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.614293 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kube-rbac-proxy" 
containerID="cri-o://ca5df778f702e080ab057c20f10beb858ddedc64b40f30b0722082302483427f" gracePeriod=30 Apr 28 19:28:14.614779 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.614306 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="agent" containerID="cri-o://31fd1ee458f236f9595395b98084c06ebdc3042ec3d476c080959da45e5fc245" gracePeriod=30 Apr 28 19:28:14.711397 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.711365 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j"] Apr 28 19:28:14.717047 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.717018 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-x8c6j"] Apr 28 19:28:14.834866 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.834784 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9"] Apr 28 19:28:14.835181 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.835163 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fece92c2-1be2-45ea-840e-11294a319313" containerName="kserve-container" Apr 28 19:28:14.835181 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.835180 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fece92c2-1be2-45ea-840e-11294a319313" containerName="kserve-container" Apr 28 19:28:14.835278 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.835189 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kserve-container" Apr 28 19:28:14.835278 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.835195 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" 
containerName="kserve-container" Apr 28 19:28:14.835278 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.835204 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="agent" Apr 28 19:28:14.835278 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.835210 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="agent" Apr 28 19:28:14.835278 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.835221 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="storage-initializer" Apr 28 19:28:14.835278 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.835226 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="storage-initializer" Apr 28 19:28:14.835278 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.835234 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fece92c2-1be2-45ea-840e-11294a319313" containerName="kube-rbac-proxy" Apr 28 19:28:14.835278 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.835239 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fece92c2-1be2-45ea-840e-11294a319313" containerName="kube-rbac-proxy" Apr 28 19:28:14.835278 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.835253 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kube-rbac-proxy" Apr 28 19:28:14.835278 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.835258 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kube-rbac-proxy" Apr 28 19:28:14.835601 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.835306 2571 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="fece92c2-1be2-45ea-840e-11294a319313" containerName="kube-rbac-proxy" Apr 28 19:28:14.835601 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.835315 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kserve-container" Apr 28 19:28:14.835601 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.835322 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="kube-rbac-proxy" Apr 28 19:28:14.835601 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.835328 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="4577fbb5-3556-446e-a088-0e3d5239f4ce" containerName="agent" Apr 28 19:28:14.835601 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.835334 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="fece92c2-1be2-45ea-840e-11294a319313" containerName="kserve-container" Apr 28 19:28:14.838562 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.838545 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:28:14.840789 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.840766 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-kube-rbac-proxy-sar-config\"" Apr 28 19:28:14.840789 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.840781 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-predictor-serving-cert\"" Apr 28 19:28:14.848904 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.848882 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9"] Apr 28 19:28:14.935695 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.935654 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5hrp\" (UniqueName: \"kubernetes.io/projected/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-kube-api-access-m5hrp\") pod \"isvc-lightgbm-predictor-bdf964bd-ltxg9\" (UID: \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:28:14.935695 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.935696 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-ltxg9\" (UID: \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:28:14.936100 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.935795 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-ltxg9\" (UID: \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:28:14.936100 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:14.935816 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-ltxg9\" (UID: \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:28:15.037076 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:15.037035 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-ltxg9\" (UID: \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:28:15.037076 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:15.037073 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-ltxg9\" (UID: \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:28:15.037307 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:15.037098 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5hrp\" (UniqueName: \"kubernetes.io/projected/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-kube-api-access-m5hrp\") pod \"isvc-lightgbm-predictor-bdf964bd-ltxg9\" (UID: 
\"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:28:15.037307 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:15.037128 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-ltxg9\" (UID: \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:28:15.037546 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:15.037527 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-ltxg9\" (UID: \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:28:15.037877 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:15.037859 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-ltxg9\" (UID: \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:28:15.039537 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:15.039518 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-ltxg9\" (UID: \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:28:15.045723 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:15.045705 
2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5hrp\" (UniqueName: \"kubernetes.io/projected/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-kube-api-access-m5hrp\") pod \"isvc-lightgbm-predictor-bdf964bd-ltxg9\" (UID: \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:28:15.149654 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:15.149624 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:28:15.268752 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:15.268713 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9"] Apr 28 19:28:15.271687 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:28:15.271660 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29fd42c4_8bbe_4bea_b08f_b4e9dfffd607.slice/crio-0361a7a2ebf7629838ff076eb6296ec3b718d3228ff23c37d5051c79885c6b73 WatchSource:0}: Error finding container 0361a7a2ebf7629838ff076eb6296ec3b718d3228ff23c37d5051c79885c6b73: Status 404 returned error can't find the container with id 0361a7a2ebf7629838ff076eb6296ec3b718d3228ff23c37d5051c79885c6b73 Apr 28 19:28:15.273549 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:15.273527 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:28:15.396820 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:15.396732 2571 generic.go:358] "Generic (PLEG): container finished" podID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerID="ca5df778f702e080ab057c20f10beb858ddedc64b40f30b0722082302483427f" exitCode=2 Apr 28 19:28:15.396820 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:15.396800 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" event={"ID":"dde95831-d877-4b72-a1c1-c6affe564d4a","Type":"ContainerDied","Data":"ca5df778f702e080ab057c20f10beb858ddedc64b40f30b0722082302483427f"} Apr 28 19:28:15.398026 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:15.398001 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" event={"ID":"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607","Type":"ContainerStarted","Data":"5f53f9a25cc99659e4524b94d594ed4b3c124efc697d4779cc113bcaa2e72c2d"} Apr 28 19:28:15.398123 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:15.398031 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" event={"ID":"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607","Type":"ContainerStarted","Data":"0361a7a2ebf7629838ff076eb6296ec3b718d3228ff23c37d5051c79885c6b73"} Apr 28 19:28:15.947200 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:15.947165 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fece92c2-1be2-45ea-840e-11294a319313" path="/var/lib/kubelet/pods/fece92c2-1be2-45ea-840e-11294a319313/volumes" Apr 28 19:28:17.153213 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:17.153171 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 28 19:28:19.412111 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:19.412075 2571 generic.go:358] "Generic (PLEG): container finished" podID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerID="4b8c8ff81a4a6cfab93481163ef0c304a43e909493fdf968e8025bdda10077ce" exitCode=0 Apr 28 19:28:19.412546 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:19.412142 2571 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" event={"ID":"dde95831-d877-4b72-a1c1-c6affe564d4a","Type":"ContainerDied","Data":"4b8c8ff81a4a6cfab93481163ef0c304a43e909493fdf968e8025bdda10077ce"} Apr 28 19:28:19.413398 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:19.413376 2571 generic.go:358] "Generic (PLEG): container finished" podID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerID="5f53f9a25cc99659e4524b94d594ed4b3c124efc697d4779cc113bcaa2e72c2d" exitCode=0 Apr 28 19:28:19.413520 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:19.413431 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" event={"ID":"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607","Type":"ContainerDied","Data":"5f53f9a25cc99659e4524b94d594ed4b3c124efc697d4779cc113bcaa2e72c2d"} Apr 28 19:28:22.153396 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:22.153350 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 28 19:28:22.157696 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:22.157661 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 28 19:28:22.158031 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:22.158000 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:28:27.153589 
ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:27.153541 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 28 19:28:27.153993 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:27.153694 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" Apr 28 19:28:27.452822 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:27.452738 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" event={"ID":"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607","Type":"ContainerStarted","Data":"962e30865e7b2e9fceac4077bf385bb785407086654b9e3bd5de9c2f29c4935c"} Apr 28 19:28:27.452822 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:27.452773 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" event={"ID":"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607","Type":"ContainerStarted","Data":"ae68e05c9adae1a7373d799f5813e23e030c0d2cb673266a10f4dfcba5036fdd"} Apr 28 19:28:27.453074 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:27.453057 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:28:27.453228 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:27.453188 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:28:27.454350 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:27.454329 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" 
podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:28:27.471158 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:27.471115 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" podStartSLOduration=6.437287247 podStartE2EDuration="13.471103649s" podCreationTimestamp="2026-04-28 19:28:14 +0000 UTC" firstStartedPulling="2026-04-28 19:28:19.4147121 +0000 UTC m=+716.045613024" lastFinishedPulling="2026-04-28 19:28:26.448528503 +0000 UTC m=+723.079429426" observedRunningTime="2026-04-28 19:28:27.469914804 +0000 UTC m=+724.100815749" watchObservedRunningTime="2026-04-28 19:28:27.471103649 +0000 UTC m=+724.102004593" Apr 28 19:28:28.456380 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:28.456341 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:28:32.153660 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:32.153618 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 28 19:28:32.156944 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:32.156916 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 28 19:28:32.157239 
ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:32.157211 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:28:33.460429 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:33.460401 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:28:33.460953 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:33.460922 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:28:37.152848 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:37.152803 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 28 19:28:42.153081 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:42.153039 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 28 19:28:42.157376 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:42.157349 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 28 19:28:42.157473 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:42.157459 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" Apr 28 19:28:42.157726 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:42.157703 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:28:42.157829 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:42.157818 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" Apr 28 19:28:43.461578 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:43.461541 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:28:45.261791 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.261769 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" Apr 28 19:28:45.399243 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.399148 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dde95831-d877-4b72-a1c1-c6affe564d4a-isvc-logger-kube-rbac-proxy-sar-config\") pod \"dde95831-d877-4b72-a1c1-c6affe564d4a\" (UID: \"dde95831-d877-4b72-a1c1-c6affe564d4a\") " Apr 28 19:28:45.399243 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.399208 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5rh9\" (UniqueName: \"kubernetes.io/projected/dde95831-d877-4b72-a1c1-c6affe564d4a-kube-api-access-r5rh9\") pod \"dde95831-d877-4b72-a1c1-c6affe564d4a\" (UID: \"dde95831-d877-4b72-a1c1-c6affe564d4a\") " Apr 28 19:28:45.399469 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.399265 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dde95831-d877-4b72-a1c1-c6affe564d4a-proxy-tls\") pod \"dde95831-d877-4b72-a1c1-c6affe564d4a\" (UID: \"dde95831-d877-4b72-a1c1-c6affe564d4a\") " Apr 28 19:28:45.399469 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.399340 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dde95831-d877-4b72-a1c1-c6affe564d4a-kserve-provision-location\") pod \"dde95831-d877-4b72-a1c1-c6affe564d4a\" (UID: \"dde95831-d877-4b72-a1c1-c6affe564d4a\") " Apr 28 19:28:45.399607 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.399542 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dde95831-d877-4b72-a1c1-c6affe564d4a-isvc-logger-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-kube-rbac-proxy-sar-config") pod 
"dde95831-d877-4b72-a1c1-c6affe564d4a" (UID: "dde95831-d877-4b72-a1c1-c6affe564d4a"). InnerVolumeSpecName "isvc-logger-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:28:45.399659 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.399619 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dde95831-d877-4b72-a1c1-c6affe564d4a-isvc-logger-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:28:45.399770 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.399737 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dde95831-d877-4b72-a1c1-c6affe564d4a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dde95831-d877-4b72-a1c1-c6affe564d4a" (UID: "dde95831-d877-4b72-a1c1-c6affe564d4a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:28:45.401315 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.401294 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dde95831-d877-4b72-a1c1-c6affe564d4a-kube-api-access-r5rh9" (OuterVolumeSpecName: "kube-api-access-r5rh9") pod "dde95831-d877-4b72-a1c1-c6affe564d4a" (UID: "dde95831-d877-4b72-a1c1-c6affe564d4a"). InnerVolumeSpecName "kube-api-access-r5rh9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:28:45.401392 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.401324 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dde95831-d877-4b72-a1c1-c6affe564d4a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dde95831-d877-4b72-a1c1-c6affe564d4a" (UID: "dde95831-d877-4b72-a1c1-c6affe564d4a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:28:45.500979 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.500940 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dde95831-d877-4b72-a1c1-c6affe564d4a-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:28:45.500979 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.500973 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r5rh9\" (UniqueName: \"kubernetes.io/projected/dde95831-d877-4b72-a1c1-c6affe564d4a-kube-api-access-r5rh9\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:28:45.500979 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.500985 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dde95831-d877-4b72-a1c1-c6affe564d4a-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:28:45.509113 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.509082 2571 generic.go:358] "Generic (PLEG): container finished" podID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerID="31fd1ee458f236f9595395b98084c06ebdc3042ec3d476c080959da45e5fc245" exitCode=0 Apr 28 19:28:45.509255 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.509142 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" event={"ID":"dde95831-d877-4b72-a1c1-c6affe564d4a","Type":"ContainerDied","Data":"31fd1ee458f236f9595395b98084c06ebdc3042ec3d476c080959da45e5fc245"} Apr 28 19:28:45.509255 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.509167 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" Apr 28 19:28:45.509255 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.509179 2571 scope.go:117] "RemoveContainer" containerID="31fd1ee458f236f9595395b98084c06ebdc3042ec3d476c080959da45e5fc245" Apr 28 19:28:45.509393 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.509169 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf" event={"ID":"dde95831-d877-4b72-a1c1-c6affe564d4a","Type":"ContainerDied","Data":"44fe3d9e5bfbe879b6412ba87f75d8804d9b581e72b41a179084fffe28b1e571"} Apr 28 19:28:45.517503 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.517470 2571 scope.go:117] "RemoveContainer" containerID="ca5df778f702e080ab057c20f10beb858ddedc64b40f30b0722082302483427f" Apr 28 19:28:45.524650 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.524627 2571 scope.go:117] "RemoveContainer" containerID="4b8c8ff81a4a6cfab93481163ef0c304a43e909493fdf968e8025bdda10077ce" Apr 28 19:28:45.530244 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.530217 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"] Apr 28 19:28:45.532622 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.532597 2571 scope.go:117] "RemoveContainer" containerID="5f2b8b2b760d0c3f062eca3348d07aeb73c5adf6381b24da0a07618ccddacc8e" Apr 28 19:28:45.534707 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.534687 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7d4db54646-dlmkf"] Apr 28 19:28:45.539840 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.539812 2571 scope.go:117] "RemoveContainer" containerID="31fd1ee458f236f9595395b98084c06ebdc3042ec3d476c080959da45e5fc245" Apr 28 19:28:45.540094 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:28:45.540077 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"31fd1ee458f236f9595395b98084c06ebdc3042ec3d476c080959da45e5fc245\": container with ID starting with 31fd1ee458f236f9595395b98084c06ebdc3042ec3d476c080959da45e5fc245 not found: ID does not exist" containerID="31fd1ee458f236f9595395b98084c06ebdc3042ec3d476c080959da45e5fc245" Apr 28 19:28:45.540143 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.540104 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31fd1ee458f236f9595395b98084c06ebdc3042ec3d476c080959da45e5fc245"} err="failed to get container status \"31fd1ee458f236f9595395b98084c06ebdc3042ec3d476c080959da45e5fc245\": rpc error: code = NotFound desc = could not find container \"31fd1ee458f236f9595395b98084c06ebdc3042ec3d476c080959da45e5fc245\": container with ID starting with 31fd1ee458f236f9595395b98084c06ebdc3042ec3d476c080959da45e5fc245 not found: ID does not exist" Apr 28 19:28:45.540143 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.540120 2571 scope.go:117] "RemoveContainer" containerID="ca5df778f702e080ab057c20f10beb858ddedc64b40f30b0722082302483427f" Apr 28 19:28:45.540340 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:28:45.540325 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca5df778f702e080ab057c20f10beb858ddedc64b40f30b0722082302483427f\": container with ID starting with ca5df778f702e080ab057c20f10beb858ddedc64b40f30b0722082302483427f not found: ID does not exist" containerID="ca5df778f702e080ab057c20f10beb858ddedc64b40f30b0722082302483427f" Apr 28 19:28:45.540380 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.540343 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca5df778f702e080ab057c20f10beb858ddedc64b40f30b0722082302483427f"} err="failed to get container status \"ca5df778f702e080ab057c20f10beb858ddedc64b40f30b0722082302483427f\": rpc error: code = NotFound desc 
= could not find container \"ca5df778f702e080ab057c20f10beb858ddedc64b40f30b0722082302483427f\": container with ID starting with ca5df778f702e080ab057c20f10beb858ddedc64b40f30b0722082302483427f not found: ID does not exist" Apr 28 19:28:45.540380 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.540354 2571 scope.go:117] "RemoveContainer" containerID="4b8c8ff81a4a6cfab93481163ef0c304a43e909493fdf968e8025bdda10077ce" Apr 28 19:28:45.540546 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:28:45.540532 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b8c8ff81a4a6cfab93481163ef0c304a43e909493fdf968e8025bdda10077ce\": container with ID starting with 4b8c8ff81a4a6cfab93481163ef0c304a43e909493fdf968e8025bdda10077ce not found: ID does not exist" containerID="4b8c8ff81a4a6cfab93481163ef0c304a43e909493fdf968e8025bdda10077ce" Apr 28 19:28:45.540583 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.540549 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b8c8ff81a4a6cfab93481163ef0c304a43e909493fdf968e8025bdda10077ce"} err="failed to get container status \"4b8c8ff81a4a6cfab93481163ef0c304a43e909493fdf968e8025bdda10077ce\": rpc error: code = NotFound desc = could not find container \"4b8c8ff81a4a6cfab93481163ef0c304a43e909493fdf968e8025bdda10077ce\": container with ID starting with 4b8c8ff81a4a6cfab93481163ef0c304a43e909493fdf968e8025bdda10077ce not found: ID does not exist" Apr 28 19:28:45.540583 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.540561 2571 scope.go:117] "RemoveContainer" containerID="5f2b8b2b760d0c3f062eca3348d07aeb73c5adf6381b24da0a07618ccddacc8e" Apr 28 19:28:45.540791 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:28:45.540775 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f2b8b2b760d0c3f062eca3348d07aeb73c5adf6381b24da0a07618ccddacc8e\": 
container with ID starting with 5f2b8b2b760d0c3f062eca3348d07aeb73c5adf6381b24da0a07618ccddacc8e not found: ID does not exist" containerID="5f2b8b2b760d0c3f062eca3348d07aeb73c5adf6381b24da0a07618ccddacc8e" Apr 28 19:28:45.540839 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.540793 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f2b8b2b760d0c3f062eca3348d07aeb73c5adf6381b24da0a07618ccddacc8e"} err="failed to get container status \"5f2b8b2b760d0c3f062eca3348d07aeb73c5adf6381b24da0a07618ccddacc8e\": rpc error: code = NotFound desc = could not find container \"5f2b8b2b760d0c3f062eca3348d07aeb73c5adf6381b24da0a07618ccddacc8e\": container with ID starting with 5f2b8b2b760d0c3f062eca3348d07aeb73c5adf6381b24da0a07618ccddacc8e not found: ID does not exist" Apr 28 19:28:45.947092 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:45.947057 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" path="/var/lib/kubelet/pods/dde95831-d877-4b72-a1c1-c6affe564d4a/volumes" Apr 28 19:28:53.461758 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:28:53.461723 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:29:03.461381 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:03.461342 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:29:13.461860 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:13.461812 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:29:23.461244 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:23.461209 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:29:33.461021 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:33.460977 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:29:40.942969 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:40.942866 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:29:50.943667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:50.943633 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:29:54.924510 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:54.924460 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9"] Apr 28 19:29:54.924882 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:54.924847 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" 
podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kserve-container" containerID="cri-o://ae68e05c9adae1a7373d799f5813e23e030c0d2cb673266a10f4dfcba5036fdd" gracePeriod=30 Apr 28 19:29:54.924944 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:54.924879 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kube-rbac-proxy" containerID="cri-o://962e30865e7b2e9fceac4077bf385bb785407086654b9e3bd5de9c2f29c4935c" gracePeriod=30 Apr 28 19:29:55.134635 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.134597 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9"] Apr 28 19:29:55.135130 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.135111 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kube-rbac-proxy" Apr 28 19:29:55.135172 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.135136 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kube-rbac-proxy" Apr 28 19:29:55.135172 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.135155 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kserve-container" Apr 28 19:29:55.135172 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.135164 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kserve-container" Apr 28 19:29:55.135263 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.135178 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="storage-initializer" Apr 28 19:29:55.135263 ip-10-0-139-128 kubenswrapper[2571]: I0428 
19:29:55.135190 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="storage-initializer" Apr 28 19:29:55.135263 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.135215 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="agent" Apr 28 19:29:55.135263 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.135223 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="agent" Apr 28 19:29:55.135376 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.135300 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kube-rbac-proxy" Apr 28 19:29:55.135376 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.135314 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="agent" Apr 28 19:29:55.135376 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.135329 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="dde95831-d877-4b72-a1c1-c6affe564d4a" containerName="kserve-container" Apr 28 19:29:55.138744 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.138727 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:29:55.140862 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.140836 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-predictor-serving-cert\"" Apr 28 19:29:55.140981 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.140862 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\"" Apr 28 19:29:55.146625 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.146603 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9"] Apr 28 19:29:55.176177 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.176111 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzhnm\" (UniqueName: \"kubernetes.io/projected/7320bf13-d054-4139-ad23-33215e059411-kube-api-access-kzhnm\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9\" (UID: \"7320bf13-d054-4139-ad23-33215e059411\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:29:55.176177 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.176144 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7320bf13-d054-4139-ad23-33215e059411-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9\" (UID: \"7320bf13-d054-4139-ad23-33215e059411\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:29:55.176177 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.176174 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7320bf13-d054-4139-ad23-33215e059411-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9\" (UID: \"7320bf13-d054-4139-ad23-33215e059411\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:29:55.176368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.176240 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7320bf13-d054-4139-ad23-33215e059411-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9\" (UID: \"7320bf13-d054-4139-ad23-33215e059411\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:29:55.276809 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.276776 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzhnm\" (UniqueName: \"kubernetes.io/projected/7320bf13-d054-4139-ad23-33215e059411-kube-api-access-kzhnm\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9\" (UID: \"7320bf13-d054-4139-ad23-33215e059411\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:29:55.276939 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.276821 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7320bf13-d054-4139-ad23-33215e059411-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9\" (UID: \"7320bf13-d054-4139-ad23-33215e059411\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:29:55.276939 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.276857 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/7320bf13-d054-4139-ad23-33215e059411-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9\" (UID: \"7320bf13-d054-4139-ad23-33215e059411\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:29:55.276939 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.276888 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7320bf13-d054-4139-ad23-33215e059411-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9\" (UID: \"7320bf13-d054-4139-ad23-33215e059411\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:29:55.277347 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.277320 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7320bf13-d054-4139-ad23-33215e059411-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9\" (UID: \"7320bf13-d054-4139-ad23-33215e059411\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:29:55.277581 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.277563 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7320bf13-d054-4139-ad23-33215e059411-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9\" (UID: \"7320bf13-d054-4139-ad23-33215e059411\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:29:55.279301 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.279279 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7320bf13-d054-4139-ad23-33215e059411-proxy-tls\") pod 
\"isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9\" (UID: \"7320bf13-d054-4139-ad23-33215e059411\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:29:55.284442 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.284416 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzhnm\" (UniqueName: \"kubernetes.io/projected/7320bf13-d054-4139-ad23-33215e059411-kube-api-access-kzhnm\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9\" (UID: \"7320bf13-d054-4139-ad23-33215e059411\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:29:55.449894 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.449804 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:29:55.573244 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.573220 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9"] Apr 28 19:29:55.575243 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:29:55.575217 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7320bf13_d054_4139_ad23_33215e059411.slice/crio-6f0e39f51963469a0d729ed11fcc65b182b66c4cdde68138ad9cebe3f9fdc98e WatchSource:0}: Error finding container 6f0e39f51963469a0d729ed11fcc65b182b66c4cdde68138ad9cebe3f9fdc98e: Status 404 returned error can't find the container with id 6f0e39f51963469a0d729ed11fcc65b182b66c4cdde68138ad9cebe3f9fdc98e Apr 28 19:29:55.724447 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.724354 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" 
event={"ID":"7320bf13-d054-4139-ad23-33215e059411","Type":"ContainerStarted","Data":"6e72366576ed5be55326a31640ce8ab24a22f72f7fd748d68664b418028f0d3d"} Apr 28 19:29:55.724447 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.724402 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" event={"ID":"7320bf13-d054-4139-ad23-33215e059411","Type":"ContainerStarted","Data":"6f0e39f51963469a0d729ed11fcc65b182b66c4cdde68138ad9cebe3f9fdc98e"} Apr 28 19:29:55.726348 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.726322 2571 generic.go:358] "Generic (PLEG): container finished" podID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerID="962e30865e7b2e9fceac4077bf385bb785407086654b9e3bd5de9c2f29c4935c" exitCode=2 Apr 28 19:29:55.726461 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:55.726384 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" event={"ID":"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607","Type":"ContainerDied","Data":"962e30865e7b2e9fceac4077bf385bb785407086654b9e3bd5de9c2f29c4935c"} Apr 28 19:29:58.457531 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:58.457456 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.29:8643/healthz\": dial tcp 10.134.0.29:8643: connect: connection refused" Apr 28 19:29:59.741406 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:59.741377 2571 generic.go:358] "Generic (PLEG): container finished" podID="7320bf13-d054-4139-ad23-33215e059411" containerID="6e72366576ed5be55326a31640ce8ab24a22f72f7fd748d68664b418028f0d3d" exitCode=0 Apr 28 19:29:59.741750 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:59.741446 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" event={"ID":"7320bf13-d054-4139-ad23-33215e059411","Type":"ContainerDied","Data":"6e72366576ed5be55326a31640ce8ab24a22f72f7fd748d68664b418028f0d3d"} Apr 28 19:29:59.743525 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:59.743502 2571 generic.go:358] "Generic (PLEG): container finished" podID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerID="ae68e05c9adae1a7373d799f5813e23e030c0d2cb673266a10f4dfcba5036fdd" exitCode=0 Apr 28 19:29:59.743625 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:59.743579 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" event={"ID":"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607","Type":"ContainerDied","Data":"ae68e05c9adae1a7373d799f5813e23e030c0d2cb673266a10f4dfcba5036fdd"} Apr 28 19:29:59.878747 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:59.878723 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:29:59.910531 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:59.910473 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-kserve-provision-location\") pod \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\" (UID: \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\") " Apr 28 19:29:59.910675 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:59.910584 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-proxy-tls\") pod \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\" (UID: \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\") " Apr 28 19:29:59.910675 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:59.910621 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\" (UID: \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\") " Apr 28 19:29:59.910675 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:59.910670 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5hrp\" (UniqueName: \"kubernetes.io/projected/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-kube-api-access-m5hrp\") pod \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\" (UID: \"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607\") " Apr 28 19:29:59.910826 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:59.910783 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" (UID: "29fd42c4-8bbe-4bea-b08f-b4e9dfffd607"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:29:59.911018 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:59.910995 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:29:59.911124 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:59.911015 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-isvc-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-kube-rbac-proxy-sar-config") pod "29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" (UID: "29fd42c4-8bbe-4bea-b08f-b4e9dfffd607"). InnerVolumeSpecName "isvc-lightgbm-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:29:59.912737 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:59.912704 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-kube-api-access-m5hrp" (OuterVolumeSpecName: "kube-api-access-m5hrp") pod "29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" (UID: "29fd42c4-8bbe-4bea-b08f-b4e9dfffd607"). InnerVolumeSpecName "kube-api-access-m5hrp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:29:59.913061 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:29:59.913044 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" (UID: "29fd42c4-8bbe-4bea-b08f-b4e9dfffd607"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:30:00.011462 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:00.011373 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:30:00.011462 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:00.011403 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-isvc-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:30:00.011462 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:00.011414 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5hrp\" (UniqueName: \"kubernetes.io/projected/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607-kube-api-access-m5hrp\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:30:00.748625 
ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:00.748574 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" event={"ID":"7320bf13-d054-4139-ad23-33215e059411","Type":"ContainerStarted","Data":"0d89200060f370e9a24b89239625af6141d021160867f0a23665192d2b5b6317"} Apr 28 19:30:00.748625 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:00.748624 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" event={"ID":"7320bf13-d054-4139-ad23-33215e059411","Type":"ContainerStarted","Data":"e00b8eaa2a5e965dff4f5978305185eb48895c85773a4b6ecd4a64eb86caf51a"} Apr 28 19:30:00.749140 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:00.748892 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:30:00.750298 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:00.750273 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" event={"ID":"29fd42c4-8bbe-4bea-b08f-b4e9dfffd607","Type":"ContainerDied","Data":"0361a7a2ebf7629838ff076eb6296ec3b718d3228ff23c37d5051c79885c6b73"} Apr 28 19:30:00.750400 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:00.750311 2571 scope.go:117] "RemoveContainer" containerID="962e30865e7b2e9fceac4077bf385bb785407086654b9e3bd5de9c2f29c4935c" Apr 28 19:30:00.750400 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:00.750286 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9" Apr 28 19:30:00.760156 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:00.758971 2571 scope.go:117] "RemoveContainer" containerID="ae68e05c9adae1a7373d799f5813e23e030c0d2cb673266a10f4dfcba5036fdd" Apr 28 19:30:00.767642 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:00.767529 2571 scope.go:117] "RemoveContainer" containerID="5f53f9a25cc99659e4524b94d594ed4b3c124efc697d4779cc113bcaa2e72c2d" Apr 28 19:30:00.769123 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:00.769086 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" podStartSLOduration=5.7690732 podStartE2EDuration="5.7690732s" podCreationTimestamp="2026-04-28 19:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:30:00.76704423 +0000 UTC m=+817.397945176" watchObservedRunningTime="2026-04-28 19:30:00.7690732 +0000 UTC m=+817.399974145" Apr 28 19:30:00.780862 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:00.780839 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9"] Apr 28 19:30:00.786340 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:00.786316 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-ltxg9"] Apr 28 19:30:01.753985 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:01.753955 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:30:01.755245 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:01.755221 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" podUID="7320bf13-d054-4139-ad23-33215e059411" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 28 19:30:01.946494 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:01.946443 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" path="/var/lib/kubelet/pods/29fd42c4-8bbe-4bea-b08f-b4e9dfffd607/volumes" Apr 28 19:30:02.757270 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:02.757221 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 28 19:30:07.761953 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:07.761912 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:30:07.762532 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:07.762470 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 28 19:30:17.763243 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:17.763199 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 28 19:30:27.763036 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:27.762992 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" 
podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 28 19:30:37.762981 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:37.762932 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 28 19:30:47.762433 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:47.762390 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 28 19:30:57.762983 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:30:57.762938 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 28 19:31:07.762798 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:07.762752 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 28 19:31:17.763697 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:17.763600 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:31:23.880570 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:23.880540 2571 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:31:23.883041 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:23.883016 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:31:25.315573 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.315540 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9"] Apr 28 19:31:25.316033 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.315985 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kserve-container" containerID="cri-o://e00b8eaa2a5e965dff4f5978305185eb48895c85773a4b6ecd4a64eb86caf51a" gracePeriod=30 Apr 28 19:31:25.316099 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.316033 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kube-rbac-proxy" containerID="cri-o://0d89200060f370e9a24b89239625af6141d021160867f0a23665192d2b5b6317" gracePeriod=30 Apr 28 19:31:25.533672 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.533639 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p"] Apr 28 19:31:25.533991 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.533975 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="storage-initializer" Apr 28 19:31:25.534071 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.533993 2571 
state_mem.go:107] "Deleted CPUSet assignment" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="storage-initializer" Apr 28 19:31:25.534071 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.534007 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kube-rbac-proxy" Apr 28 19:31:25.534071 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.534016 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kube-rbac-proxy" Apr 28 19:31:25.534071 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.534023 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kserve-container" Apr 28 19:31:25.534071 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.534028 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kserve-container" Apr 28 19:31:25.534223 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.534091 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kserve-container" Apr 28 19:31:25.534223 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.534104 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="29fd42c4-8bbe-4bea-b08f-b4e9dfffd607" containerName="kube-rbac-proxy" Apr 28 19:31:25.537134 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.537114 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:31:25.539057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.539023 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-predictor-serving-cert\"" Apr 28 19:31:25.539165 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.539148 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 28 19:31:25.547665 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.547637 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p"] Apr 28 19:31:25.626847 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.626765 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df8b6436-2aca-4af9-ad96-190b70d1f5a0-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p\" (UID: \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:31:25.626847 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.626821 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df8b6436-2aca-4af9-ad96-190b70d1f5a0-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p\" (UID: \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:31:25.627028 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.626884 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-f9nsv\" (UniqueName: \"kubernetes.io/projected/df8b6436-2aca-4af9-ad96-190b70d1f5a0-kube-api-access-f9nsv\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p\" (UID: \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:31:25.627028 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.626914 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df8b6436-2aca-4af9-ad96-190b70d1f5a0-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p\" (UID: \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:31:25.727499 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.727440 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df8b6436-2aca-4af9-ad96-190b70d1f5a0-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p\" (UID: \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:31:25.727653 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:31:25.727611 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-serving-cert: secret "isvc-lightgbm-v2-runtime-predictor-serving-cert" not found Apr 28 19:31:25.727653 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.727619 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9nsv\" (UniqueName: \"kubernetes.io/projected/df8b6436-2aca-4af9-ad96-190b70d1f5a0-kube-api-access-f9nsv\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p\" (UID: \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 
28 19:31:25.727731 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.727655 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df8b6436-2aca-4af9-ad96-190b70d1f5a0-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p\" (UID: \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:31:25.727731 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:31:25.727684 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df8b6436-2aca-4af9-ad96-190b70d1f5a0-proxy-tls podName:df8b6436-2aca-4af9-ad96-190b70d1f5a0 nodeName:}" failed. No retries permitted until 2026-04-28 19:31:26.227664443 +0000 UTC m=+902.858565367 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/df8b6436-2aca-4af9-ad96-190b70d1f5a0-proxy-tls") pod "isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" (UID: "df8b6436-2aca-4af9-ad96-190b70d1f5a0") : secret "isvc-lightgbm-v2-runtime-predictor-serving-cert" not found Apr 28 19:31:25.727731 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.727726 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df8b6436-2aca-4af9-ad96-190b70d1f5a0-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p\" (UID: \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:31:25.728060 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.728039 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/df8b6436-2aca-4af9-ad96-190b70d1f5a0-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p\" (UID: \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:31:25.728316 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.728299 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df8b6436-2aca-4af9-ad96-190b70d1f5a0-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p\" (UID: \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:31:25.735867 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:25.735838 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9nsv\" (UniqueName: \"kubernetes.io/projected/df8b6436-2aca-4af9-ad96-190b70d1f5a0-kube-api-access-f9nsv\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p\" (UID: \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:31:26.007581 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:26.007544 2571 generic.go:358] "Generic (PLEG): container finished" podID="7320bf13-d054-4139-ad23-33215e059411" containerID="0d89200060f370e9a24b89239625af6141d021160867f0a23665192d2b5b6317" exitCode=2 Apr 28 19:31:26.007747 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:26.007617 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" event={"ID":"7320bf13-d054-4139-ad23-33215e059411","Type":"ContainerDied","Data":"0d89200060f370e9a24b89239625af6141d021160867f0a23665192d2b5b6317"} Apr 28 19:31:26.231706 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:26.231671 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df8b6436-2aca-4af9-ad96-190b70d1f5a0-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p\" (UID: \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:31:26.234042 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:26.234022 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df8b6436-2aca-4af9-ad96-190b70d1f5a0-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p\" (UID: \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:31:26.447554 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:26.447519 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:31:26.568970 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:26.568946 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p"] Apr 28 19:31:26.571064 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:31:26.571037 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf8b6436_2aca_4af9_ad96_190b70d1f5a0.slice/crio-9b2b99fbed111f45e258b529348763163634de0f8898101e12eef2a44d5a799f WatchSource:0}: Error finding container 9b2b99fbed111f45e258b529348763163634de0f8898101e12eef2a44d5a799f: Status 404 returned error can't find the container with id 9b2b99fbed111f45e258b529348763163634de0f8898101e12eef2a44d5a799f Apr 28 19:31:27.012209 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:27.012170 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" event={"ID":"df8b6436-2aca-4af9-ad96-190b70d1f5a0","Type":"ContainerStarted","Data":"3e2a026244559719d3de65a88f7f7373e66873a32ba823b62d8cdf6da73d9d59"} Apr 28 19:31:27.012392 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:27.012215 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" event={"ID":"df8b6436-2aca-4af9-ad96-190b70d1f5a0","Type":"ContainerStarted","Data":"9b2b99fbed111f45e258b529348763163634de0f8898101e12eef2a44d5a799f"} Apr 28 19:31:27.758107 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:27.758064 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.30:8643/healthz\": dial tcp 10.134.0.30:8643: connect: connection refused" Apr 28 19:31:27.762449 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:27.762416 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 28 19:31:29.963189 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:29.963166 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:31:30.024383 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.024351 2571 generic.go:358] "Generic (PLEG): container finished" podID="7320bf13-d054-4139-ad23-33215e059411" containerID="e00b8eaa2a5e965dff4f5978305185eb48895c85773a4b6ecd4a64eb86caf51a" exitCode=0 Apr 28 19:31:30.024551 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.024416 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" event={"ID":"7320bf13-d054-4139-ad23-33215e059411","Type":"ContainerDied","Data":"e00b8eaa2a5e965dff4f5978305185eb48895c85773a4b6ecd4a64eb86caf51a"} Apr 28 19:31:30.024551 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.024431 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" Apr 28 19:31:30.024551 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.024449 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9" event={"ID":"7320bf13-d054-4139-ad23-33215e059411","Type":"ContainerDied","Data":"6f0e39f51963469a0d729ed11fcc65b182b66c4cdde68138ad9cebe3f9fdc98e"} Apr 28 19:31:30.024551 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.024468 2571 scope.go:117] "RemoveContainer" containerID="0d89200060f370e9a24b89239625af6141d021160867f0a23665192d2b5b6317" Apr 28 19:31:30.032056 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.032040 2571 scope.go:117] "RemoveContainer" containerID="e00b8eaa2a5e965dff4f5978305185eb48895c85773a4b6ecd4a64eb86caf51a" Apr 28 19:31:30.039012 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.038992 2571 scope.go:117] "RemoveContainer" containerID="6e72366576ed5be55326a31640ce8ab24a22f72f7fd748d68664b418028f0d3d" Apr 28 19:31:30.045907 ip-10-0-139-128 kubenswrapper[2571]: I0428 
19:31:30.045892 2571 scope.go:117] "RemoveContainer" containerID="0d89200060f370e9a24b89239625af6141d021160867f0a23665192d2b5b6317" Apr 28 19:31:30.046144 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:31:30.046125 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d89200060f370e9a24b89239625af6141d021160867f0a23665192d2b5b6317\": container with ID starting with 0d89200060f370e9a24b89239625af6141d021160867f0a23665192d2b5b6317 not found: ID does not exist" containerID="0d89200060f370e9a24b89239625af6141d021160867f0a23665192d2b5b6317" Apr 28 19:31:30.046190 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.046154 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d89200060f370e9a24b89239625af6141d021160867f0a23665192d2b5b6317"} err="failed to get container status \"0d89200060f370e9a24b89239625af6141d021160867f0a23665192d2b5b6317\": rpc error: code = NotFound desc = could not find container \"0d89200060f370e9a24b89239625af6141d021160867f0a23665192d2b5b6317\": container with ID starting with 0d89200060f370e9a24b89239625af6141d021160867f0a23665192d2b5b6317 not found: ID does not exist" Apr 28 19:31:30.046190 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.046176 2571 scope.go:117] "RemoveContainer" containerID="e00b8eaa2a5e965dff4f5978305185eb48895c85773a4b6ecd4a64eb86caf51a" Apr 28 19:31:30.046373 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:31:30.046356 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e00b8eaa2a5e965dff4f5978305185eb48895c85773a4b6ecd4a64eb86caf51a\": container with ID starting with e00b8eaa2a5e965dff4f5978305185eb48895c85773a4b6ecd4a64eb86caf51a not found: ID does not exist" containerID="e00b8eaa2a5e965dff4f5978305185eb48895c85773a4b6ecd4a64eb86caf51a" Apr 28 19:31:30.046420 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.046378 2571 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e00b8eaa2a5e965dff4f5978305185eb48895c85773a4b6ecd4a64eb86caf51a"} err="failed to get container status \"e00b8eaa2a5e965dff4f5978305185eb48895c85773a4b6ecd4a64eb86caf51a\": rpc error: code = NotFound desc = could not find container \"e00b8eaa2a5e965dff4f5978305185eb48895c85773a4b6ecd4a64eb86caf51a\": container with ID starting with e00b8eaa2a5e965dff4f5978305185eb48895c85773a4b6ecd4a64eb86caf51a not found: ID does not exist" Apr 28 19:31:30.046420 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.046393 2571 scope.go:117] "RemoveContainer" containerID="6e72366576ed5be55326a31640ce8ab24a22f72f7fd748d68664b418028f0d3d" Apr 28 19:31:30.046631 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:31:30.046614 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e72366576ed5be55326a31640ce8ab24a22f72f7fd748d68664b418028f0d3d\": container with ID starting with 6e72366576ed5be55326a31640ce8ab24a22f72f7fd748d68664b418028f0d3d not found: ID does not exist" containerID="6e72366576ed5be55326a31640ce8ab24a22f72f7fd748d68664b418028f0d3d" Apr 28 19:31:30.046677 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.046635 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e72366576ed5be55326a31640ce8ab24a22f72f7fd748d68664b418028f0d3d"} err="failed to get container status \"6e72366576ed5be55326a31640ce8ab24a22f72f7fd748d68664b418028f0d3d\": rpc error: code = NotFound desc = could not find container \"6e72366576ed5be55326a31640ce8ab24a22f72f7fd748d68664b418028f0d3d\": container with ID starting with 6e72366576ed5be55326a31640ce8ab24a22f72f7fd748d68664b418028f0d3d not found: ID does not exist" Apr 28 19:31:30.062462 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.062443 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-kzhnm\" (UniqueName: \"kubernetes.io/projected/7320bf13-d054-4139-ad23-33215e059411-kube-api-access-kzhnm\") pod \"7320bf13-d054-4139-ad23-33215e059411\" (UID: \"7320bf13-d054-4139-ad23-33215e059411\") " Apr 28 19:31:30.062561 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.062472 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7320bf13-d054-4139-ad23-33215e059411-proxy-tls\") pod \"7320bf13-d054-4139-ad23-33215e059411\" (UID: \"7320bf13-d054-4139-ad23-33215e059411\") " Apr 28 19:31:30.062561 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.062541 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7320bf13-d054-4139-ad23-33215e059411-kserve-provision-location\") pod \"7320bf13-d054-4139-ad23-33215e059411\" (UID: \"7320bf13-d054-4139-ad23-33215e059411\") " Apr 28 19:31:30.062629 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.062598 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7320bf13-d054-4139-ad23-33215e059411-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"7320bf13-d054-4139-ad23-33215e059411\" (UID: \"7320bf13-d054-4139-ad23-33215e059411\") " Apr 28 19:31:30.062999 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.062973 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7320bf13-d054-4139-ad23-33215e059411-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config") pod "7320bf13-d054-4139-ad23-33215e059411" (UID: "7320bf13-d054-4139-ad23-33215e059411"). InnerVolumeSpecName "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:31:30.063088 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.062973 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7320bf13-d054-4139-ad23-33215e059411-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7320bf13-d054-4139-ad23-33215e059411" (UID: "7320bf13-d054-4139-ad23-33215e059411"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:31:30.064411 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.064376 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7320bf13-d054-4139-ad23-33215e059411-kube-api-access-kzhnm" (OuterVolumeSpecName: "kube-api-access-kzhnm") pod "7320bf13-d054-4139-ad23-33215e059411" (UID: "7320bf13-d054-4139-ad23-33215e059411"). InnerVolumeSpecName "kube-api-access-kzhnm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:31:30.064411 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.064400 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7320bf13-d054-4139-ad23-33215e059411-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7320bf13-d054-4139-ad23-33215e059411" (UID: "7320bf13-d054-4139-ad23-33215e059411"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:31:30.163659 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.163630 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7320bf13-d054-4139-ad23-33215e059411-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:31:30.163659 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.163659 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7320bf13-d054-4139-ad23-33215e059411-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:31:30.163839 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.163673 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kzhnm\" (UniqueName: \"kubernetes.io/projected/7320bf13-d054-4139-ad23-33215e059411-kube-api-access-kzhnm\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:31:30.163839 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.163685 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7320bf13-d054-4139-ad23-33215e059411-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:31:30.346942 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.346915 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9"] Apr 28 19:31:30.350318 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:30.350293 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-67zf9"] Apr 28 19:31:31.029284 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:31.029254 2571 generic.go:358] "Generic (PLEG): container finished" 
podID="df8b6436-2aca-4af9-ad96-190b70d1f5a0" containerID="3e2a026244559719d3de65a88f7f7373e66873a32ba823b62d8cdf6da73d9d59" exitCode=0 Apr 28 19:31:31.029679 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:31.029336 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" event={"ID":"df8b6436-2aca-4af9-ad96-190b70d1f5a0","Type":"ContainerDied","Data":"3e2a026244559719d3de65a88f7f7373e66873a32ba823b62d8cdf6da73d9d59"} Apr 28 19:31:31.950496 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:31:31.950020 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7320bf13-d054-4139-ad23-33215e059411" path="/var/lib/kubelet/pods/7320bf13-d054-4139-ad23-33215e059411/volumes" Apr 28 19:33:55.820137 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:33:55.820116 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:33:56.513240 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:33:56.513198 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" event={"ID":"df8b6436-2aca-4af9-ad96-190b70d1f5a0","Type":"ContainerStarted","Data":"2f01516453e4ecdab977ab8bd59178d6c6e941c2e80a4c924dfa4093bda6c636"} Apr 28 19:33:56.513433 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:33:56.513250 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" event={"ID":"df8b6436-2aca-4af9-ad96-190b70d1f5a0","Type":"ContainerStarted","Data":"7dcc4b9348a899fe894f327ebe80e1d955fc3ea4206604fc157904bc507e1cbd"} Apr 28 19:33:56.513433 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:33:56.513313 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:33:56.540534 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:33:56.540457 
2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" podStartSLOduration=6.88296417 podStartE2EDuration="2m31.540441652s" podCreationTimestamp="2026-04-28 19:31:25 +0000 UTC" firstStartedPulling="2026-04-28 19:31:31.030501713 +0000 UTC m=+907.661402636" lastFinishedPulling="2026-04-28 19:33:55.687979184 +0000 UTC m=+1052.318880118" observedRunningTime="2026-04-28 19:33:56.53947151 +0000 UTC m=+1053.170372467" watchObservedRunningTime="2026-04-28 19:33:56.540441652 +0000 UTC m=+1053.171342596" Apr 28 19:33:57.516646 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:33:57.516614 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:34:03.525279 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:03.525250 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:34:33.529326 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:33.529246 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:34:39.519100 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.519067 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p"] Apr 28 19:34:39.519584 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.519398 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" podUID="df8b6436-2aca-4af9-ad96-190b70d1f5a0" containerName="kserve-container" containerID="cri-o://7dcc4b9348a899fe894f327ebe80e1d955fc3ea4206604fc157904bc507e1cbd" gracePeriod=30 Apr 28 19:34:39.519584 ip-10-0-139-128 kubenswrapper[2571]: I0428 
19:34:39.519449 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" podUID="df8b6436-2aca-4af9-ad96-190b70d1f5a0" containerName="kube-rbac-proxy" containerID="cri-o://2f01516453e4ecdab977ab8bd59178d6c6e941c2e80a4c924dfa4093bda6c636" gracePeriod=30 Apr 28 19:34:39.648096 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.648063 2571 generic.go:358] "Generic (PLEG): container finished" podID="df8b6436-2aca-4af9-ad96-190b70d1f5a0" containerID="2f01516453e4ecdab977ab8bd59178d6c6e941c2e80a4c924dfa4093bda6c636" exitCode=2 Apr 28 19:34:39.648251 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.648111 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" event={"ID":"df8b6436-2aca-4af9-ad96-190b70d1f5a0","Type":"ContainerDied","Data":"2f01516453e4ecdab977ab8bd59178d6c6e941c2e80a4c924dfa4093bda6c636"} Apr 28 19:34:39.735335 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.735300 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr"] Apr 28 19:34:39.735718 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.735701 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kserve-container" Apr 28 19:34:39.735766 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.735720 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kserve-container" Apr 28 19:34:39.735766 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.735733 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="storage-initializer" Apr 28 19:34:39.735766 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.735739 2571 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="storage-initializer" Apr 28 19:34:39.735766 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.735748 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kube-rbac-proxy" Apr 28 19:34:39.735766 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.735754 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kube-rbac-proxy" Apr 28 19:34:39.735913 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.735800 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kube-rbac-proxy" Apr 28 19:34:39.735913 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.735809 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="7320bf13-d054-4139-ad23-33215e059411" containerName="kserve-container" Apr 28 19:34:39.738046 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.738027 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:34:39.740175 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.740150 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 28 19:34:39.740352 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.740334 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-predictor-serving-cert\"" Apr 28 19:34:39.750214 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.750191 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr"] Apr 28 19:34:39.824534 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.824426 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/867d0351-7545-4dda-8967-0da277dbe738-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr\" (UID: \"867d0351-7545-4dda-8967-0da277dbe738\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:34:39.824534 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.824475 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/867d0351-7545-4dda-8967-0da277dbe738-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr\" (UID: \"867d0351-7545-4dda-8967-0da277dbe738\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:34:39.824733 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.824572 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/867d0351-7545-4dda-8967-0da277dbe738-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr\" (UID: \"867d0351-7545-4dda-8967-0da277dbe738\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:34:39.824733 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.824621 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxmvj\" (UniqueName: \"kubernetes.io/projected/867d0351-7545-4dda-8967-0da277dbe738-kube-api-access-mxmvj\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr\" (UID: \"867d0351-7545-4dda-8967-0da277dbe738\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:34:39.925546 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.925510 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/867d0351-7545-4dda-8967-0da277dbe738-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr\" (UID: \"867d0351-7545-4dda-8967-0da277dbe738\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:34:39.925744 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.925562 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/867d0351-7545-4dda-8967-0da277dbe738-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr\" (UID: \"867d0351-7545-4dda-8967-0da277dbe738\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:34:39.925744 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.925624 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/867d0351-7545-4dda-8967-0da277dbe738-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr\" (UID: \"867d0351-7545-4dda-8967-0da277dbe738\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:34:39.925744 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.925656 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxmvj\" (UniqueName: \"kubernetes.io/projected/867d0351-7545-4dda-8967-0da277dbe738-kube-api-access-mxmvj\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr\" (UID: \"867d0351-7545-4dda-8967-0da277dbe738\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:34:39.926051 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.926026 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/867d0351-7545-4dda-8967-0da277dbe738-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr\" (UID: \"867d0351-7545-4dda-8967-0da277dbe738\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:34:39.926262 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.926239 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/867d0351-7545-4dda-8967-0da277dbe738-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr\" (UID: \"867d0351-7545-4dda-8967-0da277dbe738\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:34:39.928015 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.927997 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/867d0351-7545-4dda-8967-0da277dbe738-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr\" (UID: \"867d0351-7545-4dda-8967-0da277dbe738\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:34:39.932805 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:39.932784 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxmvj\" (UniqueName: \"kubernetes.io/projected/867d0351-7545-4dda-8967-0da277dbe738-kube-api-access-mxmvj\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr\" (UID: \"867d0351-7545-4dda-8967-0da277dbe738\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:34:40.048960 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.048926 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:34:40.176443 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.176413 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr"] Apr 28 19:34:40.179234 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:34:40.179200 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod867d0351_7545_4dda_8967_0da277dbe738.slice/crio-a85c4572e67ee2be1054d93acd01a14dc2ce813f9ad4dcc0430b077d09770ebb WatchSource:0}: Error finding container a85c4572e67ee2be1054d93acd01a14dc2ce813f9ad4dcc0430b077d09770ebb: Status 404 returned error can't find the container with id a85c4572e67ee2be1054d93acd01a14dc2ce813f9ad4dcc0430b077d09770ebb Apr 28 19:34:40.652982 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.652945 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" 
event={"ID":"867d0351-7545-4dda-8967-0da277dbe738","Type":"ContainerStarted","Data":"205138a149217a7c91786ecadbe8bc1ee860e0e9094aa5ad8755b30050ab3e14"} Apr 28 19:34:40.653415 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.652991 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" event={"ID":"867d0351-7545-4dda-8967-0da277dbe738","Type":"ContainerStarted","Data":"a85c4572e67ee2be1054d93acd01a14dc2ce813f9ad4dcc0430b077d09770ebb"} Apr 28 19:34:40.654878 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.654841 2571 generic.go:358] "Generic (PLEG): container finished" podID="df8b6436-2aca-4af9-ad96-190b70d1f5a0" containerID="7dcc4b9348a899fe894f327ebe80e1d955fc3ea4206604fc157904bc507e1cbd" exitCode=0 Apr 28 19:34:40.654987 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.654899 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" event={"ID":"df8b6436-2aca-4af9-ad96-190b70d1f5a0","Type":"ContainerDied","Data":"7dcc4b9348a899fe894f327ebe80e1d955fc3ea4206604fc157904bc507e1cbd"} Apr 28 19:34:40.654987 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.654930 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" event={"ID":"df8b6436-2aca-4af9-ad96-190b70d1f5a0","Type":"ContainerDied","Data":"9b2b99fbed111f45e258b529348763163634de0f8898101e12eef2a44d5a799f"} Apr 28 19:34:40.654987 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.654940 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b2b99fbed111f45e258b529348763163634de0f8898101e12eef2a44d5a799f" Apr 28 19:34:40.665391 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.665370 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:34:40.733798 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.733710 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9nsv\" (UniqueName: \"kubernetes.io/projected/df8b6436-2aca-4af9-ad96-190b70d1f5a0-kube-api-access-f9nsv\") pod \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\" (UID: \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\") " Apr 28 19:34:40.733798 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.733765 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df8b6436-2aca-4af9-ad96-190b70d1f5a0-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\" (UID: \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\") " Apr 28 19:34:40.734040 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.733813 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df8b6436-2aca-4af9-ad96-190b70d1f5a0-kserve-provision-location\") pod \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\" (UID: \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\") " Apr 28 19:34:40.734040 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.733841 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df8b6436-2aca-4af9-ad96-190b70d1f5a0-proxy-tls\") pod \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\" (UID: \"df8b6436-2aca-4af9-ad96-190b70d1f5a0\") " Apr 28 19:34:40.734221 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.734170 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df8b6436-2aca-4af9-ad96-190b70d1f5a0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"df8b6436-2aca-4af9-ad96-190b70d1f5a0" (UID: "df8b6436-2aca-4af9-ad96-190b70d1f5a0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:34:40.734356 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.734215 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df8b6436-2aca-4af9-ad96-190b70d1f5a0-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config") pod "df8b6436-2aca-4af9-ad96-190b70d1f5a0" (UID: "df8b6436-2aca-4af9-ad96-190b70d1f5a0"). InnerVolumeSpecName "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:34:40.736022 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.735999 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df8b6436-2aca-4af9-ad96-190b70d1f5a0-kube-api-access-f9nsv" (OuterVolumeSpecName: "kube-api-access-f9nsv") pod "df8b6436-2aca-4af9-ad96-190b70d1f5a0" (UID: "df8b6436-2aca-4af9-ad96-190b70d1f5a0"). InnerVolumeSpecName "kube-api-access-f9nsv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:34:40.736322 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.736303 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df8b6436-2aca-4af9-ad96-190b70d1f5a0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "df8b6436-2aca-4af9-ad96-190b70d1f5a0" (UID: "df8b6436-2aca-4af9-ad96-190b70d1f5a0"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:34:40.834822 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.834786 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f9nsv\" (UniqueName: \"kubernetes.io/projected/df8b6436-2aca-4af9-ad96-190b70d1f5a0-kube-api-access-f9nsv\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:34:40.834822 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.834817 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df8b6436-2aca-4af9-ad96-190b70d1f5a0-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:34:40.834822 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.834830 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df8b6436-2aca-4af9-ad96-190b70d1f5a0-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:34:40.835037 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:40.834840 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df8b6436-2aca-4af9-ad96-190b70d1f5a0-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:34:41.657742 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:41.657714 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p" Apr 28 19:34:41.677112 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:41.677083 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p"] Apr 28 19:34:41.679609 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:41.679587 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-z2x9p"] Apr 28 19:34:41.946823 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:41.946743 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df8b6436-2aca-4af9-ad96-190b70d1f5a0" path="/var/lib/kubelet/pods/df8b6436-2aca-4af9-ad96-190b70d1f5a0/volumes" Apr 28 19:34:44.667896 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:44.667863 2571 generic.go:358] "Generic (PLEG): container finished" podID="867d0351-7545-4dda-8967-0da277dbe738" containerID="205138a149217a7c91786ecadbe8bc1ee860e0e9094aa5ad8755b30050ab3e14" exitCode=0 Apr 28 19:34:44.668276 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:44.667907 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" event={"ID":"867d0351-7545-4dda-8967-0da277dbe738","Type":"ContainerDied","Data":"205138a149217a7c91786ecadbe8bc1ee860e0e9094aa5ad8755b30050ab3e14"} Apr 28 19:34:45.673189 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:45.673148 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" event={"ID":"867d0351-7545-4dda-8967-0da277dbe738","Type":"ContainerStarted","Data":"6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b"} Apr 28 19:34:45.673189 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:45.673184 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" event={"ID":"867d0351-7545-4dda-8967-0da277dbe738","Type":"ContainerStarted","Data":"cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5"} Apr 28 19:34:45.673662 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:45.673465 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:34:45.673662 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:45.673534 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:34:45.675130 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:45.675104 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" podUID="867d0351-7545-4dda-8967-0da277dbe738" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 28 19:34:45.700743 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:45.700691 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" podStartSLOduration=6.700675227 podStartE2EDuration="6.700675227s" podCreationTimestamp="2026-04-28 19:34:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:34:45.697968535 +0000 UTC m=+1102.328869480" watchObservedRunningTime="2026-04-28 19:34:45.700675227 +0000 UTC m=+1102.331576173" Apr 28 19:34:46.676496 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:46.676438 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" podUID="867d0351-7545-4dda-8967-0da277dbe738" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 28 19:34:51.681319 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:51.681289 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:34:51.681806 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:34:51.681773 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" podUID="867d0351-7545-4dda-8967-0da277dbe738" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 28 19:35:01.682459 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:01.682427 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:35:09.726170 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.726135 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr"] Apr 28 19:35:09.726673 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.726566 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" podUID="867d0351-7545-4dda-8967-0da277dbe738" containerName="kserve-container" containerID="cri-o://cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5" gracePeriod=30 Apr 28 19:35:09.726673 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.726603 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" podUID="867d0351-7545-4dda-8967-0da277dbe738" containerName="kube-rbac-proxy" containerID="cri-o://6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b" gracePeriod=30 Apr 28 19:35:09.932500 
ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.932446 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx"] Apr 28 19:35:09.932808 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.932796 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df8b6436-2aca-4af9-ad96-190b70d1f5a0" containerName="kserve-container" Apr 28 19:35:09.932851 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.932810 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="df8b6436-2aca-4af9-ad96-190b70d1f5a0" containerName="kserve-container" Apr 28 19:35:09.932851 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.932821 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df8b6436-2aca-4af9-ad96-190b70d1f5a0" containerName="kube-rbac-proxy" Apr 28 19:35:09.932851 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.932827 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="df8b6436-2aca-4af9-ad96-190b70d1f5a0" containerName="kube-rbac-proxy" Apr 28 19:35:09.932851 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.932837 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df8b6436-2aca-4af9-ad96-190b70d1f5a0" containerName="storage-initializer" Apr 28 19:35:09.932851 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.932843 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="df8b6436-2aca-4af9-ad96-190b70d1f5a0" containerName="storage-initializer" Apr 28 19:35:09.933002 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.932891 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="df8b6436-2aca-4af9-ad96-190b70d1f5a0" containerName="kube-rbac-proxy" Apr 28 19:35:09.933002 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.932901 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="df8b6436-2aca-4af9-ad96-190b70d1f5a0" containerName="kserve-container" Apr 28 
19:35:09.935293 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.935274 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:09.937170 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.937140 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 28 19:35:09.937298 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.937247 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-predictor-serving-cert\"" Apr 28 19:35:09.947641 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.947616 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx"] Apr 28 19:35:09.974179 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.974143 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kc46\" (UniqueName: \"kubernetes.io/projected/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-kube-api-access-9kc46\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx\" (UID: \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:09.974338 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.974193 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx\" (UID: \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:09.974338 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.974252 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx\" (UID: \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:09.974338 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:09.974318 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx\" (UID: \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:10.075014 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.074913 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kc46\" (UniqueName: \"kubernetes.io/projected/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-kube-api-access-9kc46\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx\" (UID: \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:10.075014 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.074966 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx\" (UID: \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:10.075014 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.074998 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx\" (UID: \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:10.075263 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.075045 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx\" (UID: \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:10.075536 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.075510 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx\" (UID: \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:10.075816 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.075797 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx\" (UID: \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:10.077563 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.077544 2571 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx\" (UID: \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:10.082468 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.082443 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kc46\" (UniqueName: \"kubernetes.io/projected/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-kube-api-access-9kc46\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx\" (UID: \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:10.246894 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.246856 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:10.372955 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.372925 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx"] Apr 28 19:35:10.374901 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:35:10.374875 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95b9d646_5e0e_46f0_a06a_5a5b98c7bf98.slice/crio-9eb796dfdff2dc2a3bdeea57d238008051686e46af54b1e973f6ce6de61ca9ca WatchSource:0}: Error finding container 9eb796dfdff2dc2a3bdeea57d238008051686e46af54b1e973f6ce6de61ca9ca: Status 404 returned error can't find the container with id 9eb796dfdff2dc2a3bdeea57d238008051686e46af54b1e973f6ce6de61ca9ca Apr 28 19:35:10.462013 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.461987 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:35:10.479069 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.479047 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/867d0351-7545-4dda-8967-0da277dbe738-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"867d0351-7545-4dda-8967-0da277dbe738\" (UID: \"867d0351-7545-4dda-8967-0da277dbe738\") " Apr 28 19:35:10.479273 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.479104 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/867d0351-7545-4dda-8967-0da277dbe738-kserve-provision-location\") pod \"867d0351-7545-4dda-8967-0da277dbe738\" (UID: \"867d0351-7545-4dda-8967-0da277dbe738\") " Apr 28 19:35:10.479273 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.479126 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/867d0351-7545-4dda-8967-0da277dbe738-proxy-tls\") pod \"867d0351-7545-4dda-8967-0da277dbe738\" (UID: \"867d0351-7545-4dda-8967-0da277dbe738\") " Apr 28 19:35:10.479273 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.479165 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxmvj\" (UniqueName: \"kubernetes.io/projected/867d0351-7545-4dda-8967-0da277dbe738-kube-api-access-mxmvj\") pod \"867d0351-7545-4dda-8967-0da277dbe738\" (UID: \"867d0351-7545-4dda-8967-0da277dbe738\") " Apr 28 19:35:10.479505 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.479455 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/867d0351-7545-4dda-8967-0da277dbe738-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"867d0351-7545-4dda-8967-0da277dbe738" (UID: "867d0351-7545-4dda-8967-0da277dbe738"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:35:10.480766 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.480734 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/867d0351-7545-4dda-8967-0da277dbe738-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config") pod "867d0351-7545-4dda-8967-0da277dbe738" (UID: "867d0351-7545-4dda-8967-0da277dbe738"). InnerVolumeSpecName "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:35:10.481754 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.481731 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/867d0351-7545-4dda-8967-0da277dbe738-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "867d0351-7545-4dda-8967-0da277dbe738" (UID: "867d0351-7545-4dda-8967-0da277dbe738"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:35:10.482344 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.482320 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/867d0351-7545-4dda-8967-0da277dbe738-kube-api-access-mxmvj" (OuterVolumeSpecName: "kube-api-access-mxmvj") pod "867d0351-7545-4dda-8967-0da277dbe738" (UID: "867d0351-7545-4dda-8967-0da277dbe738"). InnerVolumeSpecName "kube-api-access-mxmvj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:35:10.579771 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.579736 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/867d0351-7545-4dda-8967-0da277dbe738-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:35:10.579771 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.579765 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/867d0351-7545-4dda-8967-0da277dbe738-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:35:10.579771 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.579774 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mxmvj\" (UniqueName: \"kubernetes.io/projected/867d0351-7545-4dda-8967-0da277dbe738-kube-api-access-mxmvj\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:35:10.579997 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.579785 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/867d0351-7545-4dda-8967-0da277dbe738-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:35:10.748400 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.748367 2571 generic.go:358] "Generic (PLEG): container finished" podID="867d0351-7545-4dda-8967-0da277dbe738" containerID="6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b" exitCode=2 Apr 28 19:35:10.748400 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.748393 2571 generic.go:358] "Generic (PLEG): container finished" podID="867d0351-7545-4dda-8967-0da277dbe738" containerID="cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5" exitCode=0 Apr 28 19:35:10.748889 ip-10-0-139-128 
kubenswrapper[2571]: I0428 19:35:10.748459 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" Apr 28 19:35:10.748889 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.748456 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" event={"ID":"867d0351-7545-4dda-8967-0da277dbe738","Type":"ContainerDied","Data":"6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b"} Apr 28 19:35:10.748889 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.748527 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" event={"ID":"867d0351-7545-4dda-8967-0da277dbe738","Type":"ContainerDied","Data":"cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5"} Apr 28 19:35:10.748889 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.748543 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr" event={"ID":"867d0351-7545-4dda-8967-0da277dbe738","Type":"ContainerDied","Data":"a85c4572e67ee2be1054d93acd01a14dc2ce813f9ad4dcc0430b077d09770ebb"} Apr 28 19:35:10.748889 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.748563 2571 scope.go:117] "RemoveContainer" containerID="6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b" Apr 28 19:35:10.750456 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.750433 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" event={"ID":"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98","Type":"ContainerStarted","Data":"efff80ed6d4bd93d346b46eed7341ee5a44d210f581f0d30287868cf95409b86"} Apr 28 19:35:10.750559 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.750467 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" event={"ID":"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98","Type":"ContainerStarted","Data":"9eb796dfdff2dc2a3bdeea57d238008051686e46af54b1e973f6ce6de61ca9ca"} Apr 28 19:35:10.758034 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.758004 2571 scope.go:117] "RemoveContainer" containerID="cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5" Apr 28 19:35:10.765703 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.765682 2571 scope.go:117] "RemoveContainer" containerID="205138a149217a7c91786ecadbe8bc1ee860e0e9094aa5ad8755b30050ab3e14" Apr 28 19:35:10.773422 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.773399 2571 scope.go:117] "RemoveContainer" containerID="6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b" Apr 28 19:35:10.773728 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:35:10.773709 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b\": container with ID starting with 6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b not found: ID does not exist" containerID="6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b" Apr 28 19:35:10.773787 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.773740 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b"} err="failed to get container status \"6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b\": rpc error: code = NotFound desc = could not find container \"6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b\": container with ID starting with 6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b not found: ID does not exist" Apr 28 19:35:10.773787 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.773759 
2571 scope.go:117] "RemoveContainer" containerID="cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5" Apr 28 19:35:10.774024 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:35:10.774006 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5\": container with ID starting with cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5 not found: ID does not exist" containerID="cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5" Apr 28 19:35:10.774079 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.774030 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5"} err="failed to get container status \"cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5\": rpc error: code = NotFound desc = could not find container \"cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5\": container with ID starting with cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5 not found: ID does not exist" Apr 28 19:35:10.774079 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.774045 2571 scope.go:117] "RemoveContainer" containerID="205138a149217a7c91786ecadbe8bc1ee860e0e9094aa5ad8755b30050ab3e14" Apr 28 19:35:10.774261 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:35:10.774246 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"205138a149217a7c91786ecadbe8bc1ee860e0e9094aa5ad8755b30050ab3e14\": container with ID starting with 205138a149217a7c91786ecadbe8bc1ee860e0e9094aa5ad8755b30050ab3e14 not found: ID does not exist" containerID="205138a149217a7c91786ecadbe8bc1ee860e0e9094aa5ad8755b30050ab3e14" Apr 28 19:35:10.774305 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.774265 2571 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"205138a149217a7c91786ecadbe8bc1ee860e0e9094aa5ad8755b30050ab3e14"} err="failed to get container status \"205138a149217a7c91786ecadbe8bc1ee860e0e9094aa5ad8755b30050ab3e14\": rpc error: code = NotFound desc = could not find container \"205138a149217a7c91786ecadbe8bc1ee860e0e9094aa5ad8755b30050ab3e14\": container with ID starting with 205138a149217a7c91786ecadbe8bc1ee860e0e9094aa5ad8755b30050ab3e14 not found: ID does not exist" Apr 28 19:35:10.774305 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.774286 2571 scope.go:117] "RemoveContainer" containerID="6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b" Apr 28 19:35:10.774589 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.774531 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b"} err="failed to get container status \"6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b\": rpc error: code = NotFound desc = could not find container \"6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b\": container with ID starting with 6b49319e6fb67c4f5c92a8c37de6c793121930470a4c8e2da5d808e80fbc309b not found: ID does not exist" Apr 28 19:35:10.774589 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.774557 2571 scope.go:117] "RemoveContainer" containerID="cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5" Apr 28 19:35:10.774814 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.774795 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5"} err="failed to get container status \"cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5\": rpc error: code = NotFound desc = could not find container 
\"cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5\": container with ID starting with cfda52aea6b663ebd9ff79a8eb7e319d54a60bdbc90531f868742b8f10bab1a5 not found: ID does not exist" Apr 28 19:35:10.774869 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.774814 2571 scope.go:117] "RemoveContainer" containerID="205138a149217a7c91786ecadbe8bc1ee860e0e9094aa5ad8755b30050ab3e14" Apr 28 19:35:10.775058 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.775040 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"205138a149217a7c91786ecadbe8bc1ee860e0e9094aa5ad8755b30050ab3e14"} err="failed to get container status \"205138a149217a7c91786ecadbe8bc1ee860e0e9094aa5ad8755b30050ab3e14\": rpc error: code = NotFound desc = could not find container \"205138a149217a7c91786ecadbe8bc1ee860e0e9094aa5ad8755b30050ab3e14\": container with ID starting with 205138a149217a7c91786ecadbe8bc1ee860e0e9094aa5ad8755b30050ab3e14 not found: ID does not exist" Apr 28 19:35:10.783898 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.783873 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr"] Apr 28 19:35:10.786241 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:10.786218 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-t4pcr"] Apr 28 19:35:11.947577 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:11.947538 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="867d0351-7545-4dda-8967-0da277dbe738" path="/var/lib/kubelet/pods/867d0351-7545-4dda-8967-0da277dbe738/volumes" Apr 28 19:35:14.765057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:14.765024 2571 generic.go:358] "Generic (PLEG): container finished" podID="95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" containerID="efff80ed6d4bd93d346b46eed7341ee5a44d210f581f0d30287868cf95409b86" exitCode=0 Apr 28 19:35:14.765433 
ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:14.765108 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" event={"ID":"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98","Type":"ContainerDied","Data":"efff80ed6d4bd93d346b46eed7341ee5a44d210f581f0d30287868cf95409b86"} Apr 28 19:35:15.770539 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:15.770502 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" event={"ID":"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98","Type":"ContainerStarted","Data":"fbe88e4b8cd46f6b7498075d28a9fa77562ad50aef19b0d1b4f1c54cbd46f3ed"} Apr 28 19:35:15.770539 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:15.770541 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" event={"ID":"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98","Type":"ContainerStarted","Data":"9455cc77e61e748db57d84d9448712914f52e543c7ef11a1d0b7698e08c1890d"} Apr 28 19:35:15.770967 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:15.770870 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:15.770967 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:15.770899 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:15.788152 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:15.788106 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" podStartSLOduration=6.788091904 podStartE2EDuration="6.788091904s" podCreationTimestamp="2026-04-28 19:35:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-28 19:35:15.786840483 +0000 UTC m=+1132.417741431" watchObservedRunningTime="2026-04-28 19:35:15.788091904 +0000 UTC m=+1132.418992849" Apr 28 19:35:21.779073 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:21.779040 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:51.783013 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:51.782979 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:35:59.872753 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:59.872719 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx"] Apr 28 19:35:59.873155 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:59.873050 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" podUID="95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" containerName="kserve-container" containerID="cri-o://9455cc77e61e748db57d84d9448712914f52e543c7ef11a1d0b7698e08c1890d" gracePeriod=30 Apr 28 19:35:59.873155 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:35:59.873096 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" podUID="95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" containerName="kube-rbac-proxy" containerID="cri-o://fbe88e4b8cd46f6b7498075d28a9fa77562ad50aef19b0d1b4f1c54cbd46f3ed" gracePeriod=30 Apr 28 19:36:00.087912 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.087877 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq"] Apr 28 19:36:00.088232 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.088218 2571 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="867d0351-7545-4dda-8967-0da277dbe738" containerName="kube-rbac-proxy" Apr 28 19:36:00.088280 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.088237 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="867d0351-7545-4dda-8967-0da277dbe738" containerName="kube-rbac-proxy" Apr 28 19:36:00.088280 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.088253 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="867d0351-7545-4dda-8967-0da277dbe738" containerName="storage-initializer" Apr 28 19:36:00.088280 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.088259 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="867d0351-7545-4dda-8967-0da277dbe738" containerName="storage-initializer" Apr 28 19:36:00.088280 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.088270 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="867d0351-7545-4dda-8967-0da277dbe738" containerName="kserve-container" Apr 28 19:36:00.088280 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.088276 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="867d0351-7545-4dda-8967-0da277dbe738" containerName="kserve-container" Apr 28 19:36:00.088441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.088327 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="867d0351-7545-4dda-8967-0da277dbe738" containerName="kube-rbac-proxy" Apr 28 19:36:00.088441 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.088338 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="867d0351-7545-4dda-8967-0da277dbe738" containerName="kserve-container" Apr 28 19:36:00.090644 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.090626 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:00.092853 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.092824 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\"" Apr 28 19:36:00.093011 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.092854 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-predictor-serving-cert\"" Apr 28 19:36:00.103067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.103041 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq"] Apr 28 19:36:00.175061 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.175013 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq\" (UID: \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:00.175264 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.175080 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4rtg\" (UniqueName: \"kubernetes.io/projected/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-kube-api-access-g4rtg\") pod \"isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq\" (UID: \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:00.175264 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.175139 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq\" (UID: \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:00.175264 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.175179 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq\" (UID: \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:00.276592 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.276555 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4rtg\" (UniqueName: \"kubernetes.io/projected/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-kube-api-access-g4rtg\") pod \"isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq\" (UID: \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:00.276592 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.276597 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq\" (UID: \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:00.276864 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.276628 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq\" (UID: 
\"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:00.276864 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.276665 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq\" (UID: \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:00.276864 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:36:00.276787 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-serving-cert: secret "isvc-sklearn-mcp-predictor-serving-cert" not found Apr 28 19:36:00.276864 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:36:00.276854 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-proxy-tls podName:696284ec-82a0-46fb-9ee6-93e4dbbb9fae nodeName:}" failed. No retries permitted until 2026-04-28 19:36:00.776832872 +0000 UTC m=+1177.407733815 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-proxy-tls") pod "isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" (UID: "696284ec-82a0-46fb-9ee6-93e4dbbb9fae") : secret "isvc-sklearn-mcp-predictor-serving-cert" not found Apr 28 19:36:00.277102 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.277078 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq\" (UID: \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:00.277375 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.277354 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq\" (UID: \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:00.286689 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.286658 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4rtg\" (UniqueName: \"kubernetes.io/projected/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-kube-api-access-g4rtg\") pod \"isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq\" (UID: \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:00.780642 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.780603 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-proxy-tls\") pod 
\"isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq\" (UID: \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:00.783204 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.783171 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq\" (UID: \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:00.922240 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.922197 2571 generic.go:358] "Generic (PLEG): container finished" podID="95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" containerID="fbe88e4b8cd46f6b7498075d28a9fa77562ad50aef19b0d1b4f1c54cbd46f3ed" exitCode=2 Apr 28 19:36:00.922647 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:00.922328 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" event={"ID":"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98","Type":"ContainerDied","Data":"fbe88e4b8cd46f6b7498075d28a9fa77562ad50aef19b0d1b4f1c54cbd46f3ed"} Apr 28 19:36:01.001562 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.001529 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:01.132696 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.132663 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq"] Apr 28 19:36:01.135044 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:36:01.135019 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod696284ec_82a0_46fb_9ee6_93e4dbbb9fae.slice/crio-556079ee7e0b9284515a155d61b320c3c0b209fef06420fb3bf5add87dca85b1 WatchSource:0}: Error finding container 556079ee7e0b9284515a155d61b320c3c0b209fef06420fb3bf5add87dca85b1: Status 404 returned error can't find the container with id 556079ee7e0b9284515a155d61b320c3c0b209fef06420fb3bf5add87dca85b1 Apr 28 19:36:01.135137 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.135111 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:36:01.285538 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.285504 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\" (UID: \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\") " Apr 28 19:36:01.285730 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.285548 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-proxy-tls\") pod \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\" (UID: \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\") " Apr 28 19:36:01.285730 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.285655 2571 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kc46\" (UniqueName: \"kubernetes.io/projected/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-kube-api-access-9kc46\") pod \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\" (UID: \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\") " Apr 28 19:36:01.285730 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.285715 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-kserve-provision-location\") pod \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\" (UID: \"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98\") " Apr 28 19:36:01.286019 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.285886 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config") pod "95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" (UID: "95b9d646-5e0e-46f0-a06a-5a5b98c7bf98"). InnerVolumeSpecName "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:36:01.286126 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.286051 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" (UID: "95b9d646-5e0e-46f0-a06a-5a5b98c7bf98"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:36:01.287694 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.287670 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" (UID: "95b9d646-5e0e-46f0-a06a-5a5b98c7bf98"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:36:01.287790 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.287705 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-kube-api-access-9kc46" (OuterVolumeSpecName: "kube-api-access-9kc46") pod "95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" (UID: "95b9d646-5e0e-46f0-a06a-5a5b98c7bf98"). InnerVolumeSpecName "kube-api-access-9kc46". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:36:01.387225 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.387147 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:36:01.387225 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.387174 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:36:01.387225 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.387186 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 
19:36:01.387225 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.387197 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9kc46\" (UniqueName: \"kubernetes.io/projected/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98-kube-api-access-9kc46\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:36:01.927049 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.927006 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" event={"ID":"696284ec-82a0-46fb-9ee6-93e4dbbb9fae","Type":"ContainerStarted","Data":"b9418bf555b2c1c11ae314ee60c20fc5be4ff6dc0232363329080795b82e63cd"} Apr 28 19:36:01.927049 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.927054 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" event={"ID":"696284ec-82a0-46fb-9ee6-93e4dbbb9fae","Type":"ContainerStarted","Data":"556079ee7e0b9284515a155d61b320c3c0b209fef06420fb3bf5add87dca85b1"} Apr 28 19:36:01.928691 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.928665 2571 generic.go:358] "Generic (PLEG): container finished" podID="95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" containerID="9455cc77e61e748db57d84d9448712914f52e543c7ef11a1d0b7698e08c1890d" exitCode=0 Apr 28 19:36:01.928800 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.928721 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" event={"ID":"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98","Type":"ContainerDied","Data":"9455cc77e61e748db57d84d9448712914f52e543c7ef11a1d0b7698e08c1890d"} Apr 28 19:36:01.928800 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.928751 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" 
event={"ID":"95b9d646-5e0e-46f0-a06a-5a5b98c7bf98","Type":"ContainerDied","Data":"9eb796dfdff2dc2a3bdeea57d238008051686e46af54b1e973f6ce6de61ca9ca"} Apr 28 19:36:01.928800 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.928772 2571 scope.go:117] "RemoveContainer" containerID="fbe88e4b8cd46f6b7498075d28a9fa77562ad50aef19b0d1b4f1c54cbd46f3ed" Apr 28 19:36:01.928800 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.928790 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx" Apr 28 19:36:01.937169 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.937152 2571 scope.go:117] "RemoveContainer" containerID="9455cc77e61e748db57d84d9448712914f52e543c7ef11a1d0b7698e08c1890d" Apr 28 19:36:01.944569 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.944550 2571 scope.go:117] "RemoveContainer" containerID="efff80ed6d4bd93d346b46eed7341ee5a44d210f581f0d30287868cf95409b86" Apr 28 19:36:01.952614 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.952598 2571 scope.go:117] "RemoveContainer" containerID="fbe88e4b8cd46f6b7498075d28a9fa77562ad50aef19b0d1b4f1c54cbd46f3ed" Apr 28 19:36:01.952892 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:36:01.952873 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbe88e4b8cd46f6b7498075d28a9fa77562ad50aef19b0d1b4f1c54cbd46f3ed\": container with ID starting with fbe88e4b8cd46f6b7498075d28a9fa77562ad50aef19b0d1b4f1c54cbd46f3ed not found: ID does not exist" containerID="fbe88e4b8cd46f6b7498075d28a9fa77562ad50aef19b0d1b4f1c54cbd46f3ed" Apr 28 19:36:01.952932 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.952901 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbe88e4b8cd46f6b7498075d28a9fa77562ad50aef19b0d1b4f1c54cbd46f3ed"} err="failed to get container status 
\"fbe88e4b8cd46f6b7498075d28a9fa77562ad50aef19b0d1b4f1c54cbd46f3ed\": rpc error: code = NotFound desc = could not find container \"fbe88e4b8cd46f6b7498075d28a9fa77562ad50aef19b0d1b4f1c54cbd46f3ed\": container with ID starting with fbe88e4b8cd46f6b7498075d28a9fa77562ad50aef19b0d1b4f1c54cbd46f3ed not found: ID does not exist" Apr 28 19:36:01.952932 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.952921 2571 scope.go:117] "RemoveContainer" containerID="9455cc77e61e748db57d84d9448712914f52e543c7ef11a1d0b7698e08c1890d" Apr 28 19:36:01.953173 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:36:01.953156 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9455cc77e61e748db57d84d9448712914f52e543c7ef11a1d0b7698e08c1890d\": container with ID starting with 9455cc77e61e748db57d84d9448712914f52e543c7ef11a1d0b7698e08c1890d not found: ID does not exist" containerID="9455cc77e61e748db57d84d9448712914f52e543c7ef11a1d0b7698e08c1890d" Apr 28 19:36:01.953217 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.953183 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9455cc77e61e748db57d84d9448712914f52e543c7ef11a1d0b7698e08c1890d"} err="failed to get container status \"9455cc77e61e748db57d84d9448712914f52e543c7ef11a1d0b7698e08c1890d\": rpc error: code = NotFound desc = could not find container \"9455cc77e61e748db57d84d9448712914f52e543c7ef11a1d0b7698e08c1890d\": container with ID starting with 9455cc77e61e748db57d84d9448712914f52e543c7ef11a1d0b7698e08c1890d not found: ID does not exist" Apr 28 19:36:01.953217 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.953200 2571 scope.go:117] "RemoveContainer" containerID="efff80ed6d4bd93d346b46eed7341ee5a44d210f581f0d30287868cf95409b86" Apr 28 19:36:01.953393 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:36:01.953372 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"efff80ed6d4bd93d346b46eed7341ee5a44d210f581f0d30287868cf95409b86\": container with ID starting with efff80ed6d4bd93d346b46eed7341ee5a44d210f581f0d30287868cf95409b86 not found: ID does not exist" containerID="efff80ed6d4bd93d346b46eed7341ee5a44d210f581f0d30287868cf95409b86" Apr 28 19:36:01.953450 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.953402 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efff80ed6d4bd93d346b46eed7341ee5a44d210f581f0d30287868cf95409b86"} err="failed to get container status \"efff80ed6d4bd93d346b46eed7341ee5a44d210f581f0d30287868cf95409b86\": rpc error: code = NotFound desc = could not find container \"efff80ed6d4bd93d346b46eed7341ee5a44d210f581f0d30287868cf95409b86\": container with ID starting with efff80ed6d4bd93d346b46eed7341ee5a44d210f581f0d30287868cf95409b86 not found: ID does not exist" Apr 28 19:36:01.958760 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.958738 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx"] Apr 28 19:36:01.963677 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:01.963657 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-g52mx"] Apr 28 19:36:03.946456 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:03.946424 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" path="/var/lib/kubelet/pods/95b9d646-5e0e-46f0-a06a-5a5b98c7bf98/volumes" Apr 28 19:36:04.938939 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:04.938901 2571 generic.go:358] "Generic (PLEG): container finished" podID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerID="b9418bf555b2c1c11ae314ee60c20fc5be4ff6dc0232363329080795b82e63cd" exitCode=0 Apr 28 19:36:04.939146 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:04.938963 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" event={"ID":"696284ec-82a0-46fb-9ee6-93e4dbbb9fae","Type":"ContainerDied","Data":"b9418bf555b2c1c11ae314ee60c20fc5be4ff6dc0232363329080795b82e63cd"} Apr 28 19:36:05.948044 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:05.947998 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" event={"ID":"696284ec-82a0-46fb-9ee6-93e4dbbb9fae","Type":"ContainerStarted","Data":"3a1487ff847434e94fd077b45eb7b66fbffeb943abe3a72f661828cef0a0e3aa"} Apr 28 19:36:07.953084 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:07.953053 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" event={"ID":"696284ec-82a0-46fb-9ee6-93e4dbbb9fae","Type":"ContainerStarted","Data":"b8c8283a36814baa353a7a7bf86633f4fbc992c398d6c4d51e66f82d4ce81945"} Apr 28 19:36:08.958004 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:08.957968 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" event={"ID":"696284ec-82a0-46fb-9ee6-93e4dbbb9fae","Type":"ContainerStarted","Data":"38d367ccec56d62a4025195d48b5702377cceadf105947198527f37ec26bda63"} Apr 28 19:36:08.958413 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:08.958145 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:08.958413 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:08.958266 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:08.980528 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:08.980461 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" 
podStartSLOduration=6.087128035 podStartE2EDuration="8.980443886s" podCreationTimestamp="2026-04-28 19:36:00 +0000 UTC" firstStartedPulling="2026-04-28 19:36:04.992922627 +0000 UTC m=+1181.623823550" lastFinishedPulling="2026-04-28 19:36:07.886238463 +0000 UTC m=+1184.517139401" observedRunningTime="2026-04-28 19:36:08.978709699 +0000 UTC m=+1185.609610643" watchObservedRunningTime="2026-04-28 19:36:08.980443886 +0000 UTC m=+1185.611344834" Apr 28 19:36:09.961043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:09.961005 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:15.970026 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:15.969991 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:36:23.907406 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:23.907369 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:36:23.909313 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:23.909289 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:36:35.971599 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:35.971558 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.34:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.34:8080: connect: connection refused" Apr 28 19:36:45.972175 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:36:45.972144 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:37:15.973256 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:15.973180 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:37:20.070292 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.070251 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq"] Apr 28 19:37:20.072766 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.070735 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kserve-container" containerID="cri-o://3a1487ff847434e94fd077b45eb7b66fbffeb943abe3a72f661828cef0a0e3aa" gracePeriod=30 Apr 28 19:37:20.072766 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.070773 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kserve-agent" containerID="cri-o://b8c8283a36814baa353a7a7bf86633f4fbc992c398d6c4d51e66f82d4ce81945" gracePeriod=30 Apr 28 19:37:20.072766 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.070737 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kube-rbac-proxy" containerID="cri-o://38d367ccec56d62a4025195d48b5702377cceadf105947198527f37ec26bda63" gracePeriod=30 Apr 28 19:37:20.284736 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.284703 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q"] Apr 28 19:37:20.285046 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.285034 
2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" containerName="storage-initializer" Apr 28 19:37:20.285089 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.285049 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" containerName="storage-initializer" Apr 28 19:37:20.285089 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.285059 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" containerName="kube-rbac-proxy" Apr 28 19:37:20.285089 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.285068 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" containerName="kube-rbac-proxy" Apr 28 19:37:20.285089 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.285088 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" containerName="kserve-container" Apr 28 19:37:20.285212 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.285094 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" containerName="kserve-container" Apr 28 19:37:20.285212 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.285152 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" containerName="kube-rbac-proxy" Apr 28 19:37:20.285212 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.285162 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="95b9d646-5e0e-46f0-a06a-5a5b98c7bf98" containerName="kserve-container" Apr 28 19:37:20.288342 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.288324 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:37:20.290638 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.290613 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-kube-rbac-proxy-sar-config\"" Apr 28 19:37:20.290872 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.290849 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-predictor-serving-cert\"" Apr 28 19:37:20.298347 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.298320 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q"] Apr 28 19:37:20.326873 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.326814 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb4159da-753d-44d9-9882-f66db722c96b-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ts72q\" (UID: \"fb4159da-753d-44d9-9882-f66db722c96b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:37:20.326873 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.326852 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb4159da-753d-44d9-9882-f66db722c96b-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ts72q\" (UID: \"fb4159da-753d-44d9-9882-f66db722c96b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:37:20.327012 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.326913 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npqvr\" (UniqueName: 
\"kubernetes.io/projected/fb4159da-753d-44d9-9882-f66db722c96b-kube-api-access-npqvr\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ts72q\" (UID: \"fb4159da-753d-44d9-9882-f66db722c96b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:37:20.327012 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.326945 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb4159da-753d-44d9-9882-f66db722c96b-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ts72q\" (UID: \"fb4159da-753d-44d9-9882-f66db722c96b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:37:20.427265 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.427227 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb4159da-753d-44d9-9882-f66db722c96b-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ts72q\" (UID: \"fb4159da-753d-44d9-9882-f66db722c96b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:37:20.427265 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.427271 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb4159da-753d-44d9-9882-f66db722c96b-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ts72q\" (UID: \"fb4159da-753d-44d9-9882-f66db722c96b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:37:20.427536 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.427313 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npqvr\" (UniqueName: \"kubernetes.io/projected/fb4159da-753d-44d9-9882-f66db722c96b-kube-api-access-npqvr\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ts72q\" (UID: 
\"fb4159da-753d-44d9-9882-f66db722c96b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:37:20.427536 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.427367 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb4159da-753d-44d9-9882-f66db722c96b-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ts72q\" (UID: \"fb4159da-753d-44d9-9882-f66db722c96b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:37:20.427667 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.427649 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb4159da-753d-44d9-9882-f66db722c96b-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ts72q\" (UID: \"fb4159da-753d-44d9-9882-f66db722c96b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:37:20.428079 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.428060 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb4159da-753d-44d9-9882-f66db722c96b-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ts72q\" (UID: \"fb4159da-753d-44d9-9882-f66db722c96b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:37:20.429764 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.429747 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb4159da-753d-44d9-9882-f66db722c96b-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ts72q\" (UID: \"fb4159da-753d-44d9-9882-f66db722c96b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:37:20.435453 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.435427 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-npqvr\" (UniqueName: \"kubernetes.io/projected/fb4159da-753d-44d9-9882-f66db722c96b-kube-api-access-npqvr\") pod \"isvc-paddle-predictor-6b8b7cfb4b-ts72q\" (UID: \"fb4159da-753d-44d9-9882-f66db722c96b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:37:20.600295 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.600201 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:37:20.725116 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.725089 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q"] Apr 28 19:37:20.729619 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:37:20.729585 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb4159da_753d_44d9_9882_f66db722c96b.slice/crio-9b527ed3c8ce827629b042503f23d2887d1771ea9620d0b64d799bf02a385441 WatchSource:0}: Error finding container 9b527ed3c8ce827629b042503f23d2887d1771ea9620d0b64d799bf02a385441: Status 404 returned error can't find the container with id 9b527ed3c8ce827629b042503f23d2887d1771ea9620d0b64d799bf02a385441 Apr 28 19:37:20.965325 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:20.965291 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused" Apr 28 19:37:21.182330 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:21.182295 2571 generic.go:358] "Generic (PLEG): container finished" podID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerID="38d367ccec56d62a4025195d48b5702377cceadf105947198527f37ec26bda63" exitCode=2 
Apr 28 19:37:21.182754 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:21.182364 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" event={"ID":"696284ec-82a0-46fb-9ee6-93e4dbbb9fae","Type":"ContainerDied","Data":"38d367ccec56d62a4025195d48b5702377cceadf105947198527f37ec26bda63"} Apr 28 19:37:21.183666 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:21.183641 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" event={"ID":"fb4159da-753d-44d9-9882-f66db722c96b","Type":"ContainerStarted","Data":"2b6f3c90117f8ea00ec383dd59a78db028daf7fd6f7f74961b6f03d92d7c0f4b"} Apr 28 19:37:21.183790 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:21.183674 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" event={"ID":"fb4159da-753d-44d9-9882-f66db722c96b","Type":"ContainerStarted","Data":"9b527ed3c8ce827629b042503f23d2887d1771ea9620d0b64d799bf02a385441"} Apr 28 19:37:22.188992 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:22.188965 2571 generic.go:358] "Generic (PLEG): container finished" podID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerID="3a1487ff847434e94fd077b45eb7b66fbffeb943abe3a72f661828cef0a0e3aa" exitCode=0 Apr 28 19:37:22.189284 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:22.189030 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" event={"ID":"696284ec-82a0-46fb-9ee6-93e4dbbb9fae","Type":"ContainerDied","Data":"3a1487ff847434e94fd077b45eb7b66fbffeb943abe3a72f661828cef0a0e3aa"} Apr 28 19:37:25.199992 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:25.199958 2571 generic.go:358] "Generic (PLEG): container finished" podID="fb4159da-753d-44d9-9882-f66db722c96b" containerID="2b6f3c90117f8ea00ec383dd59a78db028daf7fd6f7f74961b6f03d92d7c0f4b" exitCode=0 Apr 28 19:37:25.200452 
ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:25.199998 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" event={"ID":"fb4159da-753d-44d9-9882-f66db722c96b","Type":"ContainerDied","Data":"2b6f3c90117f8ea00ec383dd59a78db028daf7fd6f7f74961b6f03d92d7c0f4b"} Apr 28 19:37:25.964832 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:25.964762 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused" Apr 28 19:37:25.970668 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:25.970631 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.34:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.34:8080: connect: connection refused" Apr 28 19:37:30.965256 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:30.965193 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused" Apr 28 19:37:30.965661 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:30.965401 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:37:35.965541 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:35.965501 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused" Apr 28 19:37:35.971003 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:35.970969 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.34:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.34:8080: connect: connection refused" Apr 28 19:37:36.243179 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:36.243095 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" event={"ID":"fb4159da-753d-44d9-9882-f66db722c96b","Type":"ContainerStarted","Data":"85af96a558a7107601627df8c06ee00fdb6c36bc651bcb82faea00a0826a1971"} Apr 28 19:37:36.243179 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:36.243136 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" event={"ID":"fb4159da-753d-44d9-9882-f66db722c96b","Type":"ContainerStarted","Data":"916cd425567eb1ab59ddfdc8469adde4004b0cbc70248e47a56261e9fc61a45a"} Apr 28 19:37:36.243416 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:36.243399 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:37:36.261944 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:36.261886 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" podStartSLOduration=5.419282867 podStartE2EDuration="16.261868617s" podCreationTimestamp="2026-04-28 19:37:20 +0000 UTC" 
firstStartedPulling="2026-04-28 19:37:25.201263786 +0000 UTC m=+1261.832164709" lastFinishedPulling="2026-04-28 19:37:36.043849535 +0000 UTC m=+1272.674750459" observedRunningTime="2026-04-28 19:37:36.260227611 +0000 UTC m=+1272.891128568" watchObservedRunningTime="2026-04-28 19:37:36.261868617 +0000 UTC m=+1272.892769564" Apr 28 19:37:37.246542 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:37.246505 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:37:37.247747 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:37.247718 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" podUID="fb4159da-753d-44d9-9882-f66db722c96b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 28 19:37:38.249683 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:38.249648 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" podUID="fb4159da-753d-44d9-9882-f66db722c96b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 28 19:37:40.965032 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:40.964984 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused" Apr 28 19:37:43.254518 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:43.254472 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:37:43.255011 ip-10-0-139-128 kubenswrapper[2571]: I0428 
19:37:43.254986 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" podUID="fb4159da-753d-44d9-9882-f66db722c96b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 28 19:37:45.965054 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:45.965015 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused" Apr 28 19:37:45.970578 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:45.970547 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.34:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.34:8080: connect: connection refused" Apr 28 19:37:45.970662 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:45.970651 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:37:50.210168 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.210143 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:37:50.289694 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.289653 2571 generic.go:358] "Generic (PLEG): container finished" podID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerID="b8c8283a36814baa353a7a7bf86633f4fbc992c398d6c4d51e66f82d4ce81945" exitCode=0 Apr 28 19:37:50.289857 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.289728 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" event={"ID":"696284ec-82a0-46fb-9ee6-93e4dbbb9fae","Type":"ContainerDied","Data":"b8c8283a36814baa353a7a7bf86633f4fbc992c398d6c4d51e66f82d4ce81945"} Apr 28 19:37:50.289857 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.289749 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" Apr 28 19:37:50.289857 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.289776 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq" event={"ID":"696284ec-82a0-46fb-9ee6-93e4dbbb9fae","Type":"ContainerDied","Data":"556079ee7e0b9284515a155d61b320c3c0b209fef06420fb3bf5add87dca85b1"} Apr 28 19:37:50.289857 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.289798 2571 scope.go:117] "RemoveContainer" containerID="38d367ccec56d62a4025195d48b5702377cceadf105947198527f37ec26bda63" Apr 28 19:37:50.299725 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.299696 2571 scope.go:117] "RemoveContainer" containerID="b8c8283a36814baa353a7a7bf86633f4fbc992c398d6c4d51e66f82d4ce81945" Apr 28 19:37:50.307090 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.307071 2571 scope.go:117] "RemoveContainer" containerID="3a1487ff847434e94fd077b45eb7b66fbffeb943abe3a72f661828cef0a0e3aa" Apr 28 19:37:50.313906 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.313886 2571 
scope.go:117] "RemoveContainer" containerID="b9418bf555b2c1c11ae314ee60c20fc5be4ff6dc0232363329080795b82e63cd" Apr 28 19:37:50.320569 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.320552 2571 scope.go:117] "RemoveContainer" containerID="38d367ccec56d62a4025195d48b5702377cceadf105947198527f37ec26bda63" Apr 28 19:37:50.320818 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:37:50.320799 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38d367ccec56d62a4025195d48b5702377cceadf105947198527f37ec26bda63\": container with ID starting with 38d367ccec56d62a4025195d48b5702377cceadf105947198527f37ec26bda63 not found: ID does not exist" containerID="38d367ccec56d62a4025195d48b5702377cceadf105947198527f37ec26bda63" Apr 28 19:37:50.320877 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.320833 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38d367ccec56d62a4025195d48b5702377cceadf105947198527f37ec26bda63"} err="failed to get container status \"38d367ccec56d62a4025195d48b5702377cceadf105947198527f37ec26bda63\": rpc error: code = NotFound desc = could not find container \"38d367ccec56d62a4025195d48b5702377cceadf105947198527f37ec26bda63\": container with ID starting with 38d367ccec56d62a4025195d48b5702377cceadf105947198527f37ec26bda63 not found: ID does not exist" Apr 28 19:37:50.320877 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.320853 2571 scope.go:117] "RemoveContainer" containerID="b8c8283a36814baa353a7a7bf86633f4fbc992c398d6c4d51e66f82d4ce81945" Apr 28 19:37:50.321051 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:37:50.321032 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8c8283a36814baa353a7a7bf86633f4fbc992c398d6c4d51e66f82d4ce81945\": container with ID starting with b8c8283a36814baa353a7a7bf86633f4fbc992c398d6c4d51e66f82d4ce81945 not found: ID does 
not exist" containerID="b8c8283a36814baa353a7a7bf86633f4fbc992c398d6c4d51e66f82d4ce81945" Apr 28 19:37:50.321095 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.321059 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8c8283a36814baa353a7a7bf86633f4fbc992c398d6c4d51e66f82d4ce81945"} err="failed to get container status \"b8c8283a36814baa353a7a7bf86633f4fbc992c398d6c4d51e66f82d4ce81945\": rpc error: code = NotFound desc = could not find container \"b8c8283a36814baa353a7a7bf86633f4fbc992c398d6c4d51e66f82d4ce81945\": container with ID starting with b8c8283a36814baa353a7a7bf86633f4fbc992c398d6c4d51e66f82d4ce81945 not found: ID does not exist" Apr 28 19:37:50.321095 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.321075 2571 scope.go:117] "RemoveContainer" containerID="3a1487ff847434e94fd077b45eb7b66fbffeb943abe3a72f661828cef0a0e3aa" Apr 28 19:37:50.321293 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:37:50.321276 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a1487ff847434e94fd077b45eb7b66fbffeb943abe3a72f661828cef0a0e3aa\": container with ID starting with 3a1487ff847434e94fd077b45eb7b66fbffeb943abe3a72f661828cef0a0e3aa not found: ID does not exist" containerID="3a1487ff847434e94fd077b45eb7b66fbffeb943abe3a72f661828cef0a0e3aa" Apr 28 19:37:50.321363 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.321303 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1487ff847434e94fd077b45eb7b66fbffeb943abe3a72f661828cef0a0e3aa"} err="failed to get container status \"3a1487ff847434e94fd077b45eb7b66fbffeb943abe3a72f661828cef0a0e3aa\": rpc error: code = NotFound desc = could not find container \"3a1487ff847434e94fd077b45eb7b66fbffeb943abe3a72f661828cef0a0e3aa\": container with ID starting with 3a1487ff847434e94fd077b45eb7b66fbffeb943abe3a72f661828cef0a0e3aa not found: ID does not exist" Apr 
28 19:37:50.321363 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.321324 2571 scope.go:117] "RemoveContainer" containerID="b9418bf555b2c1c11ae314ee60c20fc5be4ff6dc0232363329080795b82e63cd" Apr 28 19:37:50.321532 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:37:50.321514 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9418bf555b2c1c11ae314ee60c20fc5be4ff6dc0232363329080795b82e63cd\": container with ID starting with b9418bf555b2c1c11ae314ee60c20fc5be4ff6dc0232363329080795b82e63cd not found: ID does not exist" containerID="b9418bf555b2c1c11ae314ee60c20fc5be4ff6dc0232363329080795b82e63cd" Apr 28 19:37:50.321591 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.321542 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9418bf555b2c1c11ae314ee60c20fc5be4ff6dc0232363329080795b82e63cd"} err="failed to get container status \"b9418bf555b2c1c11ae314ee60c20fc5be4ff6dc0232363329080795b82e63cd\": rpc error: code = NotFound desc = could not find container \"b9418bf555b2c1c11ae314ee60c20fc5be4ff6dc0232363329080795b82e63cd\": container with ID starting with b9418bf555b2c1c11ae314ee60c20fc5be4ff6dc0232363329080795b82e63cd not found: ID does not exist" Apr 28 19:37:50.379958 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.379889 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-proxy-tls\") pod \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\" (UID: \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\") " Apr 28 19:37:50.379958 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.379926 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod 
\"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\" (UID: \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\") " Apr 28 19:37:50.380116 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.379979 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-kserve-provision-location\") pod \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\" (UID: \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\") " Apr 28 19:37:50.380116 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.380029 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4rtg\" (UniqueName: \"kubernetes.io/projected/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-kube-api-access-g4rtg\") pod \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\" (UID: \"696284ec-82a0-46fb-9ee6-93e4dbbb9fae\") " Apr 28 19:37:50.380330 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.380309 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "696284ec-82a0-46fb-9ee6-93e4dbbb9fae" (UID: "696284ec-82a0-46fb-9ee6-93e4dbbb9fae"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:37:50.380388 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.380317 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-isvc-sklearn-mcp-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-mcp-kube-rbac-proxy-sar-config") pod "696284ec-82a0-46fb-9ee6-93e4dbbb9fae" (UID: "696284ec-82a0-46fb-9ee6-93e4dbbb9fae"). InnerVolumeSpecName "isvc-sklearn-mcp-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:37:50.381981 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.381948 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-kube-api-access-g4rtg" (OuterVolumeSpecName: "kube-api-access-g4rtg") pod "696284ec-82a0-46fb-9ee6-93e4dbbb9fae" (UID: "696284ec-82a0-46fb-9ee6-93e4dbbb9fae"). InnerVolumeSpecName "kube-api-access-g4rtg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:37:50.382084 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.382012 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "696284ec-82a0-46fb-9ee6-93e4dbbb9fae" (UID: "696284ec-82a0-46fb-9ee6-93e4dbbb9fae"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:37:50.481058 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.481011 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g4rtg\" (UniqueName: \"kubernetes.io/projected/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-kube-api-access-g4rtg\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:37:50.481058 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.481053 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:37:50.481058 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.481065 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:37:50.481280 
ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.481075 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/696284ec-82a0-46fb-9ee6-93e4dbbb9fae-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:37:50.613387 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.613356 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq"] Apr 28 19:37:50.617876 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:50.617844 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5f8b5bfcd6-nw5gq"] Apr 28 19:37:51.947256 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:51.947219 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" path="/var/lib/kubelet/pods/696284ec-82a0-46fb-9ee6-93e4dbbb9fae/volumes" Apr 28 19:37:53.255514 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:37:53.255461 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" podUID="fb4159da-753d-44d9-9882-f66db722c96b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 28 19:38:03.255071 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:03.255027 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" podUID="fb4159da-753d-44d9-9882-f66db722c96b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 28 19:38:13.254996 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:13.254949 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" podUID="fb4159da-753d-44d9-9882-f66db722c96b" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 28 19:38:23.255639 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:23.255609 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" Apr 28 19:38:26.360827 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:26.360795 2571 scope.go:117] "RemoveContainer" containerID="3e2a026244559719d3de65a88f7f7373e66873a32ba823b62d8cdf6da73d9d59" Apr 28 19:38:31.573578 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.573538 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q"] Apr 28 19:38:31.573961 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.573869 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" podUID="fb4159da-753d-44d9-9882-f66db722c96b" containerName="kserve-container" containerID="cri-o://916cd425567eb1ab59ddfdc8469adde4004b0cbc70248e47a56261e9fc61a45a" gracePeriod=30 Apr 28 19:38:31.573961 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.573911 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" podUID="fb4159da-753d-44d9-9882-f66db722c96b" containerName="kube-rbac-proxy" containerID="cri-o://85af96a558a7107601627df8c06ee00fdb6c36bc651bcb82faea00a0826a1971" gracePeriod=30 Apr 28 19:38:31.782685 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.782653 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb"] Apr 28 19:38:31.783006 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.782993 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kube-rbac-proxy" Apr 28 
19:38:31.783051 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.783008 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kube-rbac-proxy" Apr 28 19:38:31.783051 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.783022 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kserve-container" Apr 28 19:38:31.783051 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.783030 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kserve-container" Apr 28 19:38:31.783051 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.783043 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kserve-agent" Apr 28 19:38:31.783051 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.783049 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kserve-agent" Apr 28 19:38:31.783230 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.783058 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="storage-initializer" Apr 28 19:38:31.783230 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.783064 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="storage-initializer" Apr 28 19:38:31.783230 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.783122 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kserve-agent" Apr 28 19:38:31.783230 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.783130 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kube-rbac-proxy" Apr 
28 19:38:31.783230 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.783138 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="696284ec-82a0-46fb-9ee6-93e4dbbb9fae" containerName="kserve-container" Apr 28 19:38:31.785351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.785334 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" Apr 28 19:38:31.787301 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.787272 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-predictor-serving-cert\"" Apr 28 19:38:31.787582 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.787558 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-kube-rbac-proxy-sar-config\"" Apr 28 19:38:31.795474 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.795451 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb"] Apr 28 19:38:31.928505 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.928445 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwgq9\" (UniqueName: \"kubernetes.io/projected/b92eeb7d-7365-453c-ab3c-89526b93cbc8-kube-api-access-xwgq9\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb\" (UID: \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" Apr 28 19:38:31.928708 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.928539 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b92eeb7d-7365-453c-ab3c-89526b93cbc8-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb\" (UID: 
\"b92eeb7d-7365-453c-ab3c-89526b93cbc8\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" Apr 28 19:38:31.928708 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.928615 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b92eeb7d-7365-453c-ab3c-89526b93cbc8-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb\" (UID: \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" Apr 28 19:38:31.928708 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:31.928658 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b92eeb7d-7365-453c-ab3c-89526b93cbc8-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb\" (UID: \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" Apr 28 19:38:32.029860 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:32.029819 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b92eeb7d-7365-453c-ab3c-89526b93cbc8-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb\" (UID: \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" Apr 28 19:38:32.030130 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:32.029869 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b92eeb7d-7365-453c-ab3c-89526b93cbc8-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb\" (UID: 
\"b92eeb7d-7365-453c-ab3c-89526b93cbc8\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" Apr 28 19:38:32.030130 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:32.029898 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b92eeb7d-7365-453c-ab3c-89526b93cbc8-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb\" (UID: \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" Apr 28 19:38:32.030130 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:32.029946 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwgq9\" (UniqueName: \"kubernetes.io/projected/b92eeb7d-7365-453c-ab3c-89526b93cbc8-kube-api-access-xwgq9\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb\" (UID: \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" Apr 28 19:38:32.030286 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:32.030242 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b92eeb7d-7365-453c-ab3c-89526b93cbc8-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb\" (UID: \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" Apr 28 19:38:32.030693 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:32.030671 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b92eeb7d-7365-453c-ab3c-89526b93cbc8-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb\" (UID: \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" 
Apr 28 19:38:32.032515 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:32.032465 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b92eeb7d-7365-453c-ab3c-89526b93cbc8-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb\" (UID: \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb"
Apr 28 19:38:32.037066 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:32.037043 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwgq9\" (UniqueName: \"kubernetes.io/projected/b92eeb7d-7365-453c-ab3c-89526b93cbc8-kube-api-access-xwgq9\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb\" (UID: \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb"
Apr 28 19:38:32.096327 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:32.096292 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb"
Apr 28 19:38:32.220440 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:32.220415 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb"]
Apr 28 19:38:32.223098 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:38:32.223069 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb92eeb7d_7365_453c_ab3c_89526b93cbc8.slice/crio-bd296e9be386e9bf0bc778cbecab4bb988069c1e39f919657bd0a420d350f2a0 WatchSource:0}: Error finding container bd296e9be386e9bf0bc778cbecab4bb988069c1e39f919657bd0a420d350f2a0: Status 404 returned error can't find the container with id bd296e9be386e9bf0bc778cbecab4bb988069c1e39f919657bd0a420d350f2a0
Apr 28 19:38:32.428372 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:32.428333 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" event={"ID":"b92eeb7d-7365-453c-ab3c-89526b93cbc8","Type":"ContainerStarted","Data":"253b527ec5734e10ca1ce073258c0d60c8457bbcf72f23f4f3af67032df5da92"}
Apr 28 19:38:32.428372 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:32.428378 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" event={"ID":"b92eeb7d-7365-453c-ab3c-89526b93cbc8","Type":"ContainerStarted","Data":"bd296e9be386e9bf0bc778cbecab4bb988069c1e39f919657bd0a420d350f2a0"}
Apr 28 19:38:32.430315 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:32.430280 2571 generic.go:358] "Generic (PLEG): container finished" podID="fb4159da-753d-44d9-9882-f66db722c96b" containerID="85af96a558a7107601627df8c06ee00fdb6c36bc651bcb82faea00a0826a1971" exitCode=2
Apr 28 19:38:32.430504 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:32.430351 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" event={"ID":"fb4159da-753d-44d9-9882-f66db722c96b","Type":"ContainerDied","Data":"85af96a558a7107601627df8c06ee00fdb6c36bc651bcb82faea00a0826a1971"}
Apr 28 19:38:33.250440 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:33.250397 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" podUID="fb4159da-753d-44d9-9882-f66db722c96b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.35:8643/healthz\": dial tcp 10.134.0.35:8643: connect: connection refused"
Apr 28 19:38:33.255620 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:33.255587 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" podUID="fb4159da-753d-44d9-9882-f66db722c96b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 28 19:38:34.315559 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.315533 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q"
Apr 28 19:38:34.445110 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.445078 2571 generic.go:358] "Generic (PLEG): container finished" podID="fb4159da-753d-44d9-9882-f66db722c96b" containerID="916cd425567eb1ab59ddfdc8469adde4004b0cbc70248e47a56261e9fc61a45a" exitCode=0
Apr 28 19:38:34.445282 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.445159 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q"
Apr 28 19:38:34.445282 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.445170 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" event={"ID":"fb4159da-753d-44d9-9882-f66db722c96b","Type":"ContainerDied","Data":"916cd425567eb1ab59ddfdc8469adde4004b0cbc70248e47a56261e9fc61a45a"}
Apr 28 19:38:34.445282 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.445209 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q" event={"ID":"fb4159da-753d-44d9-9882-f66db722c96b","Type":"ContainerDied","Data":"9b527ed3c8ce827629b042503f23d2887d1771ea9620d0b64d799bf02a385441"}
Apr 28 19:38:34.445282 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.445226 2571 scope.go:117] "RemoveContainer" containerID="85af96a558a7107601627df8c06ee00fdb6c36bc651bcb82faea00a0826a1971"
Apr 28 19:38:34.447885 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.447869 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb4159da-753d-44d9-9882-f66db722c96b-proxy-tls\") pod \"fb4159da-753d-44d9-9882-f66db722c96b\" (UID: \"fb4159da-753d-44d9-9882-f66db722c96b\") "
Apr 28 19:38:34.447991 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.447934 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb4159da-753d-44d9-9882-f66db722c96b-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"fb4159da-753d-44d9-9882-f66db722c96b\" (UID: \"fb4159da-753d-44d9-9882-f66db722c96b\") "
Apr 28 19:38:34.448052 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.448023 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb4159da-753d-44d9-9882-f66db722c96b-kserve-provision-location\") pod \"fb4159da-753d-44d9-9882-f66db722c96b\" (UID: \"fb4159da-753d-44d9-9882-f66db722c96b\") "
Apr 28 19:38:34.448101 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.448057 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npqvr\" (UniqueName: \"kubernetes.io/projected/fb4159da-753d-44d9-9882-f66db722c96b-kube-api-access-npqvr\") pod \"fb4159da-753d-44d9-9882-f66db722c96b\" (UID: \"fb4159da-753d-44d9-9882-f66db722c96b\") "
Apr 28 19:38:34.448385 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.448356 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb4159da-753d-44d9-9882-f66db722c96b-isvc-paddle-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-kube-rbac-proxy-sar-config") pod "fb4159da-753d-44d9-9882-f66db722c96b" (UID: "fb4159da-753d-44d9-9882-f66db722c96b"). InnerVolumeSpecName "isvc-paddle-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:38:34.450191 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.450159 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb4159da-753d-44d9-9882-f66db722c96b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fb4159da-753d-44d9-9882-f66db722c96b" (UID: "fb4159da-753d-44d9-9882-f66db722c96b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:38:34.450525 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.450503 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb4159da-753d-44d9-9882-f66db722c96b-kube-api-access-npqvr" (OuterVolumeSpecName: "kube-api-access-npqvr") pod "fb4159da-753d-44d9-9882-f66db722c96b" (UID: "fb4159da-753d-44d9-9882-f66db722c96b"). InnerVolumeSpecName "kube-api-access-npqvr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:38:34.454147 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.454109 2571 scope.go:117] "RemoveContainer" containerID="916cd425567eb1ab59ddfdc8469adde4004b0cbc70248e47a56261e9fc61a45a"
Apr 28 19:38:34.459189 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.459161 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb4159da-753d-44d9-9882-f66db722c96b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fb4159da-753d-44d9-9882-f66db722c96b" (UID: "fb4159da-753d-44d9-9882-f66db722c96b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:38:34.464209 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.464187 2571 scope.go:117] "RemoveContainer" containerID="2b6f3c90117f8ea00ec383dd59a78db028daf7fd6f7f74961b6f03d92d7c0f4b"
Apr 28 19:38:34.470990 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.470974 2571 scope.go:117] "RemoveContainer" containerID="85af96a558a7107601627df8c06ee00fdb6c36bc651bcb82faea00a0826a1971"
Apr 28 19:38:34.471232 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:38:34.471213 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85af96a558a7107601627df8c06ee00fdb6c36bc651bcb82faea00a0826a1971\": container with ID starting with 85af96a558a7107601627df8c06ee00fdb6c36bc651bcb82faea00a0826a1971 not found: ID does not exist" containerID="85af96a558a7107601627df8c06ee00fdb6c36bc651bcb82faea00a0826a1971"
Apr 28 19:38:34.471303 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.471247 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85af96a558a7107601627df8c06ee00fdb6c36bc651bcb82faea00a0826a1971"} err="failed to get container status \"85af96a558a7107601627df8c06ee00fdb6c36bc651bcb82faea00a0826a1971\": rpc error: code = NotFound desc = could not find container \"85af96a558a7107601627df8c06ee00fdb6c36bc651bcb82faea00a0826a1971\": container with ID starting with 85af96a558a7107601627df8c06ee00fdb6c36bc651bcb82faea00a0826a1971 not found: ID does not exist"
Apr 28 19:38:34.471303 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.471274 2571 scope.go:117] "RemoveContainer" containerID="916cd425567eb1ab59ddfdc8469adde4004b0cbc70248e47a56261e9fc61a45a"
Apr 28 19:38:34.471541 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:38:34.471519 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"916cd425567eb1ab59ddfdc8469adde4004b0cbc70248e47a56261e9fc61a45a\": container with ID starting with 916cd425567eb1ab59ddfdc8469adde4004b0cbc70248e47a56261e9fc61a45a not found: ID does not exist" containerID="916cd425567eb1ab59ddfdc8469adde4004b0cbc70248e47a56261e9fc61a45a"
Apr 28 19:38:34.471625 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.471543 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"916cd425567eb1ab59ddfdc8469adde4004b0cbc70248e47a56261e9fc61a45a"} err="failed to get container status \"916cd425567eb1ab59ddfdc8469adde4004b0cbc70248e47a56261e9fc61a45a\": rpc error: code = NotFound desc = could not find container \"916cd425567eb1ab59ddfdc8469adde4004b0cbc70248e47a56261e9fc61a45a\": container with ID starting with 916cd425567eb1ab59ddfdc8469adde4004b0cbc70248e47a56261e9fc61a45a not found: ID does not exist"
Apr 28 19:38:34.471625 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.471558 2571 scope.go:117] "RemoveContainer" containerID="2b6f3c90117f8ea00ec383dd59a78db028daf7fd6f7f74961b6f03d92d7c0f4b"
Apr 28 19:38:34.471764 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:38:34.471747 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b6f3c90117f8ea00ec383dd59a78db028daf7fd6f7f74961b6f03d92d7c0f4b\": container with ID starting with 2b6f3c90117f8ea00ec383dd59a78db028daf7fd6f7f74961b6f03d92d7c0f4b not found: ID does not exist" containerID="2b6f3c90117f8ea00ec383dd59a78db028daf7fd6f7f74961b6f03d92d7c0f4b"
Apr 28 19:38:34.471803 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.471770 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b6f3c90117f8ea00ec383dd59a78db028daf7fd6f7f74961b6f03d92d7c0f4b"} err="failed to get container status \"2b6f3c90117f8ea00ec383dd59a78db028daf7fd6f7f74961b6f03d92d7c0f4b\": rpc error: code = NotFound desc = could not find container \"2b6f3c90117f8ea00ec383dd59a78db028daf7fd6f7f74961b6f03d92d7c0f4b\": container with ID starting with 2b6f3c90117f8ea00ec383dd59a78db028daf7fd6f7f74961b6f03d92d7c0f4b not found: ID does not exist"
Apr 28 19:38:34.549422 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.549394 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb4159da-753d-44d9-9882-f66db722c96b-isvc-paddle-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:38:34.549422 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.549420 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb4159da-753d-44d9-9882-f66db722c96b-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:38:34.549422 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.549430 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-npqvr\" (UniqueName: \"kubernetes.io/projected/fb4159da-753d-44d9-9882-f66db722c96b-kube-api-access-npqvr\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:38:34.549648 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.549439 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb4159da-753d-44d9-9882-f66db722c96b-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:38:34.765842 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.765810 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q"]
Apr 28 19:38:34.767969 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:34.767946 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-ts72q"]
Apr 28 19:38:35.947836 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:35.947802 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb4159da-753d-44d9-9882-f66db722c96b" path="/var/lib/kubelet/pods/fb4159da-753d-44d9-9882-f66db722c96b/volumes"
Apr 28 19:38:37.456157 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:37.456126 2571 generic.go:358] "Generic (PLEG): container finished" podID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerID="253b527ec5734e10ca1ce073258c0d60c8457bbcf72f23f4f3af67032df5da92" exitCode=0
Apr 28 19:38:37.456552 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:37.456205 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" event={"ID":"b92eeb7d-7365-453c-ab3c-89526b93cbc8","Type":"ContainerDied","Data":"253b527ec5734e10ca1ce073258c0d60c8457bbcf72f23f4f3af67032df5da92"}
Apr 28 19:38:38.461560 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:38.461520 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" event={"ID":"b92eeb7d-7365-453c-ab3c-89526b93cbc8","Type":"ContainerStarted","Data":"a2257e06376825094ea571babc80c4a5840a4fd01980e8818ed0dca6a03bcf29"}
Apr 28 19:38:38.461560 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:38.461563 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" event={"ID":"b92eeb7d-7365-453c-ab3c-89526b93cbc8","Type":"ContainerStarted","Data":"137e5bac17d77395775de70a2d298a009579db7cbc4622b89b475d7f5c7944e2"}
Apr 28 19:38:38.461987 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:38.461780 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb"
Apr 28 19:38:38.481158 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:38.481110 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" podStartSLOduration=7.481096923 podStartE2EDuration="7.481096923s" podCreationTimestamp="2026-04-28 19:38:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:38:38.479582987 +0000 UTC m=+1335.110483932" watchObservedRunningTime="2026-04-28 19:38:38.481096923 +0000 UTC m=+1335.111997868"
Apr 28 19:38:39.465555 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:39.465469 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb"
Apr 28 19:38:39.466939 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:39.466908 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 28 19:38:40.468492 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:40.468439 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 28 19:38:45.473501 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:45.473444 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb"
Apr 28 19:38:45.478716 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:45.478674 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 28 19:38:55.475336 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:38:55.475297 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 28 19:39:05.475272 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:05.475226 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 28 19:39:15.474445 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:15.474403 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 28 19:39:25.475397 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:25.475361 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb"
Apr 28 19:39:33.161706 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.161671 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb"]
Apr 28 19:39:33.162198 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.161996 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="kserve-container" containerID="cri-o://137e5bac17d77395775de70a2d298a009579db7cbc4622b89b475d7f5c7944e2" gracePeriod=30
Apr 28 19:39:33.162198 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.162011 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="kube-rbac-proxy" containerID="cri-o://a2257e06376825094ea571babc80c4a5840a4fd01980e8818ed0dca6a03bcf29" gracePeriod=30
Apr 28 19:39:33.383424 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.383392 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"]
Apr 28 19:39:33.383770 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.383758 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb4159da-753d-44d9-9882-f66db722c96b" containerName="kserve-container"
Apr 28 19:39:33.383815 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.383772 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb4159da-753d-44d9-9882-f66db722c96b" containerName="kserve-container"
Apr 28 19:39:33.383815 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.383781 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb4159da-753d-44d9-9882-f66db722c96b" containerName="kube-rbac-proxy"
Apr 28 19:39:33.383815 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.383786 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb4159da-753d-44d9-9882-f66db722c96b" containerName="kube-rbac-proxy"
Apr 28 19:39:33.383815 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.383805 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb4159da-753d-44d9-9882-f66db722c96b" containerName="storage-initializer"
Apr 28 19:39:33.383815 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.383811 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb4159da-753d-44d9-9882-f66db722c96b" containerName="storage-initializer"
Apr 28 19:39:33.383970 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.383870 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb4159da-753d-44d9-9882-f66db722c96b" containerName="kserve-container"
Apr 28 19:39:33.383970 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.383884 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb4159da-753d-44d9-9882-f66db722c96b" containerName="kube-rbac-proxy"
Apr 28 19:39:33.386992 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.386975 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"
Apr 28 19:39:33.389148 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.389129 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\""
Apr 28 19:39:33.389249 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.389128 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-predictor-serving-cert\""
Apr 28 19:39:33.397501 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.397452 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"]
Apr 28 19:39:33.526339 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.526245 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d5248e85-8cda-465b-8ce5-df0b2d192126-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7\" (UID: \"d5248e85-8cda-465b-8ce5-df0b2d192126\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"
Apr 28 19:39:33.526339 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.526302 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5248e85-8cda-465b-8ce5-df0b2d192126-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7\" (UID: \"d5248e85-8cda-465b-8ce5-df0b2d192126\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"
Apr 28 19:39:33.526567 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.526347 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5248e85-8cda-465b-8ce5-df0b2d192126-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7\" (UID: \"d5248e85-8cda-465b-8ce5-df0b2d192126\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"
Apr 28 19:39:33.526567 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.526366 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9kw6\" (UniqueName: \"kubernetes.io/projected/d5248e85-8cda-465b-8ce5-df0b2d192126-kube-api-access-x9kw6\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7\" (UID: \"d5248e85-8cda-465b-8ce5-df0b2d192126\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"
Apr 28 19:39:33.627685 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.627644 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d5248e85-8cda-465b-8ce5-df0b2d192126-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7\" (UID: \"d5248e85-8cda-465b-8ce5-df0b2d192126\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"
Apr 28 19:39:33.627881 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.627725 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5248e85-8cda-465b-8ce5-df0b2d192126-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7\" (UID: \"d5248e85-8cda-465b-8ce5-df0b2d192126\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"
Apr 28 19:39:33.627881 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.627766 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5248e85-8cda-465b-8ce5-df0b2d192126-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7\" (UID: \"d5248e85-8cda-465b-8ce5-df0b2d192126\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"
Apr 28 19:39:33.627881 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.627794 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9kw6\" (UniqueName: \"kubernetes.io/projected/d5248e85-8cda-465b-8ce5-df0b2d192126-kube-api-access-x9kw6\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7\" (UID: \"d5248e85-8cda-465b-8ce5-df0b2d192126\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"
Apr 28 19:39:33.628196 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.628168 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5248e85-8cda-465b-8ce5-df0b2d192126-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7\" (UID: \"d5248e85-8cda-465b-8ce5-df0b2d192126\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"
Apr 28 19:39:33.628399 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.628379 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d5248e85-8cda-465b-8ce5-df0b2d192126-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7\" (UID: \"d5248e85-8cda-465b-8ce5-df0b2d192126\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"
Apr 28 19:39:33.630238 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.630215 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5248e85-8cda-465b-8ce5-df0b2d192126-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7\" (UID: \"d5248e85-8cda-465b-8ce5-df0b2d192126\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"
Apr 28 19:39:33.634951 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.634930 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9kw6\" (UniqueName: \"kubernetes.io/projected/d5248e85-8cda-465b-8ce5-df0b2d192126-kube-api-access-x9kw6\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7\" (UID: \"d5248e85-8cda-465b-8ce5-df0b2d192126\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"
Apr 28 19:39:33.635057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.634949 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" event={"ID":"b92eeb7d-7365-453c-ab3c-89526b93cbc8","Type":"ContainerDied","Data":"a2257e06376825094ea571babc80c4a5840a4fd01980e8818ed0dca6a03bcf29"}
Apr 28 19:39:33.635057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.634929 2571 generic.go:358] "Generic (PLEG): container finished" podID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerID="a2257e06376825094ea571babc80c4a5840a4fd01980e8818ed0dca6a03bcf29" exitCode=2
Apr 28 19:39:33.698139 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.698095 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"
Apr 28 19:39:33.817339 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.817313 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"]
Apr 28 19:39:33.819763 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:39:33.819737 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5248e85_8cda_465b_8ce5_df0b2d192126.slice/crio-a8006f1772405072b48377a8eefc61d08c171cd48feed5a2ac82edca4dba547f WatchSource:0}: Error finding container a8006f1772405072b48377a8eefc61d08c171cd48feed5a2ac82edca4dba547f: Status 404 returned error can't find the container with id a8006f1772405072b48377a8eefc61d08c171cd48feed5a2ac82edca4dba547f
Apr 28 19:39:33.821559 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:33.821541 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 19:39:34.638980 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:34.638946 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" event={"ID":"d5248e85-8cda-465b-8ce5-df0b2d192126","Type":"ContainerStarted","Data":"bc62deeb23c81e284f5b61d3ac7aae2aabdf7038279e2282a8c3478556f368c5"}
Apr 28 19:39:34.638980 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:34.638981 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" event={"ID":"d5248e85-8cda-465b-8ce5-df0b2d192126","Type":"ContainerStarted","Data":"a8006f1772405072b48377a8eefc61d08c171cd48feed5a2ac82edca4dba547f"}
Apr 28 19:39:35.468895 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:35.468852 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.36:8643/healthz\": dial tcp 10.134.0.36:8643: connect: connection refused"
Apr 28 19:39:35.474314 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:35.474283 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 28 19:39:36.005015 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.004989 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb"
Apr 28 19:39:36.147252 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.147149 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b92eeb7d-7365-453c-ab3c-89526b93cbc8-kserve-provision-location\") pod \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\" (UID: \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\") "
Apr 28 19:39:36.147252 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.147208 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwgq9\" (UniqueName: \"kubernetes.io/projected/b92eeb7d-7365-453c-ab3c-89526b93cbc8-kube-api-access-xwgq9\") pod \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\" (UID: \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\") "
Apr 28 19:39:36.147443 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.147287 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b92eeb7d-7365-453c-ab3c-89526b93cbc8-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\" (UID: \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\") "
Apr 28 19:39:36.147443 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.147309 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b92eeb7d-7365-453c-ab3c-89526b93cbc8-proxy-tls\") pod \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\" (UID: \"b92eeb7d-7365-453c-ab3c-89526b93cbc8\") "
Apr 28 19:39:36.147691 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.147649 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b92eeb7d-7365-453c-ab3c-89526b93cbc8-isvc-paddle-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-runtime-kube-rbac-proxy-sar-config") pod "b92eeb7d-7365-453c-ab3c-89526b93cbc8" (UID: "b92eeb7d-7365-453c-ab3c-89526b93cbc8"). InnerVolumeSpecName "isvc-paddle-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:39:36.149395 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.149370 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b92eeb7d-7365-453c-ab3c-89526b93cbc8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b92eeb7d-7365-453c-ab3c-89526b93cbc8" (UID: "b92eeb7d-7365-453c-ab3c-89526b93cbc8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:39:36.149509 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.149395 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b92eeb7d-7365-453c-ab3c-89526b93cbc8-kube-api-access-xwgq9" (OuterVolumeSpecName: "kube-api-access-xwgq9") pod "b92eeb7d-7365-453c-ab3c-89526b93cbc8" (UID: "b92eeb7d-7365-453c-ab3c-89526b93cbc8"). InnerVolumeSpecName "kube-api-access-xwgq9".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:39:36.157101 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.157073 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b92eeb7d-7365-453c-ab3c-89526b93cbc8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b92eeb7d-7365-453c-ab3c-89526b93cbc8" (UID: "b92eeb7d-7365-453c-ab3c-89526b93cbc8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:39:36.248019 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.247971 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b92eeb7d-7365-453c-ab3c-89526b93cbc8-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:39:36.248019 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.248012 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b92eeb7d-7365-453c-ab3c-89526b93cbc8-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:39:36.248019 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.248022 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b92eeb7d-7365-453c-ab3c-89526b93cbc8-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:39:36.248019 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.248032 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xwgq9\" (UniqueName: \"kubernetes.io/projected/b92eeb7d-7365-453c-ab3c-89526b93cbc8-kube-api-access-xwgq9\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:39:36.646889 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.646857 2571 generic.go:358] "Generic 
(PLEG): container finished" podID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerID="137e5bac17d77395775de70a2d298a009579db7cbc4622b89b475d7f5c7944e2" exitCode=0 Apr 28 19:39:36.647078 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.646934 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" Apr 28 19:39:36.647078 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.646935 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" event={"ID":"b92eeb7d-7365-453c-ab3c-89526b93cbc8","Type":"ContainerDied","Data":"137e5bac17d77395775de70a2d298a009579db7cbc4622b89b475d7f5c7944e2"} Apr 28 19:39:36.647078 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.647035 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb" event={"ID":"b92eeb7d-7365-453c-ab3c-89526b93cbc8","Type":"ContainerDied","Data":"bd296e9be386e9bf0bc778cbecab4bb988069c1e39f919657bd0a420d350f2a0"} Apr 28 19:39:36.647078 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.647050 2571 scope.go:117] "RemoveContainer" containerID="a2257e06376825094ea571babc80c4a5840a4fd01980e8818ed0dca6a03bcf29" Apr 28 19:39:36.655276 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.655259 2571 scope.go:117] "RemoveContainer" containerID="137e5bac17d77395775de70a2d298a009579db7cbc4622b89b475d7f5c7944e2" Apr 28 19:39:36.662327 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.662312 2571 scope.go:117] "RemoveContainer" containerID="253b527ec5734e10ca1ce073258c0d60c8457bbcf72f23f4f3af67032df5da92" Apr 28 19:39:36.667412 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.667389 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb"] Apr 28 19:39:36.669162 ip-10-0-139-128 kubenswrapper[2571]: I0428 
19:39:36.669145 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-9thsb"] Apr 28 19:39:36.672174 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.672157 2571 scope.go:117] "RemoveContainer" containerID="a2257e06376825094ea571babc80c4a5840a4fd01980e8818ed0dca6a03bcf29" Apr 28 19:39:36.672594 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:39:36.672570 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2257e06376825094ea571babc80c4a5840a4fd01980e8818ed0dca6a03bcf29\": container with ID starting with a2257e06376825094ea571babc80c4a5840a4fd01980e8818ed0dca6a03bcf29 not found: ID does not exist" containerID="a2257e06376825094ea571babc80c4a5840a4fd01980e8818ed0dca6a03bcf29" Apr 28 19:39:36.672648 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.672607 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2257e06376825094ea571babc80c4a5840a4fd01980e8818ed0dca6a03bcf29"} err="failed to get container status \"a2257e06376825094ea571babc80c4a5840a4fd01980e8818ed0dca6a03bcf29\": rpc error: code = NotFound desc = could not find container \"a2257e06376825094ea571babc80c4a5840a4fd01980e8818ed0dca6a03bcf29\": container with ID starting with a2257e06376825094ea571babc80c4a5840a4fd01980e8818ed0dca6a03bcf29 not found: ID does not exist" Apr 28 19:39:36.672648 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.672632 2571 scope.go:117] "RemoveContainer" containerID="137e5bac17d77395775de70a2d298a009579db7cbc4622b89b475d7f5c7944e2" Apr 28 19:39:36.672880 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:39:36.672860 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"137e5bac17d77395775de70a2d298a009579db7cbc4622b89b475d7f5c7944e2\": container with ID starting with 
137e5bac17d77395775de70a2d298a009579db7cbc4622b89b475d7f5c7944e2 not found: ID does not exist" containerID="137e5bac17d77395775de70a2d298a009579db7cbc4622b89b475d7f5c7944e2" Apr 28 19:39:36.672940 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.672891 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137e5bac17d77395775de70a2d298a009579db7cbc4622b89b475d7f5c7944e2"} err="failed to get container status \"137e5bac17d77395775de70a2d298a009579db7cbc4622b89b475d7f5c7944e2\": rpc error: code = NotFound desc = could not find container \"137e5bac17d77395775de70a2d298a009579db7cbc4622b89b475d7f5c7944e2\": container with ID starting with 137e5bac17d77395775de70a2d298a009579db7cbc4622b89b475d7f5c7944e2 not found: ID does not exist" Apr 28 19:39:36.672940 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.672916 2571 scope.go:117] "RemoveContainer" containerID="253b527ec5734e10ca1ce073258c0d60c8457bbcf72f23f4f3af67032df5da92" Apr 28 19:39:36.673182 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:39:36.673165 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"253b527ec5734e10ca1ce073258c0d60c8457bbcf72f23f4f3af67032df5da92\": container with ID starting with 253b527ec5734e10ca1ce073258c0d60c8457bbcf72f23f4f3af67032df5da92 not found: ID does not exist" containerID="253b527ec5734e10ca1ce073258c0d60c8457bbcf72f23f4f3af67032df5da92" Apr 28 19:39:36.673228 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:36.673188 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253b527ec5734e10ca1ce073258c0d60c8457bbcf72f23f4f3af67032df5da92"} err="failed to get container status \"253b527ec5734e10ca1ce073258c0d60c8457bbcf72f23f4f3af67032df5da92\": rpc error: code = NotFound desc = could not find container \"253b527ec5734e10ca1ce073258c0d60c8457bbcf72f23f4f3af67032df5da92\": container with ID starting with 
253b527ec5734e10ca1ce073258c0d60c8457bbcf72f23f4f3af67032df5da92 not found: ID does not exist" Apr 28 19:39:37.947066 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:37.947033 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" path="/var/lib/kubelet/pods/b92eeb7d-7365-453c-ab3c-89526b93cbc8/volumes" Apr 28 19:39:38.655413 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:38.655379 2571 generic.go:358] "Generic (PLEG): container finished" podID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerID="bc62deeb23c81e284f5b61d3ac7aae2aabdf7038279e2282a8c3478556f368c5" exitCode=0 Apr 28 19:39:38.655636 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:38.655459 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" event={"ID":"d5248e85-8cda-465b-8ce5-df0b2d192126","Type":"ContainerDied","Data":"bc62deeb23c81e284f5b61d3ac7aae2aabdf7038279e2282a8c3478556f368c5"} Apr 28 19:39:39.660173 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:39.660141 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" event={"ID":"d5248e85-8cda-465b-8ce5-df0b2d192126","Type":"ContainerStarted","Data":"4524f7d975e4b01138a9004d06e2b7c891245c5aad19d02bcb4ba4bd1a81e2e2"} Apr 28 19:39:39.660173 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:39.660178 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" event={"ID":"d5248e85-8cda-465b-8ce5-df0b2d192126","Type":"ContainerStarted","Data":"26d5ecfd8cfc1358b604890594062b62ad4282dff346a56eda4aceb6bc45beff"} Apr 28 19:39:39.660602 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:39.660383 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" Apr 28 19:39:39.680774 ip-10-0-139-128 
kubenswrapper[2571]: I0428 19:39:39.680725 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" podStartSLOduration=6.680710482 podStartE2EDuration="6.680710482s" podCreationTimestamp="2026-04-28 19:39:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:39:39.679586781 +0000 UTC m=+1396.310487727" watchObservedRunningTime="2026-04-28 19:39:39.680710482 +0000 UTC m=+1396.311611428" Apr 28 19:39:40.664168 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:40.664126 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" Apr 28 19:39:40.665394 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:40.665366 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 28 19:39:41.667854 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:41.667806 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 28 19:39:46.673315 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:46.673285 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" Apr 28 19:39:46.673887 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:46.673860 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 28 19:39:56.674137 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:39:56.674099 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 28 19:40:06.674810 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:06.674769 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 28 19:40:16.673857 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:16.673771 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 28 19:40:26.385553 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:26.385523 2571 scope.go:117] "RemoveContainer" containerID="7dcc4b9348a899fe894f327ebe80e1d955fc3ea4206604fc157904bc507e1cbd" Apr 28 19:40:26.393647 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:26.393627 2571 scope.go:117] "RemoveContainer" containerID="2f01516453e4ecdab977ab8bd59178d6c6e941c2e80a4c924dfa4093bda6c636" Apr 28 19:40:26.674446 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:26.674418 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" Apr 
28 19:40:34.760083 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:34.760038 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"] Apr 28 19:40:34.760574 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:34.760388 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="kserve-container" containerID="cri-o://26d5ecfd8cfc1358b604890594062b62ad4282dff346a56eda4aceb6bc45beff" gracePeriod=30 Apr 28 19:40:34.760574 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:34.760444 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="kube-rbac-proxy" containerID="cri-o://4524f7d975e4b01138a9004d06e2b7c891245c5aad19d02bcb4ba4bd1a81e2e2" gracePeriod=30 Apr 28 19:40:34.983159 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:34.983124 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69"] Apr 28 19:40:34.983443 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:34.983432 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="storage-initializer" Apr 28 19:40:34.983506 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:34.983445 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="storage-initializer" Apr 28 19:40:34.983506 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:34.983459 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="kube-rbac-proxy" Apr 28 19:40:34.983506 ip-10-0-139-128 kubenswrapper[2571]: I0428 
19:40:34.983465 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="kube-rbac-proxy" Apr 28 19:40:34.983506 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:34.983501 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="kserve-container" Apr 28 19:40:34.983506 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:34.983507 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="kserve-container" Apr 28 19:40:34.983666 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:34.983564 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="kserve-container" Apr 28 19:40:34.983666 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:34.983573 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b92eeb7d-7365-453c-ab3c-89526b93cbc8" containerName="kube-rbac-proxy" Apr 28 19:40:34.986452 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:34.986427 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:40:34.988409 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:34.988382 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-kube-rbac-proxy-sar-config\"" Apr 28 19:40:34.988556 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:34.988462 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-predictor-serving-cert\"" Apr 28 19:40:34.994171 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:34.994149 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69"] Apr 28 19:40:35.140666 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:35.140566 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-zzb69\" (UID: \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:40:35.140666 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:35.140639 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-zzb69\" (UID: \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:40:35.140666 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:35.140663 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rflsc\" (UniqueName: \"kubernetes.io/projected/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-kube-api-access-rflsc\") pod \"isvc-pmml-predictor-8bb578669-zzb69\" 
(UID: \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:40:35.140888 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:35.140718 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-zzb69\" (UID: \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:40:35.241979 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:35.241943 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-zzb69\" (UID: \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:40:35.242160 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:35.241994 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-zzb69\" (UID: \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:40:35.242160 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:35.242041 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-zzb69\" (UID: \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:40:35.242160 ip-10-0-139-128 
kubenswrapper[2571]: I0428 19:40:35.242058 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rflsc\" (UniqueName: \"kubernetes.io/projected/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-kube-api-access-rflsc\") pod \"isvc-pmml-predictor-8bb578669-zzb69\" (UID: \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:40:35.242514 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:35.242466 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-zzb69\" (UID: \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:40:35.242758 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:35.242739 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-zzb69\" (UID: \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:40:35.244501 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:35.244465 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-zzb69\" (UID: \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:40:35.249041 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:35.249019 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rflsc\" (UniqueName: 
\"kubernetes.io/projected/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-kube-api-access-rflsc\") pod \"isvc-pmml-predictor-8bb578669-zzb69\" (UID: \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:40:35.297907 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:35.297689 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:40:35.421266 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:35.421085 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69"] Apr 28 19:40:35.423928 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:40:35.423900 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f752d0a_b57f_483d_bfc8_fa9ebcd5d6d6.slice/crio-486582ef337281b6e41b8979c3a9fbebd890dd0332c09840cdae5035e5e8e871 WatchSource:0}: Error finding container 486582ef337281b6e41b8979c3a9fbebd890dd0332c09840cdae5035e5e8e871: Status 404 returned error can't find the container with id 486582ef337281b6e41b8979c3a9fbebd890dd0332c09840cdae5035e5e8e871 Apr 28 19:40:35.833989 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:35.833901 2571 generic.go:358] "Generic (PLEG): container finished" podID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerID="4524f7d975e4b01138a9004d06e2b7c891245c5aad19d02bcb4ba4bd1a81e2e2" exitCode=2 Apr 28 19:40:35.834355 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:35.833985 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" event={"ID":"d5248e85-8cda-465b-8ce5-df0b2d192126","Type":"ContainerDied","Data":"4524f7d975e4b01138a9004d06e2b7c891245c5aad19d02bcb4ba4bd1a81e2e2"} Apr 28 19:40:35.835252 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:35.835224 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" event={"ID":"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6","Type":"ContainerStarted","Data":"67f88642972ba14275f5d22dee3432cb748c4c97f531933388462bdb3aadf7b9"} Apr 28 19:40:35.835252 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:35.835255 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" event={"ID":"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6","Type":"ContainerStarted","Data":"486582ef337281b6e41b8979c3a9fbebd890dd0332c09840cdae5035e5e8e871"} Apr 28 19:40:36.668098 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:36.668057 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" Apr 28 19:40:36.674593 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:36.674561 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 28 19:40:37.597595 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.597572 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" Apr 28 19:40:37.763189 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.763107 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5248e85-8cda-465b-8ce5-df0b2d192126-proxy-tls\") pod \"d5248e85-8cda-465b-8ce5-df0b2d192126\" (UID: \"d5248e85-8cda-465b-8ce5-df0b2d192126\") " Apr 28 19:40:37.763189 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.763152 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5248e85-8cda-465b-8ce5-df0b2d192126-kserve-provision-location\") pod \"d5248e85-8cda-465b-8ce5-df0b2d192126\" (UID: \"d5248e85-8cda-465b-8ce5-df0b2d192126\") " Apr 28 19:40:37.763566 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.763194 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9kw6\" (UniqueName: \"kubernetes.io/projected/d5248e85-8cda-465b-8ce5-df0b2d192126-kube-api-access-x9kw6\") pod \"d5248e85-8cda-465b-8ce5-df0b2d192126\" (UID: \"d5248e85-8cda-465b-8ce5-df0b2d192126\") " Apr 28 19:40:37.763566 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.763260 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d5248e85-8cda-465b-8ce5-df0b2d192126-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"d5248e85-8cda-465b-8ce5-df0b2d192126\" (UID: \"d5248e85-8cda-465b-8ce5-df0b2d192126\") " Apr 28 19:40:37.763685 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.763660 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5248e85-8cda-465b-8ce5-df0b2d192126-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config") pod "d5248e85-8cda-465b-8ce5-df0b2d192126" (UID: "d5248e85-8cda-465b-8ce5-df0b2d192126"). InnerVolumeSpecName "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:40:37.765199 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.765163 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5248e85-8cda-465b-8ce5-df0b2d192126-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d5248e85-8cda-465b-8ce5-df0b2d192126" (UID: "d5248e85-8cda-465b-8ce5-df0b2d192126"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:40:37.765298 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.765231 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5248e85-8cda-465b-8ce5-df0b2d192126-kube-api-access-x9kw6" (OuterVolumeSpecName: "kube-api-access-x9kw6") pod "d5248e85-8cda-465b-8ce5-df0b2d192126" (UID: "d5248e85-8cda-465b-8ce5-df0b2d192126"). InnerVolumeSpecName "kube-api-access-x9kw6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:40:37.773287 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.773259 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5248e85-8cda-465b-8ce5-df0b2d192126-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d5248e85-8cda-465b-8ce5-df0b2d192126" (UID: "d5248e85-8cda-465b-8ce5-df0b2d192126"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:40:37.842680 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.842645 2571 generic.go:358] "Generic (PLEG): container finished" podID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerID="26d5ecfd8cfc1358b604890594062b62ad4282dff346a56eda4aceb6bc45beff" exitCode=0 Apr 28 19:40:37.842861 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.842695 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" event={"ID":"d5248e85-8cda-465b-8ce5-df0b2d192126","Type":"ContainerDied","Data":"26d5ecfd8cfc1358b604890594062b62ad4282dff346a56eda4aceb6bc45beff"} Apr 28 19:40:37.842861 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.842718 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" event={"ID":"d5248e85-8cda-465b-8ce5-df0b2d192126","Type":"ContainerDied","Data":"a8006f1772405072b48377a8eefc61d08c171cd48feed5a2ac82edca4dba547f"} Apr 28 19:40:37.842861 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.842734 2571 scope.go:117] "RemoveContainer" containerID="4524f7d975e4b01138a9004d06e2b7c891245c5aad19d02bcb4ba4bd1a81e2e2" Apr 28 19:40:37.842861 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.842742 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7" Apr 28 19:40:37.850775 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.850752 2571 scope.go:117] "RemoveContainer" containerID="26d5ecfd8cfc1358b604890594062b62ad4282dff346a56eda4aceb6bc45beff" Apr 28 19:40:37.858119 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.858099 2571 scope.go:117] "RemoveContainer" containerID="bc62deeb23c81e284f5b61d3ac7aae2aabdf7038279e2282a8c3478556f368c5" Apr 28 19:40:37.863272 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.863249 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"] Apr 28 19:40:37.864612 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.864589 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d5248e85-8cda-465b-8ce5-df0b2d192126-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:40:37.864692 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.864616 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5248e85-8cda-465b-8ce5-df0b2d192126-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:40:37.864692 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.864634 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5248e85-8cda-465b-8ce5-df0b2d192126-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:40:37.864692 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.864650 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x9kw6\" (UniqueName: \"kubernetes.io/projected/d5248e85-8cda-465b-8ce5-df0b2d192126-kube-api-access-x9kw6\") on node 
\"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:40:37.865881 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.865862 2571 scope.go:117] "RemoveContainer" containerID="4524f7d975e4b01138a9004d06e2b7c891245c5aad19d02bcb4ba4bd1a81e2e2" Apr 28 19:40:37.866152 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:40:37.866135 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4524f7d975e4b01138a9004d06e2b7c891245c5aad19d02bcb4ba4bd1a81e2e2\": container with ID starting with 4524f7d975e4b01138a9004d06e2b7c891245c5aad19d02bcb4ba4bd1a81e2e2 not found: ID does not exist" containerID="4524f7d975e4b01138a9004d06e2b7c891245c5aad19d02bcb4ba4bd1a81e2e2" Apr 28 19:40:37.866201 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.866162 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4524f7d975e4b01138a9004d06e2b7c891245c5aad19d02bcb4ba4bd1a81e2e2"} err="failed to get container status \"4524f7d975e4b01138a9004d06e2b7c891245c5aad19d02bcb4ba4bd1a81e2e2\": rpc error: code = NotFound desc = could not find container \"4524f7d975e4b01138a9004d06e2b7c891245c5aad19d02bcb4ba4bd1a81e2e2\": container with ID starting with 4524f7d975e4b01138a9004d06e2b7c891245c5aad19d02bcb4ba4bd1a81e2e2 not found: ID does not exist" Apr 28 19:40:37.866201 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.866181 2571 scope.go:117] "RemoveContainer" containerID="26d5ecfd8cfc1358b604890594062b62ad4282dff346a56eda4aceb6bc45beff" Apr 28 19:40:37.866412 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:40:37.866395 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d5ecfd8cfc1358b604890594062b62ad4282dff346a56eda4aceb6bc45beff\": container with ID starting with 26d5ecfd8cfc1358b604890594062b62ad4282dff346a56eda4aceb6bc45beff not found: ID does not exist" 
containerID="26d5ecfd8cfc1358b604890594062b62ad4282dff346a56eda4aceb6bc45beff" Apr 28 19:40:37.866458 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.866424 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d5ecfd8cfc1358b604890594062b62ad4282dff346a56eda4aceb6bc45beff"} err="failed to get container status \"26d5ecfd8cfc1358b604890594062b62ad4282dff346a56eda4aceb6bc45beff\": rpc error: code = NotFound desc = could not find container \"26d5ecfd8cfc1358b604890594062b62ad4282dff346a56eda4aceb6bc45beff\": container with ID starting with 26d5ecfd8cfc1358b604890594062b62ad4282dff346a56eda4aceb6bc45beff not found: ID does not exist" Apr 28 19:40:37.866458 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.866440 2571 scope.go:117] "RemoveContainer" containerID="bc62deeb23c81e284f5b61d3ac7aae2aabdf7038279e2282a8c3478556f368c5" Apr 28 19:40:37.866661 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:40:37.866646 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc62deeb23c81e284f5b61d3ac7aae2aabdf7038279e2282a8c3478556f368c5\": container with ID starting with bc62deeb23c81e284f5b61d3ac7aae2aabdf7038279e2282a8c3478556f368c5 not found: ID does not exist" containerID="bc62deeb23c81e284f5b61d3ac7aae2aabdf7038279e2282a8c3478556f368c5" Apr 28 19:40:37.866708 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.866666 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc62deeb23c81e284f5b61d3ac7aae2aabdf7038279e2282a8c3478556f368c5"} err="failed to get container status \"bc62deeb23c81e284f5b61d3ac7aae2aabdf7038279e2282a8c3478556f368c5\": rpc error: code = NotFound desc = could not find container \"bc62deeb23c81e284f5b61d3ac7aae2aabdf7038279e2282a8c3478556f368c5\": container with ID starting with bc62deeb23c81e284f5b61d3ac7aae2aabdf7038279e2282a8c3478556f368c5 not found: ID does not exist" Apr 28 
19:40:37.867236 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.867218 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jphz7"] Apr 28 19:40:37.946332 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:37.946300 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" path="/var/lib/kubelet/pods/d5248e85-8cda-465b-8ce5-df0b2d192126/volumes" Apr 28 19:40:39.850781 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:39.850749 2571 generic.go:358] "Generic (PLEG): container finished" podID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerID="67f88642972ba14275f5d22dee3432cb748c4c97f531933388462bdb3aadf7b9" exitCode=0 Apr 28 19:40:39.851170 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:39.850824 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" event={"ID":"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6","Type":"ContainerDied","Data":"67f88642972ba14275f5d22dee3432cb748c4c97f531933388462bdb3aadf7b9"} Apr 28 19:40:46.879175 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:46.879139 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" event={"ID":"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6","Type":"ContainerStarted","Data":"5cfde5c5ac03fe187840b9145ab9c75c2d1e739dfb211d63a6d296209fb276d3"} Apr 28 19:40:46.879175 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:46.879178 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" event={"ID":"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6","Type":"ContainerStarted","Data":"25e86a815ef5c251e6d81dd9f1d3c499ae366c5380996b1bb25aa83807c353d1"} Apr 28 19:40:46.879657 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:46.879460 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:40:46.879657 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:46.879502 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:40:46.880870 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:46.880845 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 28 19:40:46.899546 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:46.899474 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" podStartSLOduration=6.019062276 podStartE2EDuration="12.899456238s" podCreationTimestamp="2026-04-28 19:40:34 +0000 UTC" firstStartedPulling="2026-04-28 19:40:39.851995515 +0000 UTC m=+1456.482896439" lastFinishedPulling="2026-04-28 19:40:46.732389479 +0000 UTC m=+1463.363290401" observedRunningTime="2026-04-28 19:40:46.897044455 +0000 UTC m=+1463.527945436" watchObservedRunningTime="2026-04-28 19:40:46.899456238 +0000 UTC m=+1463.530357184" Apr 28 19:40:47.881916 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:47.881873 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 28 19:40:52.886499 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:52.886451 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:40:52.886988 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:40:52.886959 2571 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 28 19:41:02.887015 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:41:02.886977 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 28 19:41:12.887255 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:41:12.887214 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 28 19:41:22.887922 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:41:22.887878 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 28 19:41:23.933968 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:41:23.933942 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:41:23.936726 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:41:23.936704 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:41:32.887167 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:41:32.887124 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 28 19:41:42.887729 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:41:42.887643 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 28 19:41:52.887543 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:41:52.887500 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 28 19:42:02.887915 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:02.887876 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 28 19:42:12.887540 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:12.887510 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:42:16.261661 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:16.261626 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69"] Apr 28 19:42:16.262120 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:16.261944 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" 
podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kserve-container" containerID="cri-o://25e86a815ef5c251e6d81dd9f1d3c499ae366c5380996b1bb25aa83807c353d1" gracePeriod=30 Apr 28 19:42:16.262120 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:16.261984 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kube-rbac-proxy" containerID="cri-o://5cfde5c5ac03fe187840b9145ab9c75c2d1e739dfb211d63a6d296209fb276d3" gracePeriod=30 Apr 28 19:42:17.141354 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:17.141317 2571 generic.go:358] "Generic (PLEG): container finished" podID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerID="5cfde5c5ac03fe187840b9145ab9c75c2d1e739dfb211d63a6d296209fb276d3" exitCode=2 Apr 28 19:42:17.141550 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:17.141358 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" event={"ID":"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6","Type":"ContainerDied","Data":"5cfde5c5ac03fe187840b9145ab9c75c2d1e739dfb211d63a6d296209fb276d3"} Apr 28 19:42:17.882977 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:17.882935 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.38:8643/healthz\": dial tcp 10.134.0.38:8643: connect: connection refused" Apr 28 19:42:19.908631 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:19.908607 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:42:20.023094 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.023011 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-proxy-tls\") pod \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\" (UID: \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\") " Apr 28 19:42:20.023094 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.023074 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-kserve-provision-location\") pod \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\" (UID: \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\") " Apr 28 19:42:20.023295 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.023124 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rflsc\" (UniqueName: \"kubernetes.io/projected/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-kube-api-access-rflsc\") pod \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\" (UID: \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\") " Apr 28 19:42:20.023295 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.023153 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\" (UID: \"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6\") " Apr 28 19:42:20.023683 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.023430 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" (UID: 
"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:42:20.023683 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.023631 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-isvc-pmml-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-kube-rbac-proxy-sar-config") pod "4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" (UID: "4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6"). InnerVolumeSpecName "isvc-pmml-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:42:20.025269 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.025244 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" (UID: "4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:42:20.025269 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.025253 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-kube-api-access-rflsc" (OuterVolumeSpecName: "kube-api-access-rflsc") pod "4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" (UID: "4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6"). InnerVolumeSpecName "kube-api-access-rflsc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:42:20.124795 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.124698 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:42:20.124795 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.124725 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:42:20.124795 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.124737 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rflsc\" (UniqueName: \"kubernetes.io/projected/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-kube-api-access-rflsc\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:42:20.124795 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.124747 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6-isvc-pmml-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:42:20.152304 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.152272 2571 generic.go:358] "Generic (PLEG): container finished" podID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerID="25e86a815ef5c251e6d81dd9f1d3c499ae366c5380996b1bb25aa83807c353d1" exitCode=0 Apr 28 19:42:20.152428 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.152324 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" event={"ID":"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6","Type":"ContainerDied","Data":"25e86a815ef5c251e6d81dd9f1d3c499ae366c5380996b1bb25aa83807c353d1"} Apr 28 
19:42:20.152428 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.152358 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" event={"ID":"4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6","Type":"ContainerDied","Data":"486582ef337281b6e41b8979c3a9fbebd890dd0332c09840cdae5035e5e8e871"} Apr 28 19:42:20.152428 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.152357 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69" Apr 28 19:42:20.152428 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.152370 2571 scope.go:117] "RemoveContainer" containerID="5cfde5c5ac03fe187840b9145ab9c75c2d1e739dfb211d63a6d296209fb276d3" Apr 28 19:42:20.161918 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.161749 2571 scope.go:117] "RemoveContainer" containerID="25e86a815ef5c251e6d81dd9f1d3c499ae366c5380996b1bb25aa83807c353d1" Apr 28 19:42:20.168998 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.168983 2571 scope.go:117] "RemoveContainer" containerID="67f88642972ba14275f5d22dee3432cb748c4c97f531933388462bdb3aadf7b9" Apr 28 19:42:20.175331 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.175306 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69"] Apr 28 19:42:20.176072 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.176056 2571 scope.go:117] "RemoveContainer" containerID="5cfde5c5ac03fe187840b9145ab9c75c2d1e739dfb211d63a6d296209fb276d3" Apr 28 19:42:20.176312 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:42:20.176287 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfde5c5ac03fe187840b9145ab9c75c2d1e739dfb211d63a6d296209fb276d3\": container with ID starting with 5cfde5c5ac03fe187840b9145ab9c75c2d1e739dfb211d63a6d296209fb276d3 not found: ID does not exist" 
containerID="5cfde5c5ac03fe187840b9145ab9c75c2d1e739dfb211d63a6d296209fb276d3" Apr 28 19:42:20.176374 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.176319 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfde5c5ac03fe187840b9145ab9c75c2d1e739dfb211d63a6d296209fb276d3"} err="failed to get container status \"5cfde5c5ac03fe187840b9145ab9c75c2d1e739dfb211d63a6d296209fb276d3\": rpc error: code = NotFound desc = could not find container \"5cfde5c5ac03fe187840b9145ab9c75c2d1e739dfb211d63a6d296209fb276d3\": container with ID starting with 5cfde5c5ac03fe187840b9145ab9c75c2d1e739dfb211d63a6d296209fb276d3 not found: ID does not exist" Apr 28 19:42:20.176374 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.176341 2571 scope.go:117] "RemoveContainer" containerID="25e86a815ef5c251e6d81dd9f1d3c499ae366c5380996b1bb25aa83807c353d1" Apr 28 19:42:20.176612 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:42:20.176588 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25e86a815ef5c251e6d81dd9f1d3c499ae366c5380996b1bb25aa83807c353d1\": container with ID starting with 25e86a815ef5c251e6d81dd9f1d3c499ae366c5380996b1bb25aa83807c353d1 not found: ID does not exist" containerID="25e86a815ef5c251e6d81dd9f1d3c499ae366c5380996b1bb25aa83807c353d1" Apr 28 19:42:20.176710 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.176617 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25e86a815ef5c251e6d81dd9f1d3c499ae366c5380996b1bb25aa83807c353d1"} err="failed to get container status \"25e86a815ef5c251e6d81dd9f1d3c499ae366c5380996b1bb25aa83807c353d1\": rpc error: code = NotFound desc = could not find container \"25e86a815ef5c251e6d81dd9f1d3c499ae366c5380996b1bb25aa83807c353d1\": container with ID starting with 25e86a815ef5c251e6d81dd9f1d3c499ae366c5380996b1bb25aa83807c353d1 not found: ID does not exist" Apr 28 
19:42:20.176710 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.176634 2571 scope.go:117] "RemoveContainer" containerID="67f88642972ba14275f5d22dee3432cb748c4c97f531933388462bdb3aadf7b9" Apr 28 19:42:20.176992 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:42:20.176959 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67f88642972ba14275f5d22dee3432cb748c4c97f531933388462bdb3aadf7b9\": container with ID starting with 67f88642972ba14275f5d22dee3432cb748c4c97f531933388462bdb3aadf7b9 not found: ID does not exist" containerID="67f88642972ba14275f5d22dee3432cb748c4c97f531933388462bdb3aadf7b9" Apr 28 19:42:20.177098 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.177006 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67f88642972ba14275f5d22dee3432cb748c4c97f531933388462bdb3aadf7b9"} err="failed to get container status \"67f88642972ba14275f5d22dee3432cb748c4c97f531933388462bdb3aadf7b9\": rpc error: code = NotFound desc = could not find container \"67f88642972ba14275f5d22dee3432cb748c4c97f531933388462bdb3aadf7b9\": container with ID starting with 67f88642972ba14275f5d22dee3432cb748c4c97f531933388462bdb3aadf7b9 not found: ID does not exist" Apr 28 19:42:20.179067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:20.179048 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-zzb69"] Apr 28 19:42:21.947325 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:42:21.947290 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" path="/var/lib/kubelet/pods/4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6/volumes" Apr 28 19:45:33.586205 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.586165 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82"] Apr 28 19:45:33.586934 ip-10-0-139-128 
kubenswrapper[2571]: I0428 19:45:33.586908 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="kserve-container" Apr 28 19:45:33.587043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.586943 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="kserve-container" Apr 28 19:45:33.587043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.586963 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="kube-rbac-proxy" Apr 28 19:45:33.587043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.586972 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="kube-rbac-proxy" Apr 28 19:45:33.587043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.586999 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="storage-initializer" Apr 28 19:45:33.587043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.587008 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="storage-initializer" Apr 28 19:45:33.587043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.587025 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kserve-container" Apr 28 19:45:33.587043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.587033 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kserve-container" Apr 28 19:45:33.587390 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.587063 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" 
containerName="storage-initializer" Apr 28 19:45:33.587390 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.587073 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="storage-initializer" Apr 28 19:45:33.587390 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.587091 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kube-rbac-proxy" Apr 28 19:45:33.587390 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.587105 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kube-rbac-proxy" Apr 28 19:45:33.587390 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.587242 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="kserve-container" Apr 28 19:45:33.587390 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.587267 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kserve-container" Apr 28 19:45:33.587390 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.587281 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f752d0a-b57f-483d-bfc8-fa9ebcd5d6d6" containerName="kube-rbac-proxy" Apr 28 19:45:33.587390 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.587301 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5248e85-8cda-465b-8ce5-df0b2d192126" containerName="kube-rbac-proxy" Apr 28 19:45:33.591716 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.591694 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:45:33.594701 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.594677 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-77d500-kube-rbac-proxy-sar-config\"" Apr 28 19:45:33.594901 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.594696 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-44jch\"" Apr 28 19:45:33.594995 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.594709 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 28 19:45:33.594995 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.594733 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-77d500-predictor-serving-cert\"" Apr 28 19:45:33.594995 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.594745 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 28 19:45:33.597440 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.597420 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82"] Apr 28 19:45:33.692911 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.692874 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chf2c\" (UniqueName: \"kubernetes.io/projected/24d79a92-4318-433f-88da-e160ee5a0e71-kube-api-access-chf2c\") pod \"isvc-primary-77d500-predictor-8d9ffc784-l4s82\" (UID: \"24d79a92-4318-433f-88da-e160ee5a0e71\") " pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:45:33.692911 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.692915 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-77d500-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/24d79a92-4318-433f-88da-e160ee5a0e71-isvc-primary-77d500-kube-rbac-proxy-sar-config\") pod \"isvc-primary-77d500-predictor-8d9ffc784-l4s82\" (UID: \"24d79a92-4318-433f-88da-e160ee5a0e71\") " pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:45:33.693155 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.692934 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24d79a92-4318-433f-88da-e160ee5a0e71-proxy-tls\") pod \"isvc-primary-77d500-predictor-8d9ffc784-l4s82\" (UID: \"24d79a92-4318-433f-88da-e160ee5a0e71\") " pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:45:33.693155 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.693015 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24d79a92-4318-433f-88da-e160ee5a0e71-kserve-provision-location\") pod \"isvc-primary-77d500-predictor-8d9ffc784-l4s82\" (UID: \"24d79a92-4318-433f-88da-e160ee5a0e71\") " pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:45:33.794086 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.794034 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24d79a92-4318-433f-88da-e160ee5a0e71-kserve-provision-location\") pod \"isvc-primary-77d500-predictor-8d9ffc784-l4s82\" (UID: \"24d79a92-4318-433f-88da-e160ee5a0e71\") " pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:45:33.794296 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.794116 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-chf2c\" (UniqueName: \"kubernetes.io/projected/24d79a92-4318-433f-88da-e160ee5a0e71-kube-api-access-chf2c\") pod \"isvc-primary-77d500-predictor-8d9ffc784-l4s82\" (UID: \"24d79a92-4318-433f-88da-e160ee5a0e71\") " pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:45:33.794296 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.794139 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-77d500-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/24d79a92-4318-433f-88da-e160ee5a0e71-isvc-primary-77d500-kube-rbac-proxy-sar-config\") pod \"isvc-primary-77d500-predictor-8d9ffc784-l4s82\" (UID: \"24d79a92-4318-433f-88da-e160ee5a0e71\") " pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:45:33.794296 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.794160 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24d79a92-4318-433f-88da-e160ee5a0e71-proxy-tls\") pod \"isvc-primary-77d500-predictor-8d9ffc784-l4s82\" (UID: \"24d79a92-4318-433f-88da-e160ee5a0e71\") " pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:45:33.794544 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.794522 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24d79a92-4318-433f-88da-e160ee5a0e71-kserve-provision-location\") pod \"isvc-primary-77d500-predictor-8d9ffc784-l4s82\" (UID: \"24d79a92-4318-433f-88da-e160ee5a0e71\") " pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:45:33.794872 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.794845 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-77d500-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/24d79a92-4318-433f-88da-e160ee5a0e71-isvc-primary-77d500-kube-rbac-proxy-sar-config\") pod \"isvc-primary-77d500-predictor-8d9ffc784-l4s82\" (UID: \"24d79a92-4318-433f-88da-e160ee5a0e71\") " pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:45:33.796647 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.796630 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24d79a92-4318-433f-88da-e160ee5a0e71-proxy-tls\") pod \"isvc-primary-77d500-predictor-8d9ffc784-l4s82\" (UID: \"24d79a92-4318-433f-88da-e160ee5a0e71\") " pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:45:33.802361 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.802341 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chf2c\" (UniqueName: \"kubernetes.io/projected/24d79a92-4318-433f-88da-e160ee5a0e71-kube-api-access-chf2c\") pod \"isvc-primary-77d500-predictor-8d9ffc784-l4s82\" (UID: \"24d79a92-4318-433f-88da-e160ee5a0e71\") " pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:45:33.903542 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:33.903507 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:45:34.029284 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:34.029255 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82"] Apr 28 19:45:34.031006 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:45:34.030975 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24d79a92_4318_433f_88da_e160ee5a0e71.slice/crio-c43f83d98d2233f552da2c4837233dcd9af62fdc30cb56eb77c3c3500b5377dd WatchSource:0}: Error finding container c43f83d98d2233f552da2c4837233dcd9af62fdc30cb56eb77c3c3500b5377dd: Status 404 returned error can't find the container with id c43f83d98d2233f552da2c4837233dcd9af62fdc30cb56eb77c3c3500b5377dd Apr 28 19:45:34.032890 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:34.032872 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:45:34.728124 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:34.728091 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" event={"ID":"24d79a92-4318-433f-88da-e160ee5a0e71","Type":"ContainerStarted","Data":"316b6164c498daa2fa8a5e1cf4fc3961b9d58a78007efda75e0b6de9efdfa6f7"} Apr 28 19:45:34.728124 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:34.728126 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" event={"ID":"24d79a92-4318-433f-88da-e160ee5a0e71","Type":"ContainerStarted","Data":"c43f83d98d2233f552da2c4837233dcd9af62fdc30cb56eb77c3c3500b5377dd"} Apr 28 19:45:38.745458 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:38.745423 2571 generic.go:358] "Generic (PLEG): container finished" podID="24d79a92-4318-433f-88da-e160ee5a0e71" 
containerID="316b6164c498daa2fa8a5e1cf4fc3961b9d58a78007efda75e0b6de9efdfa6f7" exitCode=0 Apr 28 19:45:38.745893 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:38.745510 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" event={"ID":"24d79a92-4318-433f-88da-e160ee5a0e71","Type":"ContainerDied","Data":"316b6164c498daa2fa8a5e1cf4fc3961b9d58a78007efda75e0b6de9efdfa6f7"} Apr 28 19:45:39.750476 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:39.750442 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" event={"ID":"24d79a92-4318-433f-88da-e160ee5a0e71","Type":"ContainerStarted","Data":"aca05f80cf7a91854bdc46ce52421e7e4add0b4c9148344c6d0d8641f5714f84"} Apr 28 19:45:39.750476 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:39.750507 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" event={"ID":"24d79a92-4318-433f-88da-e160ee5a0e71","Type":"ContainerStarted","Data":"cf5b5368991ca1f9e836c47a465ed5507b5320467ae9327e686bbeed513cb86c"} Apr 28 19:45:39.750943 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:39.750826 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:45:39.750997 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:39.750970 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:45:39.752275 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:39.752250 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection 
refused" Apr 28 19:45:39.782920 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:39.782872 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" podStartSLOduration=6.782859343 podStartE2EDuration="6.782859343s" podCreationTimestamp="2026-04-28 19:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:45:39.781760185 +0000 UTC m=+1756.412661131" watchObservedRunningTime="2026-04-28 19:45:39.782859343 +0000 UTC m=+1756.413760290" Apr 28 19:45:40.753152 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:40.753108 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 28 19:45:41.756674 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:41.756636 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 28 19:45:46.761532 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:46.761474 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:45:46.762016 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:46.761989 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 28 19:45:56.762107 
ip-10-0-139-128 kubenswrapper[2571]: I0428 19:45:56.762061 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 28 19:46:06.761993 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:06.761952 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 28 19:46:16.762257 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:16.762203 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 28 19:46:23.958338 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:23.958311 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:46:23.963572 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:23.963549 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:46:26.762675 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:26.762631 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 28 19:46:36.762741 
ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:36.762696 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 28 19:46:46.762973 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:46.762943 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:46:52.882907 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:52.882872 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97"] Apr 28 19:46:52.886400 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:52.886379 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:46:52.888188 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:52.888164 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 28 19:46:52.888305 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:52.888164 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-77d500\"" Apr 28 19:46:52.888430 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:52.888417 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-77d500-predictor-serving-cert\"" Apr 28 19:46:52.888617 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:52.888594 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-77d500-kube-rbac-proxy-sar-config\"" Apr 28 19:46:52.888713 ip-10-0-139-128 kubenswrapper[2571]: I0428 
19:46:52.888695 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-77d500-dockercfg-9sb2q\"" Apr 28 19:46:52.897523 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:52.897491 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97"] Apr 28 19:46:52.939590 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:52.939558 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/17d692a4-ac88-467f-97fe-4ec44b36dbb0-proxy-tls\") pod \"isvc-secondary-77d500-predictor-7cbd677c59-wpc97\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:46:52.939746 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:52.939600 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17d692a4-ac88-467f-97fe-4ec44b36dbb0-kserve-provision-location\") pod \"isvc-secondary-77d500-predictor-7cbd677c59-wpc97\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:46:52.939746 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:52.939689 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwbfg\" (UniqueName: \"kubernetes.io/projected/17d692a4-ac88-467f-97fe-4ec44b36dbb0-kube-api-access-fwbfg\") pod \"isvc-secondary-77d500-predictor-7cbd677c59-wpc97\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:46:52.939746 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:52.939735 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/17d692a4-ac88-467f-97fe-4ec44b36dbb0-cabundle-cert\") pod \"isvc-secondary-77d500-predictor-7cbd677c59-wpc97\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:46:52.939879 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:52.939756 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-77d500-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/17d692a4-ac88-467f-97fe-4ec44b36dbb0-isvc-secondary-77d500-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-77d500-predictor-7cbd677c59-wpc97\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:46:53.040402 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:53.040370 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/17d692a4-ac88-467f-97fe-4ec44b36dbb0-proxy-tls\") pod \"isvc-secondary-77d500-predictor-7cbd677c59-wpc97\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:46:53.040603 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:53.040416 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17d692a4-ac88-467f-97fe-4ec44b36dbb0-kserve-provision-location\") pod \"isvc-secondary-77d500-predictor-7cbd677c59-wpc97\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:46:53.040603 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:53.040502 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwbfg\" (UniqueName: 
\"kubernetes.io/projected/17d692a4-ac88-467f-97fe-4ec44b36dbb0-kube-api-access-fwbfg\") pod \"isvc-secondary-77d500-predictor-7cbd677c59-wpc97\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:46:53.040603 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:53.040537 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/17d692a4-ac88-467f-97fe-4ec44b36dbb0-cabundle-cert\") pod \"isvc-secondary-77d500-predictor-7cbd677c59-wpc97\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:46:53.040603 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:53.040572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-77d500-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/17d692a4-ac88-467f-97fe-4ec44b36dbb0-isvc-secondary-77d500-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-77d500-predictor-7cbd677c59-wpc97\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:46:53.040934 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:53.040907 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17d692a4-ac88-467f-97fe-4ec44b36dbb0-kserve-provision-location\") pod \"isvc-secondary-77d500-predictor-7cbd677c59-wpc97\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:46:53.041276 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:53.041255 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-77d500-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/17d692a4-ac88-467f-97fe-4ec44b36dbb0-isvc-secondary-77d500-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-77d500-predictor-7cbd677c59-wpc97\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:46:53.041359 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:53.041300 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/17d692a4-ac88-467f-97fe-4ec44b36dbb0-cabundle-cert\") pod \"isvc-secondary-77d500-predictor-7cbd677c59-wpc97\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:46:53.043044 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:53.043022 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/17d692a4-ac88-467f-97fe-4ec44b36dbb0-proxy-tls\") pod \"isvc-secondary-77d500-predictor-7cbd677c59-wpc97\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:46:53.050270 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:53.050242 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwbfg\" (UniqueName: \"kubernetes.io/projected/17d692a4-ac88-467f-97fe-4ec44b36dbb0-kube-api-access-fwbfg\") pod \"isvc-secondary-77d500-predictor-7cbd677c59-wpc97\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:46:53.202025 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:53.201985 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:46:53.335222 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:53.335187 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97"] Apr 28 19:46:53.338230 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:46:53.338190 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17d692a4_ac88_467f_97fe_4ec44b36dbb0.slice/crio-5c2bbeb3537fe8729c51d8e786e34906f6d34ab470d5bc53bc83c934650862df WatchSource:0}: Error finding container 5c2bbeb3537fe8729c51d8e786e34906f6d34ab470d5bc53bc83c934650862df: Status 404 returned error can't find the container with id 5c2bbeb3537fe8729c51d8e786e34906f6d34ab470d5bc53bc83c934650862df Apr 28 19:46:53.972503 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:53.972448 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" event={"ID":"17d692a4-ac88-467f-97fe-4ec44b36dbb0","Type":"ContainerStarted","Data":"6721d0610d4ed4d7fa4c7b1163e93eee823b1bea49be1af60c47c700b6ff7a09"} Apr 28 19:46:53.972503 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:53.972505 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" event={"ID":"17d692a4-ac88-467f-97fe-4ec44b36dbb0","Type":"ContainerStarted","Data":"5c2bbeb3537fe8729c51d8e786e34906f6d34ab470d5bc53bc83c934650862df"} Apr 28 19:46:57.985093 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:57.985067 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-77d500-predictor-7cbd677c59-wpc97_17d692a4-ac88-467f-97fe-4ec44b36dbb0/storage-initializer/0.log" Apr 28 19:46:57.985459 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:57.985100 2571 generic.go:358] "Generic (PLEG): 
container finished" podID="17d692a4-ac88-467f-97fe-4ec44b36dbb0" containerID="6721d0610d4ed4d7fa4c7b1163e93eee823b1bea49be1af60c47c700b6ff7a09" exitCode=1 Apr 28 19:46:57.985459 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:57.985181 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" event={"ID":"17d692a4-ac88-467f-97fe-4ec44b36dbb0","Type":"ContainerDied","Data":"6721d0610d4ed4d7fa4c7b1163e93eee823b1bea49be1af60c47c700b6ff7a09"} Apr 28 19:46:58.989865 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:58.989836 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-77d500-predictor-7cbd677c59-wpc97_17d692a4-ac88-467f-97fe-4ec44b36dbb0/storage-initializer/0.log" Apr 28 19:46:58.990247 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:46:58.989935 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" event={"ID":"17d692a4-ac88-467f-97fe-4ec44b36dbb0","Type":"ContainerStarted","Data":"f04648ee1ccf724709ecd8e8f1f144ff60a7d6ed1e51913ee69ae7ca7299222f"} Apr 28 19:47:01.999092 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:01.999061 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-77d500-predictor-7cbd677c59-wpc97_17d692a4-ac88-467f-97fe-4ec44b36dbb0/storage-initializer/1.log" Apr 28 19:47:01.999505 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:01.999420 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-77d500-predictor-7cbd677c59-wpc97_17d692a4-ac88-467f-97fe-4ec44b36dbb0/storage-initializer/0.log" Apr 28 19:47:01.999505 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:01.999451 2571 generic.go:358] "Generic (PLEG): container finished" podID="17d692a4-ac88-467f-97fe-4ec44b36dbb0" containerID="f04648ee1ccf724709ecd8e8f1f144ff60a7d6ed1e51913ee69ae7ca7299222f" exitCode=1 Apr 28 
19:47:01.999591 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:01.999528 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" event={"ID":"17d692a4-ac88-467f-97fe-4ec44b36dbb0","Type":"ContainerDied","Data":"f04648ee1ccf724709ecd8e8f1f144ff60a7d6ed1e51913ee69ae7ca7299222f"} Apr 28 19:47:01.999591 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:01.999579 2571 scope.go:117] "RemoveContainer" containerID="6721d0610d4ed4d7fa4c7b1163e93eee823b1bea49be1af60c47c700b6ff7a09" Apr 28 19:47:01.999972 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:01.999938 2571 scope.go:117] "RemoveContainer" containerID="6721d0610d4ed4d7fa4c7b1163e93eee823b1bea49be1af60c47c700b6ff7a09" Apr 28 19:47:02.010147 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:47:02.010119 2571 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-77d500-predictor-7cbd677c59-wpc97_kserve-ci-e2e-test_17d692a4-ac88-467f-97fe-4ec44b36dbb0_0 in pod sandbox 5c2bbeb3537fe8729c51d8e786e34906f6d34ab470d5bc53bc83c934650862df from index: no such id: '6721d0610d4ed4d7fa4c7b1163e93eee823b1bea49be1af60c47c700b6ff7a09'" containerID="6721d0610d4ed4d7fa4c7b1163e93eee823b1bea49be1af60c47c700b6ff7a09" Apr 28 19:47:02.010232 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:02.010158 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6721d0610d4ed4d7fa4c7b1163e93eee823b1bea49be1af60c47c700b6ff7a09"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-77d500-predictor-7cbd677c59-wpc97_kserve-ci-e2e-test_17d692a4-ac88-467f-97fe-4ec44b36dbb0_0 in pod sandbox 5c2bbeb3537fe8729c51d8e786e34906f6d34ab470d5bc53bc83c934650862df from index: no such id: '6721d0610d4ed4d7fa4c7b1163e93eee823b1bea49be1af60c47c700b6ff7a09'" Apr 28 19:47:02.010407 ip-10-0-139-128 
kubenswrapper[2571]: E0428 19:47:02.010384 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-77d500-predictor-7cbd677c59-wpc97_kserve-ci-e2e-test(17d692a4-ac88-467f-97fe-4ec44b36dbb0)\"" pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" podUID="17d692a4-ac88-467f-97fe-4ec44b36dbb0" Apr 28 19:47:03.003875 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:03.003849 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-77d500-predictor-7cbd677c59-wpc97_17d692a4-ac88-467f-97fe-4ec44b36dbb0/storage-initializer/1.log" Apr 28 19:47:08.944378 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:08.944333 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97"] Apr 28 19:47:08.987934 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:08.987885 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82"] Apr 28 19:47:08.988355 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:08.988225 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kserve-container" containerID="cri-o://cf5b5368991ca1f9e836c47a465ed5507b5320467ae9327e686bbeed513cb86c" gracePeriod=30 Apr 28 19:47:08.988355 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:08.988275 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kube-rbac-proxy" containerID="cri-o://aca05f80cf7a91854bdc46ce52421e7e4add0b4c9148344c6d0d8641f5714f84" gracePeriod=30 Apr 
28 19:47:09.045172 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.045145 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm"] Apr 28 19:47:09.049943 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.049921 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:09.051954 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.051927 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-19fe47-dockercfg-55hfj\"" Apr 28 19:47:09.052084 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.051931 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-19fe47-kube-rbac-proxy-sar-config\"" Apr 28 19:47:09.052084 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.051995 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-19fe47-predictor-serving-cert\"" Apr 28 19:47:09.052084 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.052003 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-19fe47\"" Apr 28 19:47:09.059999 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.059973 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm"] Apr 28 19:47:09.107689 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.107670 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-77d500-predictor-7cbd677c59-wpc97_17d692a4-ac88-467f-97fe-4ec44b36dbb0/storage-initializer/1.log" Apr 28 19:47:09.107796 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.107732 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:47:09.175192 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.175163 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/17d692a4-ac88-467f-97fe-4ec44b36dbb0-cabundle-cert\") pod \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " Apr 28 19:47:09.175381 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.175214 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwbfg\" (UniqueName: \"kubernetes.io/projected/17d692a4-ac88-467f-97fe-4ec44b36dbb0-kube-api-access-fwbfg\") pod \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " Apr 28 19:47:09.175381 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.175244 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17d692a4-ac88-467f-97fe-4ec44b36dbb0-kserve-provision-location\") pod \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " Apr 28 19:47:09.175381 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.175275 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/17d692a4-ac88-467f-97fe-4ec44b36dbb0-proxy-tls\") pod \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " Apr 28 19:47:09.175381 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.175292 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-77d500-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/17d692a4-ac88-467f-97fe-4ec44b36dbb0-isvc-secondary-77d500-kube-rbac-proxy-sar-config\") pod 
\"17d692a4-ac88-467f-97fe-4ec44b36dbb0\" (UID: \"17d692a4-ac88-467f-97fe-4ec44b36dbb0\") " Apr 28 19:47:09.175649 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.175437 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42088543-26c0-4162-a75a-dfc5448d4af4-proxy-tls\") pod \"isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:09.175649 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.175475 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6ksb\" (UniqueName: \"kubernetes.io/projected/42088543-26c0-4162-a75a-dfc5448d4af4-kube-api-access-x6ksb\") pod \"isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:09.175649 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.175576 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17d692a4-ac88-467f-97fe-4ec44b36dbb0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "17d692a4-ac88-467f-97fe-4ec44b36dbb0" (UID: "17d692a4-ac88-467f-97fe-4ec44b36dbb0"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:47:09.175649 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.175613 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42088543-26c0-4162-a75a-dfc5448d4af4-kserve-provision-location\") pod \"isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:09.175858 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.175658 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/42088543-26c0-4162-a75a-dfc5448d4af4-cabundle-cert\") pod \"isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:09.175858 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.175696 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17d692a4-ac88-467f-97fe-4ec44b36dbb0-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "17d692a4-ac88-467f-97fe-4ec44b36dbb0" (UID: "17d692a4-ac88-467f-97fe-4ec44b36dbb0"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:47:09.175858 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.175720 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-19fe47-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42088543-26c0-4162-a75a-dfc5448d4af4-isvc-init-fail-19fe47-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:09.175858 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.175783 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17d692a4-ac88-467f-97fe-4ec44b36dbb0-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:47:09.175858 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.175784 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17d692a4-ac88-467f-97fe-4ec44b36dbb0-isvc-secondary-77d500-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-77d500-kube-rbac-proxy-sar-config") pod "17d692a4-ac88-467f-97fe-4ec44b36dbb0" (UID: "17d692a4-ac88-467f-97fe-4ec44b36dbb0"). InnerVolumeSpecName "isvc-secondary-77d500-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:47:09.175858 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.175800 2571 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/17d692a4-ac88-467f-97fe-4ec44b36dbb0-cabundle-cert\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:47:09.177444 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.177424 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17d692a4-ac88-467f-97fe-4ec44b36dbb0-kube-api-access-fwbfg" (OuterVolumeSpecName: "kube-api-access-fwbfg") pod "17d692a4-ac88-467f-97fe-4ec44b36dbb0" (UID: "17d692a4-ac88-467f-97fe-4ec44b36dbb0"). InnerVolumeSpecName "kube-api-access-fwbfg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:47:09.177611 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.177595 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17d692a4-ac88-467f-97fe-4ec44b36dbb0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "17d692a4-ac88-467f-97fe-4ec44b36dbb0" (UID: "17d692a4-ac88-467f-97fe-4ec44b36dbb0"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:47:09.277020 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.276930 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/42088543-26c0-4162-a75a-dfc5448d4af4-cabundle-cert\") pod \"isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:09.277020 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.276992 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-19fe47-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42088543-26c0-4162-a75a-dfc5448d4af4-isvc-init-fail-19fe47-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:09.277231 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.277050 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42088543-26c0-4162-a75a-dfc5448d4af4-proxy-tls\") pod \"isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:09.277231 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.277085 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6ksb\" (UniqueName: \"kubernetes.io/projected/42088543-26c0-4162-a75a-dfc5448d4af4-kube-api-access-x6ksb\") pod \"isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:09.277231 ip-10-0-139-128 kubenswrapper[2571]: 
I0428 19:47:09.277141 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42088543-26c0-4162-a75a-dfc5448d4af4-kserve-provision-location\") pod \"isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:09.277231 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.277184 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/17d692a4-ac88-467f-97fe-4ec44b36dbb0-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:47:09.277231 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.277200 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-77d500-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/17d692a4-ac88-467f-97fe-4ec44b36dbb0-isvc-secondary-77d500-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:47:09.277231 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.277219 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fwbfg\" (UniqueName: \"kubernetes.io/projected/17d692a4-ac88-467f-97fe-4ec44b36dbb0-kube-api-access-fwbfg\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:47:09.277557 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.277535 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42088543-26c0-4162-a75a-dfc5448d4af4-kserve-provision-location\") pod \"isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:09.277756 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.277736 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-19fe47-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42088543-26c0-4162-a75a-dfc5448d4af4-isvc-init-fail-19fe47-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:09.277794 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.277745 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/42088543-26c0-4162-a75a-dfc5448d4af4-cabundle-cert\") pod \"isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:09.279512 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.279475 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42088543-26c0-4162-a75a-dfc5448d4af4-proxy-tls\") pod \"isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:09.285885 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.285856 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6ksb\" (UniqueName: \"kubernetes.io/projected/42088543-26c0-4162-a75a-dfc5448d4af4-kube-api-access-x6ksb\") pod \"isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:09.361994 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.361948 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:09.494193 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:09.494151 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm"] Apr 28 19:47:09.497205 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:47:09.497176 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42088543_26c0_4162_a75a_dfc5448d4af4.slice/crio-6c598c0296d38a92db465338217e77a7b4aefd375ed46f130275aeadd0802334 WatchSource:0}: Error finding container 6c598c0296d38a92db465338217e77a7b4aefd375ed46f130275aeadd0802334: Status 404 returned error can't find the container with id 6c598c0296d38a92db465338217e77a7b4aefd375ed46f130275aeadd0802334 Apr 28 19:47:10.029840 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:10.029806 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-77d500-predictor-7cbd677c59-wpc97_17d692a4-ac88-467f-97fe-4ec44b36dbb0/storage-initializer/1.log" Apr 28 19:47:10.030278 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:10.029924 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" event={"ID":"17d692a4-ac88-467f-97fe-4ec44b36dbb0","Type":"ContainerDied","Data":"5c2bbeb3537fe8729c51d8e786e34906f6d34ab470d5bc53bc83c934650862df"} Apr 28 19:47:10.030278 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:10.029971 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97" Apr 28 19:47:10.030278 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:10.029976 2571 scope.go:117] "RemoveContainer" containerID="f04648ee1ccf724709ecd8e8f1f144ff60a7d6ed1e51913ee69ae7ca7299222f" Apr 28 19:47:10.031619 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:10.031588 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" event={"ID":"42088543-26c0-4162-a75a-dfc5448d4af4","Type":"ContainerStarted","Data":"8fde979eccb580836214f0dcdc9ea6d5c2d6f7ebec50c33b35fba0208c0406f0"} Apr 28 19:47:10.031748 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:10.031627 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" event={"ID":"42088543-26c0-4162-a75a-dfc5448d4af4","Type":"ContainerStarted","Data":"6c598c0296d38a92db465338217e77a7b4aefd375ed46f130275aeadd0802334"} Apr 28 19:47:10.034037 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:10.034014 2571 generic.go:358] "Generic (PLEG): container finished" podID="24d79a92-4318-433f-88da-e160ee5a0e71" containerID="aca05f80cf7a91854bdc46ce52421e7e4add0b4c9148344c6d0d8641f5714f84" exitCode=2 Apr 28 19:47:10.034162 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:10.034095 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" event={"ID":"24d79a92-4318-433f-88da-e160ee5a0e71","Type":"ContainerDied","Data":"aca05f80cf7a91854bdc46ce52421e7e4add0b4c9148344c6d0d8641f5714f84"} Apr 28 19:47:10.084665 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:10.084623 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97"] Apr 28 19:47:10.091114 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:10.091080 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-secondary-77d500-predictor-7cbd677c59-wpc97"] Apr 28 19:47:11.757147 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:11.757100 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.39:8643/healthz\": dial tcp 10.134.0.39:8643: connect: connection refused" Apr 28 19:47:11.946621 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:11.946586 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17d692a4-ac88-467f-97fe-4ec44b36dbb0" path="/var/lib/kubelet/pods/17d692a4-ac88-467f-97fe-4ec44b36dbb0/volumes" Apr 28 19:47:13.439861 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:13.439838 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:47:13.511300 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:13.511273 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24d79a92-4318-433f-88da-e160ee5a0e71-kserve-provision-location\") pod \"24d79a92-4318-433f-88da-e160ee5a0e71\" (UID: \"24d79a92-4318-433f-88da-e160ee5a0e71\") " Apr 28 19:47:13.511404 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:13.511321 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chf2c\" (UniqueName: \"kubernetes.io/projected/24d79a92-4318-433f-88da-e160ee5a0e71-kube-api-access-chf2c\") pod \"24d79a92-4318-433f-88da-e160ee5a0e71\" (UID: \"24d79a92-4318-433f-88da-e160ee5a0e71\") " Apr 28 19:47:13.511404 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:13.511371 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/24d79a92-4318-433f-88da-e160ee5a0e71-proxy-tls\") pod \"24d79a92-4318-433f-88da-e160ee5a0e71\" (UID: \"24d79a92-4318-433f-88da-e160ee5a0e71\") " Apr 28 19:47:13.511601 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:13.511403 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-77d500-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/24d79a92-4318-433f-88da-e160ee5a0e71-isvc-primary-77d500-kube-rbac-proxy-sar-config\") pod \"24d79a92-4318-433f-88da-e160ee5a0e71\" (UID: \"24d79a92-4318-433f-88da-e160ee5a0e71\") " Apr 28 19:47:13.511601 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:13.511579 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24d79a92-4318-433f-88da-e160ee5a0e71-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "24d79a92-4318-433f-88da-e160ee5a0e71" (UID: "24d79a92-4318-433f-88da-e160ee5a0e71"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:47:13.511775 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:13.511753 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d79a92-4318-433f-88da-e160ee5a0e71-isvc-primary-77d500-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-77d500-kube-rbac-proxy-sar-config") pod "24d79a92-4318-433f-88da-e160ee5a0e71" (UID: "24d79a92-4318-433f-88da-e160ee5a0e71"). InnerVolumeSpecName "isvc-primary-77d500-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:47:13.513426 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:13.513406 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24d79a92-4318-433f-88da-e160ee5a0e71-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "24d79a92-4318-433f-88da-e160ee5a0e71" (UID: "24d79a92-4318-433f-88da-e160ee5a0e71"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:47:13.514404 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:13.514384 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24d79a92-4318-433f-88da-e160ee5a0e71-kube-api-access-chf2c" (OuterVolumeSpecName: "kube-api-access-chf2c") pod "24d79a92-4318-433f-88da-e160ee5a0e71" (UID: "24d79a92-4318-433f-88da-e160ee5a0e71"). InnerVolumeSpecName "kube-api-access-chf2c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:47:13.612010 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:13.611982 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24d79a92-4318-433f-88da-e160ee5a0e71-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:47:13.612010 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:13.612009 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-77d500-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/24d79a92-4318-433f-88da-e160ee5a0e71-isvc-primary-77d500-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:47:13.612146 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:13.612021 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24d79a92-4318-433f-88da-e160ee5a0e71-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 
19:47:13.612146 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:13.612032 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-chf2c\" (UniqueName: \"kubernetes.io/projected/24d79a92-4318-433f-88da-e160ee5a0e71-kube-api-access-chf2c\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:47:14.048004 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.047978 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm_42088543-26c0-4162-a75a-dfc5448d4af4/storage-initializer/0.log" Apr 28 19:47:14.048166 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.048014 2571 generic.go:358] "Generic (PLEG): container finished" podID="42088543-26c0-4162-a75a-dfc5448d4af4" containerID="8fde979eccb580836214f0dcdc9ea6d5c2d6f7ebec50c33b35fba0208c0406f0" exitCode=1 Apr 28 19:47:14.048166 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.048091 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" event={"ID":"42088543-26c0-4162-a75a-dfc5448d4af4","Type":"ContainerDied","Data":"8fde979eccb580836214f0dcdc9ea6d5c2d6f7ebec50c33b35fba0208c0406f0"} Apr 28 19:47:14.049812 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.049791 2571 generic.go:358] "Generic (PLEG): container finished" podID="24d79a92-4318-433f-88da-e160ee5a0e71" containerID="cf5b5368991ca1f9e836c47a465ed5507b5320467ae9327e686bbeed513cb86c" exitCode=0 Apr 28 19:47:14.049919 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.049863 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" Apr 28 19:47:14.049919 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.049869 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" event={"ID":"24d79a92-4318-433f-88da-e160ee5a0e71","Type":"ContainerDied","Data":"cf5b5368991ca1f9e836c47a465ed5507b5320467ae9327e686bbeed513cb86c"} Apr 28 19:47:14.049919 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.049913 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82" event={"ID":"24d79a92-4318-433f-88da-e160ee5a0e71","Type":"ContainerDied","Data":"c43f83d98d2233f552da2c4837233dcd9af62fdc30cb56eb77c3c3500b5377dd"} Apr 28 19:47:14.050043 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.049931 2571 scope.go:117] "RemoveContainer" containerID="aca05f80cf7a91854bdc46ce52421e7e4add0b4c9148344c6d0d8641f5714f84" Apr 28 19:47:14.060215 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.060195 2571 scope.go:117] "RemoveContainer" containerID="cf5b5368991ca1f9e836c47a465ed5507b5320467ae9327e686bbeed513cb86c" Apr 28 19:47:14.070870 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.070852 2571 scope.go:117] "RemoveContainer" containerID="316b6164c498daa2fa8a5e1cf4fc3961b9d58a78007efda75e0b6de9efdfa6f7" Apr 28 19:47:14.079663 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.079642 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82"] Apr 28 19:47:14.085620 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.085591 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-77d500-predictor-8d9ffc784-l4s82"] Apr 28 19:47:14.088348 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.088331 2571 scope.go:117] "RemoveContainer" 
containerID="aca05f80cf7a91854bdc46ce52421e7e4add0b4c9148344c6d0d8641f5714f84" Apr 28 19:47:14.088664 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:47:14.088644 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aca05f80cf7a91854bdc46ce52421e7e4add0b4c9148344c6d0d8641f5714f84\": container with ID starting with aca05f80cf7a91854bdc46ce52421e7e4add0b4c9148344c6d0d8641f5714f84 not found: ID does not exist" containerID="aca05f80cf7a91854bdc46ce52421e7e4add0b4c9148344c6d0d8641f5714f84" Apr 28 19:47:14.088732 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.088671 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aca05f80cf7a91854bdc46ce52421e7e4add0b4c9148344c6d0d8641f5714f84"} err="failed to get container status \"aca05f80cf7a91854bdc46ce52421e7e4add0b4c9148344c6d0d8641f5714f84\": rpc error: code = NotFound desc = could not find container \"aca05f80cf7a91854bdc46ce52421e7e4add0b4c9148344c6d0d8641f5714f84\": container with ID starting with aca05f80cf7a91854bdc46ce52421e7e4add0b4c9148344c6d0d8641f5714f84 not found: ID does not exist" Apr 28 19:47:14.088732 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.088689 2571 scope.go:117] "RemoveContainer" containerID="cf5b5368991ca1f9e836c47a465ed5507b5320467ae9327e686bbeed513cb86c" Apr 28 19:47:14.088937 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:47:14.088922 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf5b5368991ca1f9e836c47a465ed5507b5320467ae9327e686bbeed513cb86c\": container with ID starting with cf5b5368991ca1f9e836c47a465ed5507b5320467ae9327e686bbeed513cb86c not found: ID does not exist" containerID="cf5b5368991ca1f9e836c47a465ed5507b5320467ae9327e686bbeed513cb86c" Apr 28 19:47:14.088979 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.088941 2571 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"cf5b5368991ca1f9e836c47a465ed5507b5320467ae9327e686bbeed513cb86c"} err="failed to get container status \"cf5b5368991ca1f9e836c47a465ed5507b5320467ae9327e686bbeed513cb86c\": rpc error: code = NotFound desc = could not find container \"cf5b5368991ca1f9e836c47a465ed5507b5320467ae9327e686bbeed513cb86c\": container with ID starting with cf5b5368991ca1f9e836c47a465ed5507b5320467ae9327e686bbeed513cb86c not found: ID does not exist" Apr 28 19:47:14.088979 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.088956 2571 scope.go:117] "RemoveContainer" containerID="316b6164c498daa2fa8a5e1cf4fc3961b9d58a78007efda75e0b6de9efdfa6f7" Apr 28 19:47:14.089177 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:47:14.089157 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"316b6164c498daa2fa8a5e1cf4fc3961b9d58a78007efda75e0b6de9efdfa6f7\": container with ID starting with 316b6164c498daa2fa8a5e1cf4fc3961b9d58a78007efda75e0b6de9efdfa6f7 not found: ID does not exist" containerID="316b6164c498daa2fa8a5e1cf4fc3961b9d58a78007efda75e0b6de9efdfa6f7" Apr 28 19:47:14.089268 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:14.089180 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316b6164c498daa2fa8a5e1cf4fc3961b9d58a78007efda75e0b6de9efdfa6f7"} err="failed to get container status \"316b6164c498daa2fa8a5e1cf4fc3961b9d58a78007efda75e0b6de9efdfa6f7\": rpc error: code = NotFound desc = could not find container \"316b6164c498daa2fa8a5e1cf4fc3961b9d58a78007efda75e0b6de9efdfa6f7\": container with ID starting with 316b6164c498daa2fa8a5e1cf4fc3961b9d58a78007efda75e0b6de9efdfa6f7 not found: ID does not exist" Apr 28 19:47:15.055215 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:15.055188 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm_42088543-26c0-4162-a75a-dfc5448d4af4/storage-initializer/0.log" Apr 28 19:47:15.055615 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:15.055237 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" event={"ID":"42088543-26c0-4162-a75a-dfc5448d4af4","Type":"ContainerStarted","Data":"898963bbefcd42cb3daf1994a2b4d970413a9053ff11c9b12052c2913d007d2a"} Apr 28 19:47:15.947751 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:15.947714 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" path="/var/lib/kubelet/pods/24d79a92-4318-433f-88da-e160ee5a0e71/volumes" Apr 28 19:47:18.065026 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:18.064999 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm_42088543-26c0-4162-a75a-dfc5448d4af4/storage-initializer/1.log" Apr 28 19:47:18.065405 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:18.065326 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm_42088543-26c0-4162-a75a-dfc5448d4af4/storage-initializer/0.log" Apr 28 19:47:18.065405 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:18.065356 2571 generic.go:358] "Generic (PLEG): container finished" podID="42088543-26c0-4162-a75a-dfc5448d4af4" containerID="898963bbefcd42cb3daf1994a2b4d970413a9053ff11c9b12052c2913d007d2a" exitCode=1 Apr 28 19:47:18.065498 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:18.065406 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" event={"ID":"42088543-26c0-4162-a75a-dfc5448d4af4","Type":"ContainerDied","Data":"898963bbefcd42cb3daf1994a2b4d970413a9053ff11c9b12052c2913d007d2a"} Apr 28 19:47:18.065498 
ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:18.065440 2571 scope.go:117] "RemoveContainer" containerID="8fde979eccb580836214f0dcdc9ea6d5c2d6f7ebec50c33b35fba0208c0406f0" Apr 28 19:47:18.065892 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:18.065865 2571 scope.go:117] "RemoveContainer" containerID="8fde979eccb580836214f0dcdc9ea6d5c2d6f7ebec50c33b35fba0208c0406f0" Apr 28 19:47:18.076241 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:47:18.076214 2571 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm_kserve-ci-e2e-test_42088543-26c0-4162-a75a-dfc5448d4af4_0 in pod sandbox 6c598c0296d38a92db465338217e77a7b4aefd375ed46f130275aeadd0802334 from index: no such id: '8fde979eccb580836214f0dcdc9ea6d5c2d6f7ebec50c33b35fba0208c0406f0'" containerID="8fde979eccb580836214f0dcdc9ea6d5c2d6f7ebec50c33b35fba0208c0406f0" Apr 28 19:47:18.076310 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:18.076250 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fde979eccb580836214f0dcdc9ea6d5c2d6f7ebec50c33b35fba0208c0406f0"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm_kserve-ci-e2e-test_42088543-26c0-4162-a75a-dfc5448d4af4_0 in pod sandbox 6c598c0296d38a92db465338217e77a7b4aefd375ed46f130275aeadd0802334 from index: no such id: '8fde979eccb580836214f0dcdc9ea6d5c2d6f7ebec50c33b35fba0208c0406f0'" Apr 28 19:47:18.076428 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:47:18.076406 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm_kserve-ci-e2e-test(42088543-26c0-4162-a75a-dfc5448d4af4)\"" 
pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" podUID="42088543-26c0-4162-a75a-dfc5448d4af4" Apr 28 19:47:19.047507 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.047460 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm"] Apr 28 19:47:19.070253 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.070227 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm_42088543-26c0-4162-a75a-dfc5448d4af4/storage-initializer/1.log" Apr 28 19:47:19.202531 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.202511 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm_42088543-26c0-4162-a75a-dfc5448d4af4/storage-initializer/1.log" Apr 28 19:47:19.202663 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.202574 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:19.364390 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.364287 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/42088543-26c0-4162-a75a-dfc5448d4af4-cabundle-cert\") pod \"42088543-26c0-4162-a75a-dfc5448d4af4\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " Apr 28 19:47:19.364390 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.364377 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42088543-26c0-4162-a75a-dfc5448d4af4-proxy-tls\") pod \"42088543-26c0-4162-a75a-dfc5448d4af4\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " Apr 28 19:47:19.364673 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.364414 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42088543-26c0-4162-a75a-dfc5448d4af4-kserve-provision-location\") pod \"42088543-26c0-4162-a75a-dfc5448d4af4\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " Apr 28 19:47:19.364673 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.364440 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-19fe47-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42088543-26c0-4162-a75a-dfc5448d4af4-isvc-init-fail-19fe47-kube-rbac-proxy-sar-config\") pod \"42088543-26c0-4162-a75a-dfc5448d4af4\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " Apr 28 19:47:19.364673 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.364464 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6ksb\" (UniqueName: \"kubernetes.io/projected/42088543-26c0-4162-a75a-dfc5448d4af4-kube-api-access-x6ksb\") pod 
\"42088543-26c0-4162-a75a-dfc5448d4af4\" (UID: \"42088543-26c0-4162-a75a-dfc5448d4af4\") " Apr 28 19:47:19.364834 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.364769 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42088543-26c0-4162-a75a-dfc5448d4af4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "42088543-26c0-4162-a75a-dfc5448d4af4" (UID: "42088543-26c0-4162-a75a-dfc5448d4af4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:47:19.364834 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.364777 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42088543-26c0-4162-a75a-dfc5448d4af4-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "42088543-26c0-4162-a75a-dfc5448d4af4" (UID: "42088543-26c0-4162-a75a-dfc5448d4af4"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:47:19.364834 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.364824 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42088543-26c0-4162-a75a-dfc5448d4af4-isvc-init-fail-19fe47-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-19fe47-kube-rbac-proxy-sar-config") pod "42088543-26c0-4162-a75a-dfc5448d4af4" (UID: "42088543-26c0-4162-a75a-dfc5448d4af4"). InnerVolumeSpecName "isvc-init-fail-19fe47-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:47:19.366708 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.366683 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42088543-26c0-4162-a75a-dfc5448d4af4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "42088543-26c0-4162-a75a-dfc5448d4af4" (UID: "42088543-26c0-4162-a75a-dfc5448d4af4"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:47:19.366708 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.366699 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42088543-26c0-4162-a75a-dfc5448d4af4-kube-api-access-x6ksb" (OuterVolumeSpecName: "kube-api-access-x6ksb") pod "42088543-26c0-4162-a75a-dfc5448d4af4" (UID: "42088543-26c0-4162-a75a-dfc5448d4af4"). InnerVolumeSpecName "kube-api-access-x6ksb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:47:19.465252 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.465208 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42088543-26c0-4162-a75a-dfc5448d4af4-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:47:19.465252 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.465252 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-19fe47-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42088543-26c0-4162-a75a-dfc5448d4af4-isvc-init-fail-19fe47-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:47:19.465252 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.465265 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x6ksb\" (UniqueName: \"kubernetes.io/projected/42088543-26c0-4162-a75a-dfc5448d4af4-kube-api-access-x6ksb\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:47:19.465508 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.465276 2571 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/42088543-26c0-4162-a75a-dfc5448d4af4-cabundle-cert\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:47:19.465508 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:19.465286 2571 reconciler_common.go:299] 
"Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42088543-26c0-4162-a75a-dfc5448d4af4-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:47:20.075067 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:20.075038 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm_42088543-26c0-4162-a75a-dfc5448d4af4/storage-initializer/1.log" Apr 28 19:47:20.075457 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:20.075124 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" event={"ID":"42088543-26c0-4162-a75a-dfc5448d4af4","Type":"ContainerDied","Data":"6c598c0296d38a92db465338217e77a7b4aefd375ed46f130275aeadd0802334"} Apr 28 19:47:20.075457 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:20.075154 2571 scope.go:117] "RemoveContainer" containerID="898963bbefcd42cb3daf1994a2b4d970413a9053ff11c9b12052c2913d007d2a" Apr 28 19:47:20.075457 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:20.075158 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm" Apr 28 19:47:20.107049 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:20.107019 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm"] Apr 28 19:47:20.115543 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:20.113414 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-19fe47-predictor-c5688fb5c-bh8pm"] Apr 28 19:47:21.946727 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:47:21.946689 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42088543-26c0-4162-a75a-dfc5448d4af4" path="/var/lib/kubelet/pods/42088543-26c0-4162-a75a-dfc5448d4af4/volumes" Apr 28 19:51:23.981140 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:51:23.981110 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:51:23.986105 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:51:23.986084 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:56:24.003045 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:24.003007 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:56:24.009642 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:24.009617 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 19:56:50.786897 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.786813 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk"] Apr 28 
19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787115 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kube-rbac-proxy" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787126 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kube-rbac-proxy" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787136 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17d692a4-ac88-467f-97fe-4ec44b36dbb0" containerName="storage-initializer" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787142 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="17d692a4-ac88-467f-97fe-4ec44b36dbb0" containerName="storage-initializer" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787154 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kserve-container" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787160 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kserve-container" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787166 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42088543-26c0-4162-a75a-dfc5448d4af4" containerName="storage-initializer" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787172 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="42088543-26c0-4162-a75a-dfc5448d4af4" containerName="storage-initializer" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787178 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="42088543-26c0-4162-a75a-dfc5448d4af4" containerName="storage-initializer" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787183 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="42088543-26c0-4162-a75a-dfc5448d4af4" containerName="storage-initializer" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787190 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17d692a4-ac88-467f-97fe-4ec44b36dbb0" containerName="storage-initializer" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787195 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="17d692a4-ac88-467f-97fe-4ec44b36dbb0" containerName="storage-initializer" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787205 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="storage-initializer" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787210 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="storage-initializer" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787253 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="17d692a4-ac88-467f-97fe-4ec44b36dbb0" containerName="storage-initializer" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787261 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kserve-container" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787269 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="42088543-26c0-4162-a75a-dfc5448d4af4" containerName="storage-initializer" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787276 2571 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="24d79a92-4318-433f-88da-e160ee5a0e71" containerName="kube-rbac-proxy" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787283 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="42088543-26c0-4162-a75a-dfc5448d4af4" containerName="storage-initializer" Apr 28 19:56:50.787351 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.787289 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="17d692a4-ac88-467f-97fe-4ec44b36dbb0" containerName="storage-initializer" Apr 28 19:56:50.790146 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.790129 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:56:50.792376 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.792353 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-kube-rbac-proxy-sar-config\"" Apr 28 19:56:50.792376 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.792367 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 28 19:56:50.792948 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.792927 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-predictor-serving-cert\"" Apr 28 19:56:50.793057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.792953 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-44jch\"" Apr 28 19:56:50.793057 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.792987 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 28 19:56:50.800527 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.800506 2571 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk"] Apr 28 19:56:50.877256 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.877217 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wklc\" (UniqueName: \"kubernetes.io/projected/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-kube-api-access-2wklc\") pod \"isvc-sklearn-predictor-6875c879b7-96mbk\" (UID: \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:56:50.877432 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.877284 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-kserve-provision-location\") pod \"isvc-sklearn-predictor-6875c879b7-96mbk\" (UID: \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:56:50.877432 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.877320 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-proxy-tls\") pod \"isvc-sklearn-predictor-6875c879b7-96mbk\" (UID: \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:56:50.877432 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.877344 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-6875c879b7-96mbk\" (UID: \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 
19:56:50.978231 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.978194 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-kserve-provision-location\") pod \"isvc-sklearn-predictor-6875c879b7-96mbk\" (UID: \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:56:50.978231 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.978233 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-proxy-tls\") pod \"isvc-sklearn-predictor-6875c879b7-96mbk\" (UID: \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:56:50.978456 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.978256 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-6875c879b7-96mbk\" (UID: \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:56:50.978456 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.978285 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wklc\" (UniqueName: \"kubernetes.io/projected/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-kube-api-access-2wklc\") pod \"isvc-sklearn-predictor-6875c879b7-96mbk\" (UID: \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:56:50.978675 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.978651 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-kserve-provision-location\") pod \"isvc-sklearn-predictor-6875c879b7-96mbk\" (UID: \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:56:50.978918 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.978900 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-6875c879b7-96mbk\" (UID: \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:56:50.980705 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.980688 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-proxy-tls\") pod \"isvc-sklearn-predictor-6875c879b7-96mbk\" (UID: \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:56:50.986444 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:50.986420 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wklc\" (UniqueName: \"kubernetes.io/projected/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-kube-api-access-2wklc\") pod \"isvc-sklearn-predictor-6875c879b7-96mbk\" (UID: \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:56:51.100651 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:51.100565 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:56:51.227989 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:51.227967 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk"] Apr 28 19:56:51.230275 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:56:51.230245 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cc54079_1f16_4458_8bd8_f3e4f597e6e9.slice/crio-ef2a0c71dfeba448b907b552827a208681be2ff5fdfd55969c5cd826e12dfa10 WatchSource:0}: Error finding container ef2a0c71dfeba448b907b552827a208681be2ff5fdfd55969c5cd826e12dfa10: Status 404 returned error can't find the container with id ef2a0c71dfeba448b907b552827a208681be2ff5fdfd55969c5cd826e12dfa10 Apr 28 19:56:51.232021 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:51.232004 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:56:51.767921 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:51.767885 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" event={"ID":"8cc54079-1f16-4458-8bd8-f3e4f597e6e9","Type":"ContainerStarted","Data":"9a9405be4ea081cea1f6424b66c3c2b1a6c2e493725fd82303bb63da4fee57c1"} Apr 28 19:56:51.767921 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:51.767922 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" event={"ID":"8cc54079-1f16-4458-8bd8-f3e4f597e6e9","Type":"ContainerStarted","Data":"ef2a0c71dfeba448b907b552827a208681be2ff5fdfd55969c5cd826e12dfa10"} Apr 28 19:56:54.777609 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:54.777574 2571 generic.go:358] "Generic (PLEG): container finished" podID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" 
containerID="9a9405be4ea081cea1f6424b66c3c2b1a6c2e493725fd82303bb63da4fee57c1" exitCode=0 Apr 28 19:56:54.777979 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:54.777651 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" event={"ID":"8cc54079-1f16-4458-8bd8-f3e4f597e6e9","Type":"ContainerDied","Data":"9a9405be4ea081cea1f6424b66c3c2b1a6c2e493725fd82303bb63da4fee57c1"} Apr 28 19:56:55.783236 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:55.783201 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" event={"ID":"8cc54079-1f16-4458-8bd8-f3e4f597e6e9","Type":"ContainerStarted","Data":"b29b6b406ea932b5a1fd6fe73598debd331ec322e82d47d3f3d7f60c33afe8dc"} Apr 28 19:56:55.783236 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:55.783238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" event={"ID":"8cc54079-1f16-4458-8bd8-f3e4f597e6e9","Type":"ContainerStarted","Data":"4ab70f591b3051a9d3b6bd60ea3ce6c88d4aa3af0783c2976bd8ade6ad538ec0"} Apr 28 19:56:55.783689 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:55.783501 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:56:55.803707 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:55.803661 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" podStartSLOduration=5.803645358 podStartE2EDuration="5.803645358s" podCreationTimestamp="2026-04-28 19:56:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:56:55.802792614 +0000 UTC m=+2432.433693559" watchObservedRunningTime="2026-04-28 19:56:55.803645358 +0000 UTC m=+2432.434546304" Apr 28 
19:56:56.786632 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:56.786596 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:56:56.787655 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:56.787622 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 28 19:56:57.789507 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:56:57.789446 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 28 19:57:02.793708 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:57:02.793678 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:57:02.794260 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:57:02.794228 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 28 19:57:12.794931 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:57:12.794877 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 28 19:57:22.794755 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:57:22.794713 
2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 28 19:57:32.794603 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:57:32.794552 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 28 19:57:42.794907 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:57:42.794866 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 28 19:57:52.794907 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:57:52.794862 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 28 19:58:02.794649 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:02.794617 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:58:11.312452 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.312279 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk"] Apr 28 19:58:11.312914 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.312740 2571 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kserve-container" containerID="cri-o://4ab70f591b3051a9d3b6bd60ea3ce6c88d4aa3af0783c2976bd8ade6ad538ec0" gracePeriod=30 Apr 28 19:58:11.312914 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.312816 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kube-rbac-proxy" containerID="cri-o://b29b6b406ea932b5a1fd6fe73598debd331ec322e82d47d3f3d7f60c33afe8dc" gracePeriod=30 Apr 28 19:58:11.587322 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.587245 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr"] Apr 28 19:58:11.590826 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.590802 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:58:11.592989 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.592963 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 28 19:58:11.593114 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.592963 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-predictor-serving-cert\"" Apr 28 19:58:11.600657 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.600626 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr"] Apr 28 19:58:11.765101 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.765057 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjfsn\" (UniqueName: 
\"kubernetes.io/projected/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-kube-api-access-kjfsn\") pod \"sklearn-v2-mlserver-predictor-65d8664766-mw5mr\" (UID: \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:58:11.765277 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.765118 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-mw5mr\" (UID: \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:58:11.765277 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.765152 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-mw5mr\" (UID: \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:58:11.765277 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.765186 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-mw5mr\" (UID: \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:58:11.865872 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.865766 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-mw5mr\" (UID: \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:58:11.865872 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.865855 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjfsn\" (UniqueName: \"kubernetes.io/projected/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-kube-api-access-kjfsn\") pod \"sklearn-v2-mlserver-predictor-65d8664766-mw5mr\" (UID: \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:58:11.866114 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.865890 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-mw5mr\" (UID: \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:58:11.866114 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.865919 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-mw5mr\" (UID: \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:58:11.866300 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.866279 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-kserve-provision-location\") pod 
\"sklearn-v2-mlserver-predictor-65d8664766-mw5mr\" (UID: \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:58:11.866455 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.866435 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-mw5mr\" (UID: \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:58:11.868376 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.868341 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-mw5mr\" (UID: \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:58:11.873308 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.873286 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjfsn\" (UniqueName: \"kubernetes.io/projected/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-kube-api-access-kjfsn\") pod \"sklearn-v2-mlserver-predictor-65d8664766-mw5mr\" (UID: \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:58:11.903246 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:11.903215 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:58:12.022512 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:12.022342 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr"] Apr 28 19:58:12.025334 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:58:12.025309 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd89511f6_3c88_4a60_9fc2_137df2cbf3d8.slice/crio-5a1b61b29db39476c667f981e07d3567e90d35d4c18de1ccceb43da732c48f42 WatchSource:0}: Error finding container 5a1b61b29db39476c667f981e07d3567e90d35d4c18de1ccceb43da732c48f42: Status 404 returned error can't find the container with id 5a1b61b29db39476c667f981e07d3567e90d35d4c18de1ccceb43da732c48f42 Apr 28 19:58:12.029252 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:12.029230 2571 generic.go:358] "Generic (PLEG): container finished" podID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerID="b29b6b406ea932b5a1fd6fe73598debd331ec322e82d47d3f3d7f60c33afe8dc" exitCode=2 Apr 28 19:58:12.029334 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:12.029304 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" event={"ID":"8cc54079-1f16-4458-8bd8-f3e4f597e6e9","Type":"ContainerDied","Data":"b29b6b406ea932b5a1fd6fe73598debd331ec322e82d47d3f3d7f60c33afe8dc"} Apr 28 19:58:12.789863 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:12.789821 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.42:8643/healthz\": dial tcp 10.134.0.42:8643: connect: connection refused" Apr 28 19:58:12.794118 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:12.794085 2571 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 28 19:58:13.033866 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:13.033828 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" event={"ID":"d89511f6-3c88-4a60-9fc2-137df2cbf3d8","Type":"ContainerStarted","Data":"15b476ce2e69ec914f80925ef28d0d2402ea4b8b9172bdcb010a951b9ad76c2d"} Apr 28 19:58:13.033866 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:13.033869 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" event={"ID":"d89511f6-3c88-4a60-9fc2-137df2cbf3d8","Type":"ContainerStarted","Data":"5a1b61b29db39476c667f981e07d3567e90d35d4c18de1ccceb43da732c48f42"} Apr 28 19:58:15.978061 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:15.978033 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:58:16.044506 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.044409 2571 generic.go:358] "Generic (PLEG): container finished" podID="d89511f6-3c88-4a60-9fc2-137df2cbf3d8" containerID="15b476ce2e69ec914f80925ef28d0d2402ea4b8b9172bdcb010a951b9ad76c2d" exitCode=0 Apr 28 19:58:16.044506 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.044490 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" event={"ID":"d89511f6-3c88-4a60-9fc2-137df2cbf3d8","Type":"ContainerDied","Data":"15b476ce2e69ec914f80925ef28d0d2402ea4b8b9172bdcb010a951b9ad76c2d"} Apr 28 19:58:16.046323 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.046302 2571 generic.go:358] "Generic (PLEG): container finished" podID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerID="4ab70f591b3051a9d3b6bd60ea3ce6c88d4aa3af0783c2976bd8ade6ad538ec0" exitCode=0 Apr 28 19:58:16.046434 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.046375 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" Apr 28 19:58:16.046434 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.046379 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" event={"ID":"8cc54079-1f16-4458-8bd8-f3e4f597e6e9","Type":"ContainerDied","Data":"4ab70f591b3051a9d3b6bd60ea3ce6c88d4aa3af0783c2976bd8ade6ad538ec0"} Apr 28 19:58:16.046434 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.046410 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk" event={"ID":"8cc54079-1f16-4458-8bd8-f3e4f597e6e9","Type":"ContainerDied","Data":"ef2a0c71dfeba448b907b552827a208681be2ff5fdfd55969c5cd826e12dfa10"} Apr 28 19:58:16.046434 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.046425 2571 scope.go:117] "RemoveContainer" containerID="b29b6b406ea932b5a1fd6fe73598debd331ec322e82d47d3f3d7f60c33afe8dc" Apr 28 19:58:16.056332 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.056314 2571 scope.go:117] "RemoveContainer" containerID="4ab70f591b3051a9d3b6bd60ea3ce6c88d4aa3af0783c2976bd8ade6ad538ec0" Apr 28 19:58:16.063574 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.063543 2571 scope.go:117] "RemoveContainer" containerID="9a9405be4ea081cea1f6424b66c3c2b1a6c2e493725fd82303bb63da4fee57c1" Apr 28 19:58:16.076642 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.076625 2571 scope.go:117] "RemoveContainer" containerID="b29b6b406ea932b5a1fd6fe73598debd331ec322e82d47d3f3d7f60c33afe8dc" Apr 28 19:58:16.076873 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:58:16.076854 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b29b6b406ea932b5a1fd6fe73598debd331ec322e82d47d3f3d7f60c33afe8dc\": container with ID starting with b29b6b406ea932b5a1fd6fe73598debd331ec322e82d47d3f3d7f60c33afe8dc not found: ID does not exist" 
containerID="b29b6b406ea932b5a1fd6fe73598debd331ec322e82d47d3f3d7f60c33afe8dc" Apr 28 19:58:16.076920 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.076887 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b29b6b406ea932b5a1fd6fe73598debd331ec322e82d47d3f3d7f60c33afe8dc"} err="failed to get container status \"b29b6b406ea932b5a1fd6fe73598debd331ec322e82d47d3f3d7f60c33afe8dc\": rpc error: code = NotFound desc = could not find container \"b29b6b406ea932b5a1fd6fe73598debd331ec322e82d47d3f3d7f60c33afe8dc\": container with ID starting with b29b6b406ea932b5a1fd6fe73598debd331ec322e82d47d3f3d7f60c33afe8dc not found: ID does not exist" Apr 28 19:58:16.076920 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.076907 2571 scope.go:117] "RemoveContainer" containerID="4ab70f591b3051a9d3b6bd60ea3ce6c88d4aa3af0783c2976bd8ade6ad538ec0" Apr 28 19:58:16.077110 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:58:16.077094 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ab70f591b3051a9d3b6bd60ea3ce6c88d4aa3af0783c2976bd8ade6ad538ec0\": container with ID starting with 4ab70f591b3051a9d3b6bd60ea3ce6c88d4aa3af0783c2976bd8ade6ad538ec0 not found: ID does not exist" containerID="4ab70f591b3051a9d3b6bd60ea3ce6c88d4aa3af0783c2976bd8ade6ad538ec0" Apr 28 19:58:16.077147 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.077117 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ab70f591b3051a9d3b6bd60ea3ce6c88d4aa3af0783c2976bd8ade6ad538ec0"} err="failed to get container status \"4ab70f591b3051a9d3b6bd60ea3ce6c88d4aa3af0783c2976bd8ade6ad538ec0\": rpc error: code = NotFound desc = could not find container \"4ab70f591b3051a9d3b6bd60ea3ce6c88d4aa3af0783c2976bd8ade6ad538ec0\": container with ID starting with 4ab70f591b3051a9d3b6bd60ea3ce6c88d4aa3af0783c2976bd8ade6ad538ec0 not found: ID does not exist" Apr 28 
19:58:16.077147 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.077131 2571 scope.go:117] "RemoveContainer" containerID="9a9405be4ea081cea1f6424b66c3c2b1a6c2e493725fd82303bb63da4fee57c1" Apr 28 19:58:16.077338 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:58:16.077322 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a9405be4ea081cea1f6424b66c3c2b1a6c2e493725fd82303bb63da4fee57c1\": container with ID starting with 9a9405be4ea081cea1f6424b66c3c2b1a6c2e493725fd82303bb63da4fee57c1 not found: ID does not exist" containerID="9a9405be4ea081cea1f6424b66c3c2b1a6c2e493725fd82303bb63da4fee57c1" Apr 28 19:58:16.077387 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.077346 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a9405be4ea081cea1f6424b66c3c2b1a6c2e493725fd82303bb63da4fee57c1"} err="failed to get container status \"9a9405be4ea081cea1f6424b66c3c2b1a6c2e493725fd82303bb63da4fee57c1\": rpc error: code = NotFound desc = could not find container \"9a9405be4ea081cea1f6424b66c3c2b1a6c2e493725fd82303bb63da4fee57c1\": container with ID starting with 9a9405be4ea081cea1f6424b66c3c2b1a6c2e493725fd82303bb63da4fee57c1 not found: ID does not exist" Apr 28 19:58:16.101021 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.100879 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\" (UID: \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\") " Apr 28 19:58:16.101021 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.100938 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wklc\" (UniqueName: 
\"kubernetes.io/projected/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-kube-api-access-2wklc\") pod \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\" (UID: \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\") " Apr 28 19:58:16.101021 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.100963 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-kserve-provision-location\") pod \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\" (UID: \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\") " Apr 28 19:58:16.101187 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.101044 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-proxy-tls\") pod \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\" (UID: \"8cc54079-1f16-4458-8bd8-f3e4f597e6e9\") " Apr 28 19:58:16.101812 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.101405 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8cc54079-1f16-4458-8bd8-f3e4f597e6e9" (UID: "8cc54079-1f16-4458-8bd8-f3e4f597e6e9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:58:16.101812 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.101599 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-isvc-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-kube-rbac-proxy-sar-config") pod "8cc54079-1f16-4458-8bd8-f3e4f597e6e9" (UID: "8cc54079-1f16-4458-8bd8-f3e4f597e6e9"). InnerVolumeSpecName "isvc-sklearn-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:58:16.101972 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.101827 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-isvc-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:58:16.101972 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.101849 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:58:16.103466 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.103444 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-kube-api-access-2wklc" (OuterVolumeSpecName: "kube-api-access-2wklc") pod "8cc54079-1f16-4458-8bd8-f3e4f597e6e9" (UID: "8cc54079-1f16-4458-8bd8-f3e4f597e6e9"). InnerVolumeSpecName "kube-api-access-2wklc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:58:16.103586 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.103473 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8cc54079-1f16-4458-8bd8-f3e4f597e6e9" (UID: "8cc54079-1f16-4458-8bd8-f3e4f597e6e9"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:58:16.203183 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.203150 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2wklc\" (UniqueName: \"kubernetes.io/projected/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-kube-api-access-2wklc\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:58:16.203183 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.203181 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8cc54079-1f16-4458-8bd8-f3e4f597e6e9-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:58:16.366655 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.366625 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk"] Apr 28 19:58:16.372041 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:16.372017 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6875c879b7-96mbk"] Apr 28 19:58:17.052016 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:17.051984 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" event={"ID":"d89511f6-3c88-4a60-9fc2-137df2cbf3d8","Type":"ContainerStarted","Data":"cd80e2e75bcd43888886e49d6bdc3b19aac15a43f2f5d9b41eabe8c031ba480b"} Apr 28 19:58:17.052405 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:17.052023 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" event={"ID":"d89511f6-3c88-4a60-9fc2-137df2cbf3d8","Type":"ContainerStarted","Data":"2714f028e33f1301b4be57c439054cffb288278f2f5e569b48324399566a54c5"} Apr 28 19:58:17.052405 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:17.052242 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:58:17.071630 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:17.071586 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" podStartSLOduration=6.07157246 podStartE2EDuration="6.07157246s" podCreationTimestamp="2026-04-28 19:58:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:58:17.070569012 +0000 UTC m=+2513.701469958" watchObservedRunningTime="2026-04-28 19:58:17.07157246 +0000 UTC m=+2513.702473407" Apr 28 19:58:17.947600 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:17.947566 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" path="/var/lib/kubelet/pods/8cc54079-1f16-4458-8bd8-f3e4f597e6e9/volumes" Apr 28 19:58:18.055546 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:18.055519 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:58:24.063412 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:24.063386 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:58:54.086987 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:58:54.086938 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" podUID="d89511f6-3c88-4a60-9fc2-137df2cbf3d8" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 28 19:59:04.066277 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:04.066249 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" Apr 28 19:59:11.469325 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.469293 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr"] Apr 28 19:59:11.469748 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.469626 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" podUID="d89511f6-3c88-4a60-9fc2-137df2cbf3d8" containerName="kserve-container" containerID="cri-o://2714f028e33f1301b4be57c439054cffb288278f2f5e569b48324399566a54c5" gracePeriod=30 Apr 28 19:59:11.469748 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.469679 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" podUID="d89511f6-3c88-4a60-9fc2-137df2cbf3d8" containerName="kube-rbac-proxy" containerID="cri-o://cd80e2e75bcd43888886e49d6bdc3b19aac15a43f2f5d9b41eabe8c031ba480b" gracePeriod=30 Apr 28 19:59:11.686565 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.686524 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"] Apr 28 19:59:11.687013 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.686982 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kube-rbac-proxy" Apr 28 19:59:11.687013 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.687007 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kube-rbac-proxy" Apr 28 19:59:11.687209 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.687025 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="storage-initializer" 
Apr 28 19:59:11.687209 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.687036 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="storage-initializer"
Apr 28 19:59:11.687209 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.687045 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kserve-container"
Apr 28 19:59:11.687209 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.687053 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kserve-container"
Apr 28 19:59:11.687209 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.687116 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kserve-container"
Apr 28 19:59:11.687209 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.687124 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="8cc54079-1f16-4458-8bd8-f3e4f597e6e9" containerName="kube-rbac-proxy"
Apr 28 19:59:11.690184 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.690164 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:11.692319 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.692296 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\""
Apr 28 19:59:11.692433 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.692365 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-predictor-serving-cert\""
Apr 28 19:59:11.699549 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.699524 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"]
Apr 28 19:59:11.772817 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.772711 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25024859-dcd2-45ca-9021-11bc23b66aa6-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-7c6499f57-hk2ls\" (UID: \"25024859-dcd2-45ca-9021-11bc23b66aa6\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:11.772974 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.772829 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25024859-dcd2-45ca-9021-11bc23b66aa6-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-7c6499f57-hk2ls\" (UID: \"25024859-dcd2-45ca-9021-11bc23b66aa6\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:11.772974 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.772871 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/25024859-dcd2-45ca-9021-11bc23b66aa6-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-7c6499f57-hk2ls\" (UID: \"25024859-dcd2-45ca-9021-11bc23b66aa6\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:11.772974 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.772898 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzm57\" (UniqueName: \"kubernetes.io/projected/25024859-dcd2-45ca-9021-11bc23b66aa6-kube-api-access-vzm57\") pod \"isvc-sklearn-runtime-predictor-7c6499f57-hk2ls\" (UID: \"25024859-dcd2-45ca-9021-11bc23b66aa6\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:11.874149 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.874108 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/25024859-dcd2-45ca-9021-11bc23b66aa6-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-7c6499f57-hk2ls\" (UID: \"25024859-dcd2-45ca-9021-11bc23b66aa6\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:11.874149 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.874156 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzm57\" (UniqueName: \"kubernetes.io/projected/25024859-dcd2-45ca-9021-11bc23b66aa6-kube-api-access-vzm57\") pod \"isvc-sklearn-runtime-predictor-7c6499f57-hk2ls\" (UID: \"25024859-dcd2-45ca-9021-11bc23b66aa6\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:11.874407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.874187 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25024859-dcd2-45ca-9021-11bc23b66aa6-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-7c6499f57-hk2ls\" (UID: \"25024859-dcd2-45ca-9021-11bc23b66aa6\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:11.874407 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.874236 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25024859-dcd2-45ca-9021-11bc23b66aa6-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-7c6499f57-hk2ls\" (UID: \"25024859-dcd2-45ca-9021-11bc23b66aa6\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:11.874650 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.874629 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25024859-dcd2-45ca-9021-11bc23b66aa6-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-7c6499f57-hk2ls\" (UID: \"25024859-dcd2-45ca-9021-11bc23b66aa6\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:11.874805 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.874786 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/25024859-dcd2-45ca-9021-11bc23b66aa6-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-7c6499f57-hk2ls\" (UID: \"25024859-dcd2-45ca-9021-11bc23b66aa6\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:11.876605 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.876589 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25024859-dcd2-45ca-9021-11bc23b66aa6-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-7c6499f57-hk2ls\" (UID: \"25024859-dcd2-45ca-9021-11bc23b66aa6\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:11.881707 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:11.881685 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzm57\" (UniqueName: \"kubernetes.io/projected/25024859-dcd2-45ca-9021-11bc23b66aa6-kube-api-access-vzm57\") pod \"isvc-sklearn-runtime-predictor-7c6499f57-hk2ls\" (UID: \"25024859-dcd2-45ca-9021-11bc23b66aa6\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:12.001469 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:12.001434 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:12.121248 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:12.121205 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"]
Apr 28 19:59:12.123973 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:59:12.123948 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25024859_dcd2_45ca_9021_11bc23b66aa6.slice/crio-ef90ac5820aa631b2e831230049778d08741e09ec1566c588a0b3aec4e7a75c3 WatchSource:0}: Error finding container ef90ac5820aa631b2e831230049778d08741e09ec1566c588a0b3aec4e7a75c3: Status 404 returned error can't find the container with id ef90ac5820aa631b2e831230049778d08741e09ec1566c588a0b3aec4e7a75c3
Apr 28 19:59:12.217255 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:12.217221 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls" event={"ID":"25024859-dcd2-45ca-9021-11bc23b66aa6","Type":"ContainerStarted","Data":"cc71c5c944af9c20e93365b9a5ca3e6391cd61edc1ea700015a8a1e280f936cf"}
Apr 28 19:59:12.217255 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:12.217259 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls" event={"ID":"25024859-dcd2-45ca-9021-11bc23b66aa6","Type":"ContainerStarted","Data":"ef90ac5820aa631b2e831230049778d08741e09ec1566c588a0b3aec4e7a75c3"}
Apr 28 19:59:12.219265 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:12.219242 2571 generic.go:358] "Generic (PLEG): container finished" podID="d89511f6-3c88-4a60-9fc2-137df2cbf3d8" containerID="cd80e2e75bcd43888886e49d6bdc3b19aac15a43f2f5d9b41eabe8c031ba480b" exitCode=2
Apr 28 19:59:12.219377 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:12.219279 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" event={"ID":"d89511f6-3c88-4a60-9fc2-137df2cbf3d8","Type":"ContainerDied","Data":"cd80e2e75bcd43888886e49d6bdc3b19aac15a43f2f5d9b41eabe8c031ba480b"}
Apr 28 19:59:14.059348 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:14.059302 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" podUID="d89511f6-3c88-4a60-9fc2-137df2cbf3d8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.43:8643/healthz\": dial tcp 10.134.0.43:8643: connect: connection refused"
Apr 28 19:59:18.237573 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:18.237540 2571 generic.go:358] "Generic (PLEG): container finished" podID="25024859-dcd2-45ca-9021-11bc23b66aa6" containerID="cc71c5c944af9c20e93365b9a5ca3e6391cd61edc1ea700015a8a1e280f936cf" exitCode=0
Apr 28 19:59:18.237908 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:18.237597 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls" event={"ID":"25024859-dcd2-45ca-9021-11bc23b66aa6","Type":"ContainerDied","Data":"cc71c5c944af9c20e93365b9a5ca3e6391cd61edc1ea700015a8a1e280f936cf"}
Apr 28 19:59:18.901673 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:18.901650 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr"
Apr 28 19:59:18.930099 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:18.930073 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\" (UID: \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\") "
Apr 28 19:59:18.930244 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:18.930114 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjfsn\" (UniqueName: \"kubernetes.io/projected/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-kube-api-access-kjfsn\") pod \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\" (UID: \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\") "
Apr 28 19:59:18.930244 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:18.930153 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-kserve-provision-location\") pod \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\" (UID: \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\") "
Apr 28 19:59:18.930368 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:18.930301 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-proxy-tls\") pod \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\" (UID: \"d89511f6-3c88-4a60-9fc2-137df2cbf3d8\") "
Apr 28 19:59:18.930431 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:18.930398 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d89511f6-3c88-4a60-9fc2-137df2cbf3d8" (UID: "d89511f6-3c88-4a60-9fc2-137df2cbf3d8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:59:18.930505 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:18.930442 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-sklearn-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "sklearn-v2-mlserver-kube-rbac-proxy-sar-config") pod "d89511f6-3c88-4a60-9fc2-137df2cbf3d8" (UID: "d89511f6-3c88-4a60-9fc2-137df2cbf3d8"). InnerVolumeSpecName "sklearn-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:59:18.930631 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:18.930614 2571 reconciler_common.go:299] "Volume detached for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:59:18.930688 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:18.930637 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:59:18.932469 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:18.932440 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d89511f6-3c88-4a60-9fc2-137df2cbf3d8" (UID: "d89511f6-3c88-4a60-9fc2-137df2cbf3d8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:59:18.932469 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:18.932459 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-kube-api-access-kjfsn" (OuterVolumeSpecName: "kube-api-access-kjfsn") pod "d89511f6-3c88-4a60-9fc2-137df2cbf3d8" (UID: "d89511f6-3c88-4a60-9fc2-137df2cbf3d8"). InnerVolumeSpecName "kube-api-access-kjfsn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:59:19.031835 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.031753 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:59:19.031835 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.031782 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kjfsn\" (UniqueName: \"kubernetes.io/projected/d89511f6-3c88-4a60-9fc2-137df2cbf3d8-kube-api-access-kjfsn\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 19:59:19.242325 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.242293 2571 generic.go:358] "Generic (PLEG): container finished" podID="d89511f6-3c88-4a60-9fc2-137df2cbf3d8" containerID="2714f028e33f1301b4be57c439054cffb288278f2f5e569b48324399566a54c5" exitCode=0
Apr 28 19:59:19.242795 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.242384 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr"
Apr 28 19:59:19.242795 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.242385 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" event={"ID":"d89511f6-3c88-4a60-9fc2-137df2cbf3d8","Type":"ContainerDied","Data":"2714f028e33f1301b4be57c439054cffb288278f2f5e569b48324399566a54c5"}
Apr 28 19:59:19.242795 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.242428 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr" event={"ID":"d89511f6-3c88-4a60-9fc2-137df2cbf3d8","Type":"ContainerDied","Data":"5a1b61b29db39476c667f981e07d3567e90d35d4c18de1ccceb43da732c48f42"}
Apr 28 19:59:19.242795 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.242448 2571 scope.go:117] "RemoveContainer" containerID="cd80e2e75bcd43888886e49d6bdc3b19aac15a43f2f5d9b41eabe8c031ba480b"
Apr 28 19:59:19.244512 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.244473 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls" event={"ID":"25024859-dcd2-45ca-9021-11bc23b66aa6","Type":"ContainerStarted","Data":"16a946c957f87c554fb5185509bc9c1ebf6f76d2040905d247de2bb0411e7190"}
Apr 28 19:59:19.244636 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.244521 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls" event={"ID":"25024859-dcd2-45ca-9021-11bc23b66aa6","Type":"ContainerStarted","Data":"5adf0dccaa062ee37a34397745623e1ff6cb2038e20bf2ec0f17eb0670a93d84"}
Apr 28 19:59:19.244760 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.244734 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:19.244884 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.244768 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:19.246472 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.246449 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls" podUID="25024859-dcd2-45ca-9021-11bc23b66aa6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 28 19:59:19.251571 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.251473 2571 scope.go:117] "RemoveContainer" containerID="2714f028e33f1301b4be57c439054cffb288278f2f5e569b48324399566a54c5"
Apr 28 19:59:19.258468 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.258448 2571 scope.go:117] "RemoveContainer" containerID="15b476ce2e69ec914f80925ef28d0d2402ea4b8b9172bdcb010a951b9ad76c2d"
Apr 28 19:59:19.264528 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.264460 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls" podStartSLOduration=8.264448994 podStartE2EDuration="8.264448994s" podCreationTimestamp="2026-04-28 19:59:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:59:19.262731706 +0000 UTC m=+2575.893632651" watchObservedRunningTime="2026-04-28 19:59:19.264448994 +0000 UTC m=+2575.895349938"
Apr 28 19:59:19.265913 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.265897 2571 scope.go:117] "RemoveContainer" containerID="cd80e2e75bcd43888886e49d6bdc3b19aac15a43f2f5d9b41eabe8c031ba480b"
Apr 28 19:59:19.266166 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:59:19.266146 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd80e2e75bcd43888886e49d6bdc3b19aac15a43f2f5d9b41eabe8c031ba480b\": container with ID starting with cd80e2e75bcd43888886e49d6bdc3b19aac15a43f2f5d9b41eabe8c031ba480b not found: ID does not exist" containerID="cd80e2e75bcd43888886e49d6bdc3b19aac15a43f2f5d9b41eabe8c031ba480b"
Apr 28 19:59:19.266225 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.266174 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd80e2e75bcd43888886e49d6bdc3b19aac15a43f2f5d9b41eabe8c031ba480b"} err="failed to get container status \"cd80e2e75bcd43888886e49d6bdc3b19aac15a43f2f5d9b41eabe8c031ba480b\": rpc error: code = NotFound desc = could not find container \"cd80e2e75bcd43888886e49d6bdc3b19aac15a43f2f5d9b41eabe8c031ba480b\": container with ID starting with cd80e2e75bcd43888886e49d6bdc3b19aac15a43f2f5d9b41eabe8c031ba480b not found: ID does not exist"
Apr 28 19:59:19.266225 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.266190 2571 scope.go:117] "RemoveContainer" containerID="2714f028e33f1301b4be57c439054cffb288278f2f5e569b48324399566a54c5"
Apr 28 19:59:19.266428 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:59:19.266413 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2714f028e33f1301b4be57c439054cffb288278f2f5e569b48324399566a54c5\": container with ID starting with 2714f028e33f1301b4be57c439054cffb288278f2f5e569b48324399566a54c5 not found: ID does not exist" containerID="2714f028e33f1301b4be57c439054cffb288278f2f5e569b48324399566a54c5"
Apr 28 19:59:19.266470 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.266431 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2714f028e33f1301b4be57c439054cffb288278f2f5e569b48324399566a54c5"} err="failed to get container status \"2714f028e33f1301b4be57c439054cffb288278f2f5e569b48324399566a54c5\": rpc error: code = NotFound desc = could not find container \"2714f028e33f1301b4be57c439054cffb288278f2f5e569b48324399566a54c5\": container with ID starting with 2714f028e33f1301b4be57c439054cffb288278f2f5e569b48324399566a54c5 not found: ID does not exist"
Apr 28 19:59:19.266470 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.266444 2571 scope.go:117] "RemoveContainer" containerID="15b476ce2e69ec914f80925ef28d0d2402ea4b8b9172bdcb010a951b9ad76c2d"
Apr 28 19:59:19.266744 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:59:19.266722 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15b476ce2e69ec914f80925ef28d0d2402ea4b8b9172bdcb010a951b9ad76c2d\": container with ID starting with 15b476ce2e69ec914f80925ef28d0d2402ea4b8b9172bdcb010a951b9ad76c2d not found: ID does not exist" containerID="15b476ce2e69ec914f80925ef28d0d2402ea4b8b9172bdcb010a951b9ad76c2d"
Apr 28 19:59:19.266799 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.266750 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b476ce2e69ec914f80925ef28d0d2402ea4b8b9172bdcb010a951b9ad76c2d"} err="failed to get container status \"15b476ce2e69ec914f80925ef28d0d2402ea4b8b9172bdcb010a951b9ad76c2d\": rpc error: code = NotFound desc = could not find container \"15b476ce2e69ec914f80925ef28d0d2402ea4b8b9172bdcb010a951b9ad76c2d\": container with ID starting with 15b476ce2e69ec914f80925ef28d0d2402ea4b8b9172bdcb010a951b9ad76c2d not found: ID does not exist"
Apr 28 19:59:19.274794 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.274771 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr"]
Apr 28 19:59:19.278556 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.278538 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-mw5mr"]
Apr 28 19:59:19.948044 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:19.948004 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d89511f6-3c88-4a60-9fc2-137df2cbf3d8" path="/var/lib/kubelet/pods/d89511f6-3c88-4a60-9fc2-137df2cbf3d8/volumes"
Apr 28 19:59:20.249560 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:20.249449 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls" podUID="25024859-dcd2-45ca-9021-11bc23b66aa6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 28 19:59:25.253918 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:25.253890 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:25.254694 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:25.254667 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls" podUID="25024859-dcd2-45ca-9021-11bc23b66aa6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Apr 28 19:59:35.255693 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:35.255656 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:47.944978 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:47.944887 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-7c6499f57-hk2ls_25024859-dcd2-45ca-9021-11bc23b66aa6/kserve-container/0.log"
Apr 28 19:59:48.769678 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:48.769645 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"]
Apr 28 19:59:48.770053 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:48.770006 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls" podUID="25024859-dcd2-45ca-9021-11bc23b66aa6" containerName="kserve-container" containerID="cri-o://5adf0dccaa062ee37a34397745623e1ff6cb2038e20bf2ec0f17eb0670a93d84" gracePeriod=30
Apr 28 19:59:48.770166 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:48.770041 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls" podUID="25024859-dcd2-45ca-9021-11bc23b66aa6" containerName="kube-rbac-proxy" containerID="cri-o://16a946c957f87c554fb5185509bc9c1ebf6f76d2040905d247de2bb0411e7190" gracePeriod=30
Apr 28 19:59:48.988142 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:48.988107 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"]
Apr 28 19:59:48.988535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:48.988449 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d89511f6-3c88-4a60-9fc2-137df2cbf3d8" containerName="kserve-container"
Apr 28 19:59:48.988535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:48.988460 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89511f6-3c88-4a60-9fc2-137df2cbf3d8" containerName="kserve-container"
Apr 28 19:59:48.988535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:48.988469 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d89511f6-3c88-4a60-9fc2-137df2cbf3d8" containerName="kube-rbac-proxy"
Apr 28 19:59:48.988535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:48.988474 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89511f6-3c88-4a60-9fc2-137df2cbf3d8" containerName="kube-rbac-proxy"
Apr 28 19:59:48.988535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:48.988503 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d89511f6-3c88-4a60-9fc2-137df2cbf3d8" containerName="storage-initializer"
Apr 28 19:59:48.988535 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:48.988510 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89511f6-3c88-4a60-9fc2-137df2cbf3d8" containerName="storage-initializer"
Apr 28 19:59:48.988747 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:48.988561 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d89511f6-3c88-4a60-9fc2-137df2cbf3d8" containerName="kserve-container"
Apr 28 19:59:48.988747 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:48.988570 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d89511f6-3c88-4a60-9fc2-137df2cbf3d8" containerName="kube-rbac-proxy"
Apr 28 19:59:48.990684 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:48.990664 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"
Apr 28 19:59:48.992585 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:48.992563 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\""
Apr 28 19:59:48.992700 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:48.992583 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-predictor-serving-cert\""
Apr 28 19:59:49.001776 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.001753 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"]
Apr 28 19:59:49.085341 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.085252 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/497238a1-bb9e-4d01-af16-2026e059363e-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr\" (UID: \"497238a1-bb9e-4d01-af16-2026e059363e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"
Apr 28 19:59:49.085341 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.085303 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/497238a1-bb9e-4d01-af16-2026e059363e-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr\" (UID: \"497238a1-bb9e-4d01-af16-2026e059363e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"
Apr 28 19:59:49.085341 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.085330 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/497238a1-bb9e-4d01-af16-2026e059363e-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr\" (UID: \"497238a1-bb9e-4d01-af16-2026e059363e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"
Apr 28 19:59:49.085605 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.085429 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk4tk\" (UniqueName: \"kubernetes.io/projected/497238a1-bb9e-4d01-af16-2026e059363e-kube-api-access-jk4tk\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr\" (UID: \"497238a1-bb9e-4d01-af16-2026e059363e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"
Apr 28 19:59:49.186635 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.186595 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jk4tk\" (UniqueName: \"kubernetes.io/projected/497238a1-bb9e-4d01-af16-2026e059363e-kube-api-access-jk4tk\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr\" (UID: \"497238a1-bb9e-4d01-af16-2026e059363e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"
Apr 28 19:59:49.186827 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.186659 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/497238a1-bb9e-4d01-af16-2026e059363e-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr\" (UID: \"497238a1-bb9e-4d01-af16-2026e059363e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"
Apr 28 19:59:49.186827 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.186698 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/497238a1-bb9e-4d01-af16-2026e059363e-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr\" (UID: \"497238a1-bb9e-4d01-af16-2026e059363e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"
Apr 28 19:59:49.186827 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.186724 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/497238a1-bb9e-4d01-af16-2026e059363e-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr\" (UID: \"497238a1-bb9e-4d01-af16-2026e059363e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"
Apr 28 19:59:49.187165 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.187145 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/497238a1-bb9e-4d01-af16-2026e059363e-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr\" (UID: \"497238a1-bb9e-4d01-af16-2026e059363e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"
Apr 28 19:59:49.187361 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.187342 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/497238a1-bb9e-4d01-af16-2026e059363e-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr\" (UID: \"497238a1-bb9e-4d01-af16-2026e059363e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"
Apr 28 19:59:49.189103 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.189085 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/497238a1-bb9e-4d01-af16-2026e059363e-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr\" (UID: \"497238a1-bb9e-4d01-af16-2026e059363e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"
Apr 28 19:59:49.196071 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.196050 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk4tk\" (UniqueName: \"kubernetes.io/projected/497238a1-bb9e-4d01-af16-2026e059363e-kube-api-access-jk4tk\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr\" (UID: \"497238a1-bb9e-4d01-af16-2026e059363e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"
Apr 28 19:59:49.300639 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.300601 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"
Apr 28 19:59:49.335735 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.335661 2571 generic.go:358] "Generic (PLEG): container finished" podID="25024859-dcd2-45ca-9021-11bc23b66aa6" containerID="16a946c957f87c554fb5185509bc9c1ebf6f76d2040905d247de2bb0411e7190" exitCode=2
Apr 28 19:59:49.335735 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.335710 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls" event={"ID":"25024859-dcd2-45ca-9021-11bc23b66aa6","Type":"ContainerDied","Data":"16a946c957f87c554fb5185509bc9c1ebf6f76d2040905d247de2bb0411e7190"}
Apr 28 19:59:49.431143 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.431096 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"]
Apr 28 19:59:49.433351 ip-10-0-139-128 kubenswrapper[2571]: W0428 19:59:49.433322 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod497238a1_bb9e_4d01_af16_2026e059363e.slice/crio-a5abe3c5be99eb70402ceac32aa3e83a5e46d827ee92f2a7411d926e3d16a936 WatchSource:0}: Error finding container a5abe3c5be99eb70402ceac32aa3e83a5e46d827ee92f2a7411d926e3d16a936: Status 404 returned error can't find the container with id a5abe3c5be99eb70402ceac32aa3e83a5e46d827ee92f2a7411d926e3d16a936
Apr 28 19:59:49.816560 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.816535 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"
Apr 28 19:59:49.890516 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.890427 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25024859-dcd2-45ca-9021-11bc23b66aa6-kserve-provision-location\") pod \"25024859-dcd2-45ca-9021-11bc23b66aa6\" (UID: \"25024859-dcd2-45ca-9021-11bc23b66aa6\") "
Apr 28 19:59:49.890516 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.890467 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25024859-dcd2-45ca-9021-11bc23b66aa6-proxy-tls\") pod \"25024859-dcd2-45ca-9021-11bc23b66aa6\" (UID: \"25024859-dcd2-45ca-9021-11bc23b66aa6\") "
Apr 28 19:59:49.890516 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.890511 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzm57\" (UniqueName: \"kubernetes.io/projected/25024859-dcd2-45ca-9021-11bc23b66aa6-kube-api-access-vzm57\") pod \"25024859-dcd2-45ca-9021-11bc23b66aa6\" (UID: \"25024859-dcd2-45ca-9021-11bc23b66aa6\") "
Apr 28 19:59:49.890808 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.890562 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/25024859-dcd2-45ca-9021-11bc23b66aa6-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"25024859-dcd2-45ca-9021-11bc23b66aa6\" (UID: \"25024859-dcd2-45ca-9021-11bc23b66aa6\") "
Apr 28 19:59:49.890942 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.890912 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25024859-dcd2-45ca-9021-11bc23b66aa6-isvc-sklearn-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName:
"isvc-sklearn-runtime-kube-rbac-proxy-sar-config") pod "25024859-dcd2-45ca-9021-11bc23b66aa6" (UID: "25024859-dcd2-45ca-9021-11bc23b66aa6"). InnerVolumeSpecName "isvc-sklearn-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:59:49.892662 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.892631 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25024859-dcd2-45ca-9021-11bc23b66aa6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "25024859-dcd2-45ca-9021-11bc23b66aa6" (UID: "25024859-dcd2-45ca-9021-11bc23b66aa6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:59:49.892744 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.892682 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25024859-dcd2-45ca-9021-11bc23b66aa6-kube-api-access-vzm57" (OuterVolumeSpecName: "kube-api-access-vzm57") pod "25024859-dcd2-45ca-9021-11bc23b66aa6" (UID: "25024859-dcd2-45ca-9021-11bc23b66aa6"). InnerVolumeSpecName "kube-api-access-vzm57". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:59:49.927123 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.927086 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25024859-dcd2-45ca-9021-11bc23b66aa6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "25024859-dcd2-45ca-9021-11bc23b66aa6" (UID: "25024859-dcd2-45ca-9021-11bc23b66aa6"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:59:49.991475 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.991445 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vzm57\" (UniqueName: \"kubernetes.io/projected/25024859-dcd2-45ca-9021-11bc23b66aa6-kube-api-access-vzm57\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:59:49.991475 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.991474 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/25024859-dcd2-45ca-9021-11bc23b66aa6-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:59:49.991860 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.991503 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25024859-dcd2-45ca-9021-11bc23b66aa6-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:59:49.991860 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:49.991516 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25024859-dcd2-45ca-9021-11bc23b66aa6-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 19:59:50.340046 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:50.340013 2571 generic.go:358] "Generic (PLEG): container finished" podID="25024859-dcd2-45ca-9021-11bc23b66aa6" containerID="5adf0dccaa062ee37a34397745623e1ff6cb2038e20bf2ec0f17eb0670a93d84" exitCode=0 Apr 28 19:59:50.340211 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:50.340079 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls" 
event={"ID":"25024859-dcd2-45ca-9021-11bc23b66aa6","Type":"ContainerDied","Data":"5adf0dccaa062ee37a34397745623e1ff6cb2038e20bf2ec0f17eb0670a93d84"} Apr 28 19:59:50.340211 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:50.340105 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls" event={"ID":"25024859-dcd2-45ca-9021-11bc23b66aa6","Type":"ContainerDied","Data":"ef90ac5820aa631b2e831230049778d08741e09ec1566c588a0b3aec4e7a75c3"} Apr 28 19:59:50.340211 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:50.340105 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls" Apr 28 19:59:50.340211 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:50.340133 2571 scope.go:117] "RemoveContainer" containerID="16a946c957f87c554fb5185509bc9c1ebf6f76d2040905d247de2bb0411e7190" Apr 28 19:59:50.341595 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:50.341571 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" event={"ID":"497238a1-bb9e-4d01-af16-2026e059363e","Type":"ContainerStarted","Data":"4cd6fa8dc6ef3ebe823c20d3ef593a45fc324128e18367021b0ec51b27986d7a"} Apr 28 19:59:50.341713 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:50.341602 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" event={"ID":"497238a1-bb9e-4d01-af16-2026e059363e","Type":"ContainerStarted","Data":"a5abe3c5be99eb70402ceac32aa3e83a5e46d827ee92f2a7411d926e3d16a936"} Apr 28 19:59:50.348164 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:50.348147 2571 scope.go:117] "RemoveContainer" containerID="5adf0dccaa062ee37a34397745623e1ff6cb2038e20bf2ec0f17eb0670a93d84" Apr 28 19:59:50.354966 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:50.354950 2571 scope.go:117] "RemoveContainer" 
containerID="cc71c5c944af9c20e93365b9a5ca3e6391cd61edc1ea700015a8a1e280f936cf" Apr 28 19:59:50.362038 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:50.362022 2571 scope.go:117] "RemoveContainer" containerID="16a946c957f87c554fb5185509bc9c1ebf6f76d2040905d247de2bb0411e7190" Apr 28 19:59:50.362365 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:59:50.362303 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16a946c957f87c554fb5185509bc9c1ebf6f76d2040905d247de2bb0411e7190\": container with ID starting with 16a946c957f87c554fb5185509bc9c1ebf6f76d2040905d247de2bb0411e7190 not found: ID does not exist" containerID="16a946c957f87c554fb5185509bc9c1ebf6f76d2040905d247de2bb0411e7190" Apr 28 19:59:50.362365 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:50.362339 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a946c957f87c554fb5185509bc9c1ebf6f76d2040905d247de2bb0411e7190"} err="failed to get container status \"16a946c957f87c554fb5185509bc9c1ebf6f76d2040905d247de2bb0411e7190\": rpc error: code = NotFound desc = could not find container \"16a946c957f87c554fb5185509bc9c1ebf6f76d2040905d247de2bb0411e7190\": container with ID starting with 16a946c957f87c554fb5185509bc9c1ebf6f76d2040905d247de2bb0411e7190 not found: ID does not exist" Apr 28 19:59:50.362365 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:50.362362 2571 scope.go:117] "RemoveContainer" containerID="5adf0dccaa062ee37a34397745623e1ff6cb2038e20bf2ec0f17eb0670a93d84" Apr 28 19:59:50.362841 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:59:50.362708 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5adf0dccaa062ee37a34397745623e1ff6cb2038e20bf2ec0f17eb0670a93d84\": container with ID starting with 5adf0dccaa062ee37a34397745623e1ff6cb2038e20bf2ec0f17eb0670a93d84 not found: ID does not exist" 
containerID="5adf0dccaa062ee37a34397745623e1ff6cb2038e20bf2ec0f17eb0670a93d84" Apr 28 19:59:50.362841 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:50.362728 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5adf0dccaa062ee37a34397745623e1ff6cb2038e20bf2ec0f17eb0670a93d84"} err="failed to get container status \"5adf0dccaa062ee37a34397745623e1ff6cb2038e20bf2ec0f17eb0670a93d84\": rpc error: code = NotFound desc = could not find container \"5adf0dccaa062ee37a34397745623e1ff6cb2038e20bf2ec0f17eb0670a93d84\": container with ID starting with 5adf0dccaa062ee37a34397745623e1ff6cb2038e20bf2ec0f17eb0670a93d84 not found: ID does not exist" Apr 28 19:59:50.362841 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:50.362744 2571 scope.go:117] "RemoveContainer" containerID="cc71c5c944af9c20e93365b9a5ca3e6391cd61edc1ea700015a8a1e280f936cf" Apr 28 19:59:50.363117 ip-10-0-139-128 kubenswrapper[2571]: E0428 19:59:50.363097 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc71c5c944af9c20e93365b9a5ca3e6391cd61edc1ea700015a8a1e280f936cf\": container with ID starting with cc71c5c944af9c20e93365b9a5ca3e6391cd61edc1ea700015a8a1e280f936cf not found: ID does not exist" containerID="cc71c5c944af9c20e93365b9a5ca3e6391cd61edc1ea700015a8a1e280f936cf" Apr 28 19:59:50.363189 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:50.363123 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc71c5c944af9c20e93365b9a5ca3e6391cd61edc1ea700015a8a1e280f936cf"} err="failed to get container status \"cc71c5c944af9c20e93365b9a5ca3e6391cd61edc1ea700015a8a1e280f936cf\": rpc error: code = NotFound desc = could not find container \"cc71c5c944af9c20e93365b9a5ca3e6391cd61edc1ea700015a8a1e280f936cf\": container with ID starting with cc71c5c944af9c20e93365b9a5ca3e6391cd61edc1ea700015a8a1e280f936cf not found: ID does not exist" Apr 28 
19:59:50.376028 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:50.376007 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"] Apr 28 19:59:50.378737 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:50.378715 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-7c6499f57-hk2ls"] Apr 28 19:59:51.947188 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:51.947158 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25024859-dcd2-45ca-9021-11bc23b66aa6" path="/var/lib/kubelet/pods/25024859-dcd2-45ca-9021-11bc23b66aa6/volumes" Apr 28 19:59:53.352298 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:53.352266 2571 generic.go:358] "Generic (PLEG): container finished" podID="497238a1-bb9e-4d01-af16-2026e059363e" containerID="4cd6fa8dc6ef3ebe823c20d3ef593a45fc324128e18367021b0ec51b27986d7a" exitCode=0 Apr 28 19:59:53.352688 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:53.352311 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" event={"ID":"497238a1-bb9e-4d01-af16-2026e059363e","Type":"ContainerDied","Data":"4cd6fa8dc6ef3ebe823c20d3ef593a45fc324128e18367021b0ec51b27986d7a"} Apr 28 19:59:54.357224 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:54.357189 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" event={"ID":"497238a1-bb9e-4d01-af16-2026e059363e","Type":"ContainerStarted","Data":"f69a2c54f2f7c66e7210b77f9889919118e0eb0bfd07288a2e72d3c4d0f92691"} Apr 28 19:59:54.357704 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:54.357230 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" 
event={"ID":"497238a1-bb9e-4d01-af16-2026e059363e","Type":"ContainerStarted","Data":"f4543cb430a7ec616071f22df5565930e5b1de3b807e24d0b033c7b572433f45"} Apr 28 19:59:54.357704 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:54.357456 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" Apr 28 19:59:54.377721 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:54.377664 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" podStartSLOduration=6.377651525 podStartE2EDuration="6.377651525s" podCreationTimestamp="2026-04-28 19:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:59:54.375241148 +0000 UTC m=+2611.006142094" watchObservedRunningTime="2026-04-28 19:59:54.377651525 +0000 UTC m=+2611.008552470" Apr 28 19:59:55.360711 ip-10-0-139-128 kubenswrapper[2571]: I0428 19:59:55.360684 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" Apr 28 20:00:01.368991 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:01.368963 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" Apr 28 20:00:31.387474 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:31.387428 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" podUID="497238a1-bb9e-4d01-af16-2026e059363e" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 28 20:00:41.371786 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:41.371747 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" Apr 28 20:00:48.877074 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:48.877037 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"] Apr 28 20:00:48.877638 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:48.877343 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" podUID="497238a1-bb9e-4d01-af16-2026e059363e" containerName="kserve-container" containerID="cri-o://f4543cb430a7ec616071f22df5565930e5b1de3b807e24d0b033c7b572433f45" gracePeriod=30 Apr 28 20:00:48.877638 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:48.877427 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" podUID="497238a1-bb9e-4d01-af16-2026e059363e" containerName="kube-rbac-proxy" containerID="cri-o://f69a2c54f2f7c66e7210b77f9889919118e0eb0bfd07288a2e72d3c4d0f92691" gracePeriod=30 Apr 28 20:00:49.086694 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.086661 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5"] Apr 28 20:00:49.087069 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.087051 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25024859-dcd2-45ca-9021-11bc23b66aa6" containerName="kserve-container" Apr 28 20:00:49.087156 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.087071 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="25024859-dcd2-45ca-9021-11bc23b66aa6" containerName="kserve-container" Apr 28 20:00:49.087156 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.087083 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25024859-dcd2-45ca-9021-11bc23b66aa6" 
containerName="kube-rbac-proxy" Apr 28 20:00:49.087156 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.087091 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="25024859-dcd2-45ca-9021-11bc23b66aa6" containerName="kube-rbac-proxy" Apr 28 20:00:49.087156 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.087117 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25024859-dcd2-45ca-9021-11bc23b66aa6" containerName="storage-initializer" Apr 28 20:00:49.087156 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.087127 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="25024859-dcd2-45ca-9021-11bc23b66aa6" containerName="storage-initializer" Apr 28 20:00:49.087418 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.087213 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="25024859-dcd2-45ca-9021-11bc23b66aa6" containerName="kube-rbac-proxy" Apr 28 20:00:49.087418 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.087230 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="25024859-dcd2-45ca-9021-11bc23b66aa6" containerName="kserve-container" Apr 28 20:00:49.090316 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.090296 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:00:49.093315 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.093294 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-predictor-serving-cert\"" Apr 28 20:00:49.093459 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.093409 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 28 20:00:49.100981 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.100951 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5"] Apr 28 20:00:49.185907 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.185868 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htwdw\" (UniqueName: \"kubernetes.io/projected/883551c2-5277-41f4-bb94-0872c41c6024-kube-api-access-htwdw\") pod \"isvc-sklearn-v2-predictor-64fcb8589f-wgmk5\" (UID: \"883551c2-5277-41f4-bb94-0872c41c6024\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:00:49.186073 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.185933 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/883551c2-5277-41f4-bb94-0872c41c6024-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-64fcb8589f-wgmk5\" (UID: \"883551c2-5277-41f4-bb94-0872c41c6024\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:00:49.186073 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.185989 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/883551c2-5277-41f4-bb94-0872c41c6024-proxy-tls\") pod \"isvc-sklearn-v2-predictor-64fcb8589f-wgmk5\" (UID: \"883551c2-5277-41f4-bb94-0872c41c6024\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:00:49.186181 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.186085 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/883551c2-5277-41f4-bb94-0872c41c6024-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-64fcb8589f-wgmk5\" (UID: \"883551c2-5277-41f4-bb94-0872c41c6024\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:00:49.287463 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.287425 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/883551c2-5277-41f4-bb94-0872c41c6024-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-64fcb8589f-wgmk5\" (UID: \"883551c2-5277-41f4-bb94-0872c41c6024\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:00:49.287663 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.287514 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htwdw\" (UniqueName: \"kubernetes.io/projected/883551c2-5277-41f4-bb94-0872c41c6024-kube-api-access-htwdw\") pod \"isvc-sklearn-v2-predictor-64fcb8589f-wgmk5\" (UID: \"883551c2-5277-41f4-bb94-0872c41c6024\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:00:49.287663 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.287539 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/883551c2-5277-41f4-bb94-0872c41c6024-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-64fcb8589f-wgmk5\" (UID: \"883551c2-5277-41f4-bb94-0872c41c6024\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:00:49.287663 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.287560 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/883551c2-5277-41f4-bb94-0872c41c6024-proxy-tls\") pod \"isvc-sklearn-v2-predictor-64fcb8589f-wgmk5\" (UID: \"883551c2-5277-41f4-bb94-0872c41c6024\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:00:49.287997 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.287970 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/883551c2-5277-41f4-bb94-0872c41c6024-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-64fcb8589f-wgmk5\" (UID: \"883551c2-5277-41f4-bb94-0872c41c6024\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:00:49.288223 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.288206 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/883551c2-5277-41f4-bb94-0872c41c6024-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-64fcb8589f-wgmk5\" (UID: \"883551c2-5277-41f4-bb94-0872c41c6024\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:00:49.290099 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.290076 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/883551c2-5277-41f4-bb94-0872c41c6024-proxy-tls\") pod \"isvc-sklearn-v2-predictor-64fcb8589f-wgmk5\" (UID: \"883551c2-5277-41f4-bb94-0872c41c6024\") 
" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:00:49.295117 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.295090 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htwdw\" (UniqueName: \"kubernetes.io/projected/883551c2-5277-41f4-bb94-0872c41c6024-kube-api-access-htwdw\") pod \"isvc-sklearn-v2-predictor-64fcb8589f-wgmk5\" (UID: \"883551c2-5277-41f4-bb94-0872c41c6024\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:00:49.401781 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.401746 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:00:49.520998 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.520963 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5"] Apr 28 20:00:49.522777 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.522750 2571 generic.go:358] "Generic (PLEG): container finished" podID="497238a1-bb9e-4d01-af16-2026e059363e" containerID="f69a2c54f2f7c66e7210b77f9889919118e0eb0bfd07288a2e72d3c4d0f92691" exitCode=2 Apr 28 20:00:49.522873 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:49.522820 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" event={"ID":"497238a1-bb9e-4d01-af16-2026e059363e","Type":"ContainerDied","Data":"f69a2c54f2f7c66e7210b77f9889919118e0eb0bfd07288a2e72d3c4d0f92691"} Apr 28 20:00:49.526870 ip-10-0-139-128 kubenswrapper[2571]: W0428 20:00:49.525604 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod883551c2_5277_41f4_bb94_0872c41c6024.slice/crio-36ef917400d911b427e53eaedfff9c9036d476580232f0af07bf396dd78f0783 WatchSource:0}: Error finding container 
36ef917400d911b427e53eaedfff9c9036d476580232f0af07bf396dd78f0783: Status 404 returned error can't find the container with id 36ef917400d911b427e53eaedfff9c9036d476580232f0af07bf396dd78f0783 Apr 28 20:00:50.527275 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:50.527235 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" event={"ID":"883551c2-5277-41f4-bb94-0872c41c6024","Type":"ContainerStarted","Data":"97dd3e6636691625d3af6efecbe6790b7668dc15358d39b37a0211028ab37398"} Apr 28 20:00:50.527275 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:50.527274 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" event={"ID":"883551c2-5277-41f4-bb94-0872c41c6024","Type":"ContainerStarted","Data":"36ef917400d911b427e53eaedfff9c9036d476580232f0af07bf396dd78f0783"} Apr 28 20:00:51.363930 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:51.363888 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" podUID="497238a1-bb9e-4d01-af16-2026e059363e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.45:8643/healthz\": dial tcp 10.134.0.45:8643: connect: connection refused" Apr 28 20:00:52.411689 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:52.411641 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" podUID="497238a1-bb9e-4d01-af16-2026e059363e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.45:8080/v2/models/isvc-sklearn-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 28 20:00:53.538028 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:53.537994 2571 generic.go:358] "Generic (PLEG): container finished" podID="883551c2-5277-41f4-bb94-0872c41c6024" 
containerID="97dd3e6636691625d3af6efecbe6790b7668dc15358d39b37a0211028ab37398" exitCode=0 Apr 28 20:00:53.538399 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:53.538065 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" event={"ID":"883551c2-5277-41f4-bb94-0872c41c6024","Type":"ContainerDied","Data":"97dd3e6636691625d3af6efecbe6790b7668dc15358d39b37a0211028ab37398"} Apr 28 20:00:54.543495 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:54.543446 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" event={"ID":"883551c2-5277-41f4-bb94-0872c41c6024","Type":"ContainerStarted","Data":"c688c19c5ccb7f2cbb0e1403c9aba023d18c858d461b7130a4a2eb4bfaf1985c"} Apr 28 20:00:54.543993 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:54.543514 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" event={"ID":"883551c2-5277-41f4-bb94-0872c41c6024","Type":"ContainerStarted","Data":"0308f76362b457b37f8ac7ffcc86c015791cc480a1b6fc00a9604689dbd45a62"} Apr 28 20:00:54.543993 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:54.543798 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:00:54.543993 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:54.543930 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:00:54.545003 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:54.544976 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 28 
20:00:54.561407 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:54.561357 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" podStartSLOduration=5.561343488 podStartE2EDuration="5.561343488s" podCreationTimestamp="2026-04-28 20:00:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:00:54.559920872 +0000 UTC m=+2671.190821819" watchObservedRunningTime="2026-04-28 20:00:54.561343488 +0000 UTC m=+2671.192244433" Apr 28 20:00:55.546236 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:55.546198 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 28 20:00:56.364056 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:56.364011 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" podUID="497238a1-bb9e-4d01-af16-2026e059363e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.45:8643/healthz\": dial tcp 10.134.0.45:8643: connect: connection refused" Apr 28 20:00:56.715981 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:56.715958 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" Apr 28 20:00:56.853792 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:56.853753 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/497238a1-bb9e-4d01-af16-2026e059363e-proxy-tls\") pod \"497238a1-bb9e-4d01-af16-2026e059363e\" (UID: \"497238a1-bb9e-4d01-af16-2026e059363e\") " Apr 28 20:00:56.853983 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:56.853809 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/497238a1-bb9e-4d01-af16-2026e059363e-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"497238a1-bb9e-4d01-af16-2026e059363e\" (UID: \"497238a1-bb9e-4d01-af16-2026e059363e\") " Apr 28 20:00:56.853983 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:56.853857 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/497238a1-bb9e-4d01-af16-2026e059363e-kserve-provision-location\") pod \"497238a1-bb9e-4d01-af16-2026e059363e\" (UID: \"497238a1-bb9e-4d01-af16-2026e059363e\") " Apr 28 20:00:56.853983 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:56.853900 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk4tk\" (UniqueName: \"kubernetes.io/projected/497238a1-bb9e-4d01-af16-2026e059363e-kube-api-access-jk4tk\") pod \"497238a1-bb9e-4d01-af16-2026e059363e\" (UID: \"497238a1-bb9e-4d01-af16-2026e059363e\") " Apr 28 20:00:56.854255 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:56.854226 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/497238a1-bb9e-4d01-af16-2026e059363e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"497238a1-bb9e-4d01-af16-2026e059363e" (UID: "497238a1-bb9e-4d01-af16-2026e059363e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:00:56.854319 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:56.854230 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/497238a1-bb9e-4d01-af16-2026e059363e-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config") pod "497238a1-bb9e-4d01-af16-2026e059363e" (UID: "497238a1-bb9e-4d01-af16-2026e059363e"). InnerVolumeSpecName "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:00:56.856088 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:56.856066 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497238a1-bb9e-4d01-af16-2026e059363e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "497238a1-bb9e-4d01-af16-2026e059363e" (UID: "497238a1-bb9e-4d01-af16-2026e059363e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:00:56.856167 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:56.856086 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/497238a1-bb9e-4d01-af16-2026e059363e-kube-api-access-jk4tk" (OuterVolumeSpecName: "kube-api-access-jk4tk") pod "497238a1-bb9e-4d01-af16-2026e059363e" (UID: "497238a1-bb9e-4d01-af16-2026e059363e"). InnerVolumeSpecName "kube-api-access-jk4tk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:00:56.955150 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:56.955115 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/497238a1-bb9e-4d01-af16-2026e059363e-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:00:56.955150 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:56.955145 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jk4tk\" (UniqueName: \"kubernetes.io/projected/497238a1-bb9e-4d01-af16-2026e059363e-kube-api-access-jk4tk\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:00:56.955150 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:56.955156 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/497238a1-bb9e-4d01-af16-2026e059363e-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:00:56.955377 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:56.955166 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/497238a1-bb9e-4d01-af16-2026e059363e-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:00:57.553416 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:57.553379 2571 generic.go:358] "Generic (PLEG): container finished" podID="497238a1-bb9e-4d01-af16-2026e059363e" containerID="f4543cb430a7ec616071f22df5565930e5b1de3b807e24d0b033c7b572433f45" exitCode=0 Apr 28 20:00:57.553643 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:57.553458 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" Apr 28 20:00:57.553643 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:57.553467 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" event={"ID":"497238a1-bb9e-4d01-af16-2026e059363e","Type":"ContainerDied","Data":"f4543cb430a7ec616071f22df5565930e5b1de3b807e24d0b033c7b572433f45"} Apr 28 20:00:57.553643 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:57.553537 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr" event={"ID":"497238a1-bb9e-4d01-af16-2026e059363e","Type":"ContainerDied","Data":"a5abe3c5be99eb70402ceac32aa3e83a5e46d827ee92f2a7411d926e3d16a936"} Apr 28 20:00:57.553643 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:57.553560 2571 scope.go:117] "RemoveContainer" containerID="f69a2c54f2f7c66e7210b77f9889919118e0eb0bfd07288a2e72d3c4d0f92691" Apr 28 20:00:57.562265 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:57.562250 2571 scope.go:117] "RemoveContainer" containerID="f4543cb430a7ec616071f22df5565930e5b1de3b807e24d0b033c7b572433f45" Apr 28 20:00:57.569357 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:57.569341 2571 scope.go:117] "RemoveContainer" containerID="4cd6fa8dc6ef3ebe823c20d3ef593a45fc324128e18367021b0ec51b27986d7a" Apr 28 20:00:57.577835 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:57.577572 2571 scope.go:117] "RemoveContainer" containerID="f69a2c54f2f7c66e7210b77f9889919118e0eb0bfd07288a2e72d3c4d0f92691" Apr 28 20:00:57.577961 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:00:57.577905 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f69a2c54f2f7c66e7210b77f9889919118e0eb0bfd07288a2e72d3c4d0f92691\": container with ID starting with f69a2c54f2f7c66e7210b77f9889919118e0eb0bfd07288a2e72d3c4d0f92691 not 
found: ID does not exist" containerID="f69a2c54f2f7c66e7210b77f9889919118e0eb0bfd07288a2e72d3c4d0f92691" Apr 28 20:00:57.578027 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:57.577968 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f69a2c54f2f7c66e7210b77f9889919118e0eb0bfd07288a2e72d3c4d0f92691"} err="failed to get container status \"f69a2c54f2f7c66e7210b77f9889919118e0eb0bfd07288a2e72d3c4d0f92691\": rpc error: code = NotFound desc = could not find container \"f69a2c54f2f7c66e7210b77f9889919118e0eb0bfd07288a2e72d3c4d0f92691\": container with ID starting with f69a2c54f2f7c66e7210b77f9889919118e0eb0bfd07288a2e72d3c4d0f92691 not found: ID does not exist" Apr 28 20:00:57.578027 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:57.577996 2571 scope.go:117] "RemoveContainer" containerID="f4543cb430a7ec616071f22df5565930e5b1de3b807e24d0b033c7b572433f45" Apr 28 20:00:57.578356 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:00:57.578333 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4543cb430a7ec616071f22df5565930e5b1de3b807e24d0b033c7b572433f45\": container with ID starting with f4543cb430a7ec616071f22df5565930e5b1de3b807e24d0b033c7b572433f45 not found: ID does not exist" containerID="f4543cb430a7ec616071f22df5565930e5b1de3b807e24d0b033c7b572433f45" Apr 28 20:00:57.578440 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:57.578372 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4543cb430a7ec616071f22df5565930e5b1de3b807e24d0b033c7b572433f45"} err="failed to get container status \"f4543cb430a7ec616071f22df5565930e5b1de3b807e24d0b033c7b572433f45\": rpc error: code = NotFound desc = could not find container \"f4543cb430a7ec616071f22df5565930e5b1de3b807e24d0b033c7b572433f45\": container with ID starting with f4543cb430a7ec616071f22df5565930e5b1de3b807e24d0b033c7b572433f45 not found: ID does 
not exist" Apr 28 20:00:57.578440 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:57.578394 2571 scope.go:117] "RemoveContainer" containerID="4cd6fa8dc6ef3ebe823c20d3ef593a45fc324128e18367021b0ec51b27986d7a" Apr 28 20:00:57.578916 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:57.578623 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"] Apr 28 20:00:57.578916 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:00:57.578772 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cd6fa8dc6ef3ebe823c20d3ef593a45fc324128e18367021b0ec51b27986d7a\": container with ID starting with 4cd6fa8dc6ef3ebe823c20d3ef593a45fc324128e18367021b0ec51b27986d7a not found: ID does not exist" containerID="4cd6fa8dc6ef3ebe823c20d3ef593a45fc324128e18367021b0ec51b27986d7a" Apr 28 20:00:57.578916 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:57.578812 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cd6fa8dc6ef3ebe823c20d3ef593a45fc324128e18367021b0ec51b27986d7a"} err="failed to get container status \"4cd6fa8dc6ef3ebe823c20d3ef593a45fc324128e18367021b0ec51b27986d7a\": rpc error: code = NotFound desc = could not find container \"4cd6fa8dc6ef3ebe823c20d3ef593a45fc324128e18367021b0ec51b27986d7a\": container with ID starting with 4cd6fa8dc6ef3ebe823c20d3ef593a45fc324128e18367021b0ec51b27986d7a not found: ID does not exist" Apr 28 20:00:57.582608 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:57.582586 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-8ctdr"] Apr 28 20:00:57.946964 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:00:57.946929 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="497238a1-bb9e-4d01-af16-2026e059363e" 
path="/var/lib/kubelet/pods/497238a1-bb9e-4d01-af16-2026e059363e/volumes" Apr 28 20:01:00.551431 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:01:00.551401 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:01:00.551965 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:01:00.551938 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 28 20:01:10.552245 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:01:10.552167 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 28 20:01:20.552459 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:01:20.552417 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 28 20:01:24.026551 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:01:24.026466 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 20:01:24.033823 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:01:24.033800 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 20:01:30.552316 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:01:30.552273 
2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 28 20:01:40.552046 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:01:40.552002 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 28 20:01:50.554765 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:01:50.554717 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 28 20:02:00.552903 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:00.552873 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:02:09.072681 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.072648 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5"] Apr 28 20:02:09.073066 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.073003 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="kube-rbac-proxy" containerID="cri-o://c688c19c5ccb7f2cbb0e1403c9aba023d18c858d461b7130a4a2eb4bfaf1985c" gracePeriod=30 Apr 28 20:02:09.073121 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.072983 2571 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="kserve-container" containerID="cri-o://0308f76362b457b37f8ac7ffcc86c015791cc480a1b6fc00a9604689dbd45a62" gracePeriod=30 Apr 28 20:02:09.288071 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.288030 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p"] Apr 28 20:02:09.288356 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.288344 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="497238a1-bb9e-4d01-af16-2026e059363e" containerName="storage-initializer" Apr 28 20:02:09.288413 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.288358 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="497238a1-bb9e-4d01-af16-2026e059363e" containerName="storage-initializer" Apr 28 20:02:09.288413 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.288371 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="497238a1-bb9e-4d01-af16-2026e059363e" containerName="kserve-container" Apr 28 20:02:09.288413 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.288377 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="497238a1-bb9e-4d01-af16-2026e059363e" containerName="kserve-container" Apr 28 20:02:09.288413 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.288388 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="497238a1-bb9e-4d01-af16-2026e059363e" containerName="kube-rbac-proxy" Apr 28 20:02:09.288413 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.288394 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="497238a1-bb9e-4d01-af16-2026e059363e" containerName="kube-rbac-proxy" Apr 28 20:02:09.288600 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.288452 2571 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="497238a1-bb9e-4d01-af16-2026e059363e" containerName="kserve-container" Apr 28 20:02:09.288600 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.288462 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="497238a1-bb9e-4d01-af16-2026e059363e" containerName="kube-rbac-proxy" Apr 28 20:02:09.291611 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.291593 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:02:09.293542 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.293515 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-predictor-serving-cert\"" Apr 28 20:02:09.293542 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.293531 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\"" Apr 28 20:02:09.301386 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.301364 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p"] Apr 28 20:02:09.323795 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.323725 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v56fh\" (UniqueName: \"kubernetes.io/projected/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-kube-api-access-v56fh\") pod \"isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p\" (UID: \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:02:09.323908 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.323793 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p\" (UID: \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:02:09.323908 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.323813 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p\" (UID: \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:02:09.323908 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.323834 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p\" (UID: \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:02:09.424612 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.424579 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p\" (UID: \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:02:09.424612 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.424615 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p\" (UID: \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:02:09.424858 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.424640 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p\" (UID: \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:02:09.424858 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.424668 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v56fh\" (UniqueName: \"kubernetes.io/projected/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-kube-api-access-v56fh\") pod \"isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p\" (UID: \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:02:09.425049 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.425025 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p\" (UID: \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:02:09.425345 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.425325 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p\" (UID: \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:02:09.427082 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.427064 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p\" (UID: \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:02:09.431932 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.431910 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v56fh\" (UniqueName: \"kubernetes.io/projected/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-kube-api-access-v56fh\") pod \"isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p\" (UID: \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:02:09.602863 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.602767 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:02:09.724759 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.724589 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p"] Apr 28 20:02:09.727492 ip-10-0-139-128 kubenswrapper[2571]: W0428 20:02:09.727450 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e2fe17e_4f41_40c5_b2fd_cf84687a5731.slice/crio-aabb7e05acc7ce1b5a02ac67320443e0779a8762beeae3044a6d08886b8278ec WatchSource:0}: Error finding container aabb7e05acc7ce1b5a02ac67320443e0779a8762beeae3044a6d08886b8278ec: Status 404 returned error can't find the container with id aabb7e05acc7ce1b5a02ac67320443e0779a8762beeae3044a6d08886b8278ec Apr 28 20:02:09.729302 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.729281 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 20:02:09.770630 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.770602 2571 generic.go:358] "Generic (PLEG): container finished" podID="883551c2-5277-41f4-bb94-0872c41c6024" containerID="c688c19c5ccb7f2cbb0e1403c9aba023d18c858d461b7130a4a2eb4bfaf1985c" exitCode=2 Apr 28 20:02:09.770763 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.770674 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" event={"ID":"883551c2-5277-41f4-bb94-0872c41c6024","Type":"ContainerDied","Data":"c688c19c5ccb7f2cbb0e1403c9aba023d18c858d461b7130a4a2eb4bfaf1985c"} Apr 28 20:02:09.771653 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:09.771632 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" 
event={"ID":"2e2fe17e-4f41-40c5-b2fd-cf84687a5731","Type":"ContainerStarted","Data":"aabb7e05acc7ce1b5a02ac67320443e0779a8762beeae3044a6d08886b8278ec"} Apr 28 20:02:10.546629 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:10.546582 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.46:8643/healthz\": dial tcp 10.134.0.46:8643: connect: connection refused" Apr 28 20:02:10.552734 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:10.552701 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 28 20:02:10.776586 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:10.776550 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" event={"ID":"2e2fe17e-4f41-40c5-b2fd-cf84687a5731","Type":"ContainerStarted","Data":"625d9773f7c873a5baa6027fda798a8ce834777338bafbcc7fb71248b73326c2"} Apr 28 20:02:13.709312 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.709286 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:02:13.752649 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.752576 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/883551c2-5277-41f4-bb94-0872c41c6024-proxy-tls\") pod \"883551c2-5277-41f4-bb94-0872c41c6024\" (UID: \"883551c2-5277-41f4-bb94-0872c41c6024\") " Apr 28 20:02:13.752649 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.752617 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/883551c2-5277-41f4-bb94-0872c41c6024-kserve-provision-location\") pod \"883551c2-5277-41f4-bb94-0872c41c6024\" (UID: \"883551c2-5277-41f4-bb94-0872c41c6024\") " Apr 28 20:02:13.752649 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.752637 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/883551c2-5277-41f4-bb94-0872c41c6024-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"883551c2-5277-41f4-bb94-0872c41c6024\" (UID: \"883551c2-5277-41f4-bb94-0872c41c6024\") " Apr 28 20:02:13.752913 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.752674 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htwdw\" (UniqueName: \"kubernetes.io/projected/883551c2-5277-41f4-bb94-0872c41c6024-kube-api-access-htwdw\") pod \"883551c2-5277-41f4-bb94-0872c41c6024\" (UID: \"883551c2-5277-41f4-bb94-0872c41c6024\") " Apr 28 20:02:13.752988 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.752940 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/883551c2-5277-41f4-bb94-0872c41c6024-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"883551c2-5277-41f4-bb94-0872c41c6024" (UID: "883551c2-5277-41f4-bb94-0872c41c6024"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:02:13.753037 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.752980 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/883551c2-5277-41f4-bb94-0872c41c6024-isvc-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-kube-rbac-proxy-sar-config") pod "883551c2-5277-41f4-bb94-0872c41c6024" (UID: "883551c2-5277-41f4-bb94-0872c41c6024"). InnerVolumeSpecName "isvc-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:02:13.754701 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.754676 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/883551c2-5277-41f4-bb94-0872c41c6024-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "883551c2-5277-41f4-bb94-0872c41c6024" (UID: "883551c2-5277-41f4-bb94-0872c41c6024"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:02:13.754802 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.754717 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/883551c2-5277-41f4-bb94-0872c41c6024-kube-api-access-htwdw" (OuterVolumeSpecName: "kube-api-access-htwdw") pod "883551c2-5277-41f4-bb94-0872c41c6024" (UID: "883551c2-5277-41f4-bb94-0872c41c6024"). InnerVolumeSpecName "kube-api-access-htwdw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:02:13.787272 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.787245 2571 generic.go:358] "Generic (PLEG): container finished" podID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerID="625d9773f7c873a5baa6027fda798a8ce834777338bafbcc7fb71248b73326c2" exitCode=0 Apr 28 20:02:13.787427 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.787323 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" event={"ID":"2e2fe17e-4f41-40c5-b2fd-cf84687a5731","Type":"ContainerDied","Data":"625d9773f7c873a5baa6027fda798a8ce834777338bafbcc7fb71248b73326c2"} Apr 28 20:02:13.789016 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.788995 2571 generic.go:358] "Generic (PLEG): container finished" podID="883551c2-5277-41f4-bb94-0872c41c6024" containerID="0308f76362b457b37f8ac7ffcc86c015791cc480a1b6fc00a9604689dbd45a62" exitCode=0 Apr 28 20:02:13.789111 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.789024 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" event={"ID":"883551c2-5277-41f4-bb94-0872c41c6024","Type":"ContainerDied","Data":"0308f76362b457b37f8ac7ffcc86c015791cc480a1b6fc00a9604689dbd45a62"} Apr 28 20:02:13.789111 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.789042 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" event={"ID":"883551c2-5277-41f4-bb94-0872c41c6024","Type":"ContainerDied","Data":"36ef917400d911b427e53eaedfff9c9036d476580232f0af07bf396dd78f0783"} Apr 28 20:02:13.789111 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.789056 2571 scope.go:117] "RemoveContainer" containerID="c688c19c5ccb7f2cbb0e1403c9aba023d18c858d461b7130a4a2eb4bfaf1985c" Apr 28 20:02:13.789111 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.789063 2571 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5" Apr 28 20:02:13.797614 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.797563 2571 scope.go:117] "RemoveContainer" containerID="0308f76362b457b37f8ac7ffcc86c015791cc480a1b6fc00a9604689dbd45a62" Apr 28 20:02:13.804544 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.804521 2571 scope.go:117] "RemoveContainer" containerID="97dd3e6636691625d3af6efecbe6790b7668dc15358d39b37a0211028ab37398" Apr 28 20:02:13.812526 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.812508 2571 scope.go:117] "RemoveContainer" containerID="c688c19c5ccb7f2cbb0e1403c9aba023d18c858d461b7130a4a2eb4bfaf1985c" Apr 28 20:02:13.812795 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:02:13.812775 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c688c19c5ccb7f2cbb0e1403c9aba023d18c858d461b7130a4a2eb4bfaf1985c\": container with ID starting with c688c19c5ccb7f2cbb0e1403c9aba023d18c858d461b7130a4a2eb4bfaf1985c not found: ID does not exist" containerID="c688c19c5ccb7f2cbb0e1403c9aba023d18c858d461b7130a4a2eb4bfaf1985c" Apr 28 20:02:13.812842 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.812804 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c688c19c5ccb7f2cbb0e1403c9aba023d18c858d461b7130a4a2eb4bfaf1985c"} err="failed to get container status \"c688c19c5ccb7f2cbb0e1403c9aba023d18c858d461b7130a4a2eb4bfaf1985c\": rpc error: code = NotFound desc = could not find container \"c688c19c5ccb7f2cbb0e1403c9aba023d18c858d461b7130a4a2eb4bfaf1985c\": container with ID starting with c688c19c5ccb7f2cbb0e1403c9aba023d18c858d461b7130a4a2eb4bfaf1985c not found: ID does not exist" Apr 28 20:02:13.812842 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.812821 2571 scope.go:117] "RemoveContainer" containerID="0308f76362b457b37f8ac7ffcc86c015791cc480a1b6fc00a9604689dbd45a62" Apr 
28 20:02:13.813043 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:02:13.813025 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0308f76362b457b37f8ac7ffcc86c015791cc480a1b6fc00a9604689dbd45a62\": container with ID starting with 0308f76362b457b37f8ac7ffcc86c015791cc480a1b6fc00a9604689dbd45a62 not found: ID does not exist" containerID="0308f76362b457b37f8ac7ffcc86c015791cc480a1b6fc00a9604689dbd45a62" Apr 28 20:02:13.813101 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.813052 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0308f76362b457b37f8ac7ffcc86c015791cc480a1b6fc00a9604689dbd45a62"} err="failed to get container status \"0308f76362b457b37f8ac7ffcc86c015791cc480a1b6fc00a9604689dbd45a62\": rpc error: code = NotFound desc = could not find container \"0308f76362b457b37f8ac7ffcc86c015791cc480a1b6fc00a9604689dbd45a62\": container with ID starting with 0308f76362b457b37f8ac7ffcc86c015791cc480a1b6fc00a9604689dbd45a62 not found: ID does not exist" Apr 28 20:02:13.813101 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.813073 2571 scope.go:117] "RemoveContainer" containerID="97dd3e6636691625d3af6efecbe6790b7668dc15358d39b37a0211028ab37398" Apr 28 20:02:13.813273 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:02:13.813258 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97dd3e6636691625d3af6efecbe6790b7668dc15358d39b37a0211028ab37398\": container with ID starting with 97dd3e6636691625d3af6efecbe6790b7668dc15358d39b37a0211028ab37398 not found: ID does not exist" containerID="97dd3e6636691625d3af6efecbe6790b7668dc15358d39b37a0211028ab37398" Apr 28 20:02:13.813319 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.813276 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"97dd3e6636691625d3af6efecbe6790b7668dc15358d39b37a0211028ab37398"} err="failed to get container status \"97dd3e6636691625d3af6efecbe6790b7668dc15358d39b37a0211028ab37398\": rpc error: code = NotFound desc = could not find container \"97dd3e6636691625d3af6efecbe6790b7668dc15358d39b37a0211028ab37398\": container with ID starting with 97dd3e6636691625d3af6efecbe6790b7668dc15358d39b37a0211028ab37398 not found: ID does not exist" Apr 28 20:02:13.817603 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.817580 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5"] Apr 28 20:02:13.821191 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.821169 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-64fcb8589f-wgmk5"] Apr 28 20:02:13.853205 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.853184 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/883551c2-5277-41f4-bb94-0872c41c6024-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:02:13.853298 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.853210 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/883551c2-5277-41f4-bb94-0872c41c6024-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:02:13.853298 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.853226 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/883551c2-5277-41f4-bb94-0872c41c6024-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:02:13.853298 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.853243 2571 reconciler_common.go:299] "Volume 
detached for volume \"kube-api-access-htwdw\" (UniqueName: \"kubernetes.io/projected/883551c2-5277-41f4-bb94-0872c41c6024-kube-api-access-htwdw\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:02:13.947738 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:13.947708 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="883551c2-5277-41f4-bb94-0872c41c6024" path="/var/lib/kubelet/pods/883551c2-5277-41f4-bb94-0872c41c6024/volumes" Apr 28 20:02:14.794304 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:14.794270 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" event={"ID":"2e2fe17e-4f41-40c5-b2fd-cf84687a5731","Type":"ContainerStarted","Data":"8f872a9088f6b81ecdc492d6e59bc5e9f815e688744cc8400fc053ab099ee5b3"} Apr 28 20:02:14.794699 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:14.794313 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" event={"ID":"2e2fe17e-4f41-40c5-b2fd-cf84687a5731","Type":"ContainerStarted","Data":"a53055097d97da03b4a416731cbe63e127aa91217bda7cfcad777edd2314dcf6"} Apr 28 20:02:14.794699 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:14.794599 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:02:14.794816 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:14.794726 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:02:14.795758 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:14.795733 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.47:8080: connect: connection refused" Apr 28 20:02:14.815050 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:14.815015 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" podStartSLOduration=5.815003574 podStartE2EDuration="5.815003574s" podCreationTimestamp="2026-04-28 20:02:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:02:14.814798879 +0000 UTC m=+2751.445699825" watchObservedRunningTime="2026-04-28 20:02:14.815003574 +0000 UTC m=+2751.445904519" Apr 28 20:02:15.797638 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:15.797601 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 28 20:02:20.802336 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:20.802304 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:02:20.802964 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:20.802926 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 28 20:02:30.803784 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:30.803741 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: 
connect: connection refused" Apr 28 20:02:40.802991 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:40.802903 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 28 20:02:50.803697 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:02:50.803658 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 28 20:03:00.803817 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:00.803778 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 28 20:03:10.803823 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:10.803780 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 28 20:03:20.804268 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:20.804236 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:03:29.281249 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:29.281212 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p"] Apr 28 
20:03:29.281733 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:29.281638 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kserve-container" containerID="cri-o://a53055097d97da03b4a416731cbe63e127aa91217bda7cfcad777edd2314dcf6" gracePeriod=30 Apr 28 20:03:29.281811 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:29.281688 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kube-rbac-proxy" containerID="cri-o://8f872a9088f6b81ecdc492d6e59bc5e9f815e688744cc8400fc053ab099ee5b3" gracePeriod=30 Apr 28 20:03:30.010433 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:30.010399 2571 generic.go:358] "Generic (PLEG): container finished" podID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerID="8f872a9088f6b81ecdc492d6e59bc5e9f815e688744cc8400fc053ab099ee5b3" exitCode=2 Apr 28 20:03:30.010631 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:30.010439 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" event={"ID":"2e2fe17e-4f41-40c5-b2fd-cf84687a5731","Type":"ContainerDied","Data":"8f872a9088f6b81ecdc492d6e59bc5e9f815e688744cc8400fc053ab099ee5b3"} Apr 28 20:03:30.798232 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:30.798190 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.47:8643/healthz\": dial tcp 10.134.0.47:8643: connect: connection refused" Apr 28 20:03:30.803236 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:30.803209 2571 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 28 20:03:33.926509 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:33.926467 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:03:34.002791 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.002707 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\" (UID: \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\") " Apr 28 20:03:34.002936 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.002794 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v56fh\" (UniqueName: \"kubernetes.io/projected/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-kube-api-access-v56fh\") pod \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\" (UID: \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\") " Apr 28 20:03:34.002936 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.002822 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-proxy-tls\") pod \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\" (UID: \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\") " Apr 28 20:03:34.002936 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.002856 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-kserve-provision-location\") pod 
\"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\" (UID: \"2e2fe17e-4f41-40c5-b2fd-cf84687a5731\") " Apr 28 20:03:34.003239 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.003208 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config") pod "2e2fe17e-4f41-40c5-b2fd-cf84687a5731" (UID: "2e2fe17e-4f41-40c5-b2fd-cf84687a5731"). InnerVolumeSpecName "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:03:34.003308 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.003215 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2e2fe17e-4f41-40c5-b2fd-cf84687a5731" (UID: "2e2fe17e-4f41-40c5-b2fd-cf84687a5731"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:03:34.004936 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.004909 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2e2fe17e-4f41-40c5-b2fd-cf84687a5731" (UID: "2e2fe17e-4f41-40c5-b2fd-cf84687a5731"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:03:34.005033 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.004993 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-kube-api-access-v56fh" (OuterVolumeSpecName: "kube-api-access-v56fh") pod "2e2fe17e-4f41-40c5-b2fd-cf84687a5731" (UID: "2e2fe17e-4f41-40c5-b2fd-cf84687a5731"). 
InnerVolumeSpecName "kube-api-access-v56fh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:03:34.024049 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.024021 2571 generic.go:358] "Generic (PLEG): container finished" podID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerID="a53055097d97da03b4a416731cbe63e127aa91217bda7cfcad777edd2314dcf6" exitCode=0 Apr 28 20:03:34.024168 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.024079 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" event={"ID":"2e2fe17e-4f41-40c5-b2fd-cf84687a5731","Type":"ContainerDied","Data":"a53055097d97da03b4a416731cbe63e127aa91217bda7cfcad777edd2314dcf6"} Apr 28 20:03:34.024168 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.024113 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" event={"ID":"2e2fe17e-4f41-40c5-b2fd-cf84687a5731","Type":"ContainerDied","Data":"aabb7e05acc7ce1b5a02ac67320443e0779a8762beeae3044a6d08886b8278ec"} Apr 28 20:03:34.024168 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.024121 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p" Apr 28 20:03:34.024168 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.024133 2571 scope.go:117] "RemoveContainer" containerID="8f872a9088f6b81ecdc492d6e59bc5e9f815e688744cc8400fc053ab099ee5b3" Apr 28 20:03:34.032350 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.032332 2571 scope.go:117] "RemoveContainer" containerID="a53055097d97da03b4a416731cbe63e127aa91217bda7cfcad777edd2314dcf6" Apr 28 20:03:34.039285 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.039267 2571 scope.go:117] "RemoveContainer" containerID="625d9773f7c873a5baa6027fda798a8ce834777338bafbcc7fb71248b73326c2" Apr 28 20:03:34.046272 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.046255 2571 scope.go:117] "RemoveContainer" containerID="8f872a9088f6b81ecdc492d6e59bc5e9f815e688744cc8400fc053ab099ee5b3" Apr 28 20:03:34.046420 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.046401 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p"] Apr 28 20:03:34.046560 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:03:34.046541 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f872a9088f6b81ecdc492d6e59bc5e9f815e688744cc8400fc053ab099ee5b3\": container with ID starting with 8f872a9088f6b81ecdc492d6e59bc5e9f815e688744cc8400fc053ab099ee5b3 not found: ID does not exist" containerID="8f872a9088f6b81ecdc492d6e59bc5e9f815e688744cc8400fc053ab099ee5b3" Apr 28 20:03:34.046617 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.046568 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f872a9088f6b81ecdc492d6e59bc5e9f815e688744cc8400fc053ab099ee5b3"} err="failed to get container status \"8f872a9088f6b81ecdc492d6e59bc5e9f815e688744cc8400fc053ab099ee5b3\": rpc error: code = NotFound desc = could not find 
container \"8f872a9088f6b81ecdc492d6e59bc5e9f815e688744cc8400fc053ab099ee5b3\": container with ID starting with 8f872a9088f6b81ecdc492d6e59bc5e9f815e688744cc8400fc053ab099ee5b3 not found: ID does not exist" Apr 28 20:03:34.046617 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.046586 2571 scope.go:117] "RemoveContainer" containerID="a53055097d97da03b4a416731cbe63e127aa91217bda7cfcad777edd2314dcf6" Apr 28 20:03:34.046838 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:03:34.046822 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a53055097d97da03b4a416731cbe63e127aa91217bda7cfcad777edd2314dcf6\": container with ID starting with a53055097d97da03b4a416731cbe63e127aa91217bda7cfcad777edd2314dcf6 not found: ID does not exist" containerID="a53055097d97da03b4a416731cbe63e127aa91217bda7cfcad777edd2314dcf6" Apr 28 20:03:34.046904 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.046847 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a53055097d97da03b4a416731cbe63e127aa91217bda7cfcad777edd2314dcf6"} err="failed to get container status \"a53055097d97da03b4a416731cbe63e127aa91217bda7cfcad777edd2314dcf6\": rpc error: code = NotFound desc = could not find container \"a53055097d97da03b4a416731cbe63e127aa91217bda7cfcad777edd2314dcf6\": container with ID starting with a53055097d97da03b4a416731cbe63e127aa91217bda7cfcad777edd2314dcf6 not found: ID does not exist" Apr 28 20:03:34.046904 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.046869 2571 scope.go:117] "RemoveContainer" containerID="625d9773f7c873a5baa6027fda798a8ce834777338bafbcc7fb71248b73326c2" Apr 28 20:03:34.047116 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:03:34.047090 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"625d9773f7c873a5baa6027fda798a8ce834777338bafbcc7fb71248b73326c2\": container with ID 
starting with 625d9773f7c873a5baa6027fda798a8ce834777338bafbcc7fb71248b73326c2 not found: ID does not exist" containerID="625d9773f7c873a5baa6027fda798a8ce834777338bafbcc7fb71248b73326c2" Apr 28 20:03:34.047166 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.047111 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625d9773f7c873a5baa6027fda798a8ce834777338bafbcc7fb71248b73326c2"} err="failed to get container status \"625d9773f7c873a5baa6027fda798a8ce834777338bafbcc7fb71248b73326c2\": rpc error: code = NotFound desc = could not find container \"625d9773f7c873a5baa6027fda798a8ce834777338bafbcc7fb71248b73326c2\": container with ID starting with 625d9773f7c873a5baa6027fda798a8ce834777338bafbcc7fb71248b73326c2 not found: ID does not exist" Apr 28 20:03:34.050291 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.050270 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-86d7579fd6-87x9p"] Apr 28 20:03:34.103983 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.103949 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:03:34.103983 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.103980 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v56fh\" (UniqueName: \"kubernetes.io/projected/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-kube-api-access-v56fh\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:03:34.103983 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.103992 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" 
DevicePath \"\"" Apr 28 20:03:34.104220 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:34.104001 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e2fe17e-4f41-40c5-b2fd-cf84687a5731-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:03:35.949180 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:03:35.949146 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" path="/var/lib/kubelet/pods/2e2fe17e-4f41-40c5-b2fd-cf84687a5731/volumes" Apr 28 20:06:24.048915 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:06:24.048889 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 20:06:24.056453 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:06:24.056431 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 20:08:48.441945 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.441856 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"] Apr 28 20:08:48.444263 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.442188 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="storage-initializer" Apr 28 20:08:48.444263 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.442199 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="storage-initializer" Apr 28 20:08:48.444263 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.442212 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="883551c2-5277-41f4-bb94-0872c41c6024" 
containerName="kserve-container"
Apr 28 20:08:48.444263 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.442218 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="kserve-container"
Apr 28 20:08:48.444263 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.442227 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kube-rbac-proxy"
Apr 28 20:08:48.444263 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.442232 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kube-rbac-proxy"
Apr 28 20:08:48.444263 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.442240 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="storage-initializer"
Apr 28 20:08:48.444263 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.442246 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="storage-initializer"
Apr 28 20:08:48.444263 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.442255 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kserve-container"
Apr 28 20:08:48.444263 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.442261 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kserve-container"
Apr 28 20:08:48.444263 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.442268 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="kube-rbac-proxy"
Apr 28 20:08:48.444263 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.442273 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="kube-rbac-proxy"
Apr 28 20:08:48.444263 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.442320 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kserve-container"
Apr 28 20:08:48.444263 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.442329 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="kserve-container"
Apr 28 20:08:48.444263 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.442334 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="883551c2-5277-41f4-bb94-0872c41c6024" containerName="kube-rbac-proxy"
Apr 28 20:08:48.444263 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.442341 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e2fe17e-4f41-40c5-b2fd-cf84687a5731" containerName="kube-rbac-proxy"
Apr 28 20:08:48.445217 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.445202 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:08:48.447356 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.447326 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 28 20:08:48.447356 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.447349 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\""
Apr 28 20:08:48.447931 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.447911 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-predictor-serving-cert\""
Apr 28 20:08:48.448036 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.447914 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 28 20:08:48.448036 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.447948 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-44jch\""
Apr 28 20:08:48.456364 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.456343 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"]
Apr 28 20:08:48.532278 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.532232 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j58m8\" (UniqueName: \"kubernetes.io/projected/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-kube-api-access-j58m8\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2\" (UID: \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:08:48.532514 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.532295 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2\" (UID: \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:08:48.532514 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.532356 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2\" (UID: \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:08:48.532514 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.532437 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2\" (UID: \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:08:48.633588 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.633558 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2\" (UID: \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:08:48.633588 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.633591 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2\" (UID: \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:08:48.633843 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.633621 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2\" (UID: \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:08:48.633843 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.633664 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j58m8\" (UniqueName: \"kubernetes.io/projected/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-kube-api-access-j58m8\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2\" (UID: \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:08:48.634053 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.634030 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2\" (UID: \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:08:48.634219 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.634199 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2\" (UID: \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:08:48.636077 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.636054 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2\" (UID: \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:08:48.641968 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.641947 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j58m8\" (UniqueName: \"kubernetes.io/projected/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-kube-api-access-j58m8\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2\" (UID: \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:08:48.755666 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.755567 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:08:48.877495 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.877445 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"]
Apr 28 20:08:48.880928 ip-10-0-139-128 kubenswrapper[2571]: W0428 20:08:48.880890 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc70ef47_0fc1_4677_a17a_e92cd4c8016a.slice/crio-fc4661ee6ef750f88794e20641ba48979d240fed1afae3005bb8c9c9d4cd3b54 WatchSource:0}: Error finding container fc4661ee6ef750f88794e20641ba48979d240fed1afae3005bb8c9c9d4cd3b54: Status 404 returned error can't find the container with id fc4661ee6ef750f88794e20641ba48979d240fed1afae3005bb8c9c9d4cd3b54
Apr 28 20:08:48.882825 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.882804 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 20:08:48.967436 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.967403 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2" event={"ID":"bc70ef47-0fc1-4677-a17a-e92cd4c8016a","Type":"ContainerStarted","Data":"03c3abfa1dce7e6022179fb2bc9e6f24842c171f04f22ff28bee370670cecfe0"}
Apr 28 20:08:48.967436 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:48.967439 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2" event={"ID":"bc70ef47-0fc1-4677-a17a-e92cd4c8016a","Type":"ContainerStarted","Data":"fc4661ee6ef750f88794e20641ba48979d240fed1afae3005bb8c9c9d4cd3b54"}
Apr 28 20:08:52.980376 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:52.980344 2571 generic.go:358] "Generic (PLEG): container finished" podID="bc70ef47-0fc1-4677-a17a-e92cd4c8016a" containerID="03c3abfa1dce7e6022179fb2bc9e6f24842c171f04f22ff28bee370670cecfe0" exitCode=0
Apr 28 20:08:52.980817 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:52.980406 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2" event={"ID":"bc70ef47-0fc1-4677-a17a-e92cd4c8016a","Type":"ContainerDied","Data":"03c3abfa1dce7e6022179fb2bc9e6f24842c171f04f22ff28bee370670cecfe0"}
Apr 28 20:08:53.985237 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:53.985197 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2" event={"ID":"bc70ef47-0fc1-4677-a17a-e92cd4c8016a","Type":"ContainerStarted","Data":"022ab50c18ad3f2b0f32ef8a151f710373b3d4a6ae09d2b434264d1ea426dfff"}
Apr 28 20:08:53.985237 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:53.985240 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2" event={"ID":"bc70ef47-0fc1-4677-a17a-e92cd4c8016a","Type":"ContainerStarted","Data":"5f6c314cd3bec63e4b28a02739a88adfa0ec470c6b2acd1a593a6c9742b1770b"}
Apr 28 20:08:53.985698 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:53.985534 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:08:53.985698 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:53.985585 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:08:59.994449 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:08:59.994417 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:09:00.016890 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:00.016831 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2" podStartSLOduration=12.016811516 podStartE2EDuration="12.016811516s" podCreationTimestamp="2026-04-28 20:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:08:54.010046147 +0000 UTC m=+3150.640947104" watchObservedRunningTime="2026-04-28 20:09:00.016811516 +0000 UTC m=+3156.647712462"
Apr 28 20:09:29.998404 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:29.998366 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:09:38.475336 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.475300 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"]
Apr 28 20:09:38.475837 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.475630 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2" podUID="bc70ef47-0fc1-4677-a17a-e92cd4c8016a" containerName="kserve-container" containerID="cri-o://5f6c314cd3bec63e4b28a02739a88adfa0ec470c6b2acd1a593a6c9742b1770b" gracePeriod=30
Apr 28 20:09:38.475837 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.475654 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2" podUID="bc70ef47-0fc1-4677-a17a-e92cd4c8016a" containerName="kube-rbac-proxy" containerID="cri-o://022ab50c18ad3f2b0f32ef8a151f710373b3d4a6ae09d2b434264d1ea426dfff" gracePeriod=30
Apr 28 20:09:38.592031 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.591998 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"]
Apr 28 20:09:38.595500 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.595464 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"
Apr 28 20:09:38.597671 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.597650 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\""
Apr 28 20:09:38.597999 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.597973 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-predictor-serving-cert\""
Apr 28 20:09:38.607083 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.607059 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"]
Apr 28 20:09:38.652460 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.652431 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59778918-1296-4825-8193-240d53917f27-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4hvs\" (UID: \"59778918-1296-4825-8193-240d53917f27\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"
Apr 28 20:09:38.652601 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.652476 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59778918-1296-4825-8193-240d53917f27-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4hvs\" (UID: \"59778918-1296-4825-8193-240d53917f27\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"
Apr 28 20:09:38.652601 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.652594 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59778918-1296-4825-8193-240d53917f27-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4hvs\" (UID: \"59778918-1296-4825-8193-240d53917f27\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"
Apr 28 20:09:38.652703 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.652635 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5n9q\" (UniqueName: \"kubernetes.io/projected/59778918-1296-4825-8193-240d53917f27-kube-api-access-s5n9q\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4hvs\" (UID: \"59778918-1296-4825-8193-240d53917f27\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"
Apr 28 20:09:38.753472 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.753390 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59778918-1296-4825-8193-240d53917f27-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4hvs\" (UID: \"59778918-1296-4825-8193-240d53917f27\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"
Apr 28 20:09:38.753472 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.753437 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5n9q\" (UniqueName: \"kubernetes.io/projected/59778918-1296-4825-8193-240d53917f27-kube-api-access-s5n9q\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4hvs\" (UID: \"59778918-1296-4825-8193-240d53917f27\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"
Apr 28 20:09:38.753674 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.753564 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59778918-1296-4825-8193-240d53917f27-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4hvs\" (UID: \"59778918-1296-4825-8193-240d53917f27\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"
Apr 28 20:09:38.753674 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.753631 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59778918-1296-4825-8193-240d53917f27-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4hvs\" (UID: \"59778918-1296-4825-8193-240d53917f27\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"
Apr 28 20:09:38.753886 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.753867 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59778918-1296-4825-8193-240d53917f27-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4hvs\" (UID: \"59778918-1296-4825-8193-240d53917f27\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"
Apr 28 20:09:38.754215 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.754196 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59778918-1296-4825-8193-240d53917f27-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4hvs\" (UID: \"59778918-1296-4825-8193-240d53917f27\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"
Apr 28 20:09:38.755890 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.755869 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59778918-1296-4825-8193-240d53917f27-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4hvs\" (UID: \"59778918-1296-4825-8193-240d53917f27\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"
Apr 28 20:09:38.761861 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.761837 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5n9q\" (UniqueName: \"kubernetes.io/projected/59778918-1296-4825-8193-240d53917f27-kube-api-access-s5n9q\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4hvs\" (UID: \"59778918-1296-4825-8193-240d53917f27\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"
Apr 28 20:09:38.905153 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:38.905105 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"
Apr 28 20:09:39.032147 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:39.032099 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"]
Apr 28 20:09:39.034440 ip-10-0-139-128 kubenswrapper[2571]: W0428 20:09:39.034410 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59778918_1296_4825_8193_240d53917f27.slice/crio-920d016f58e396ae5c71ca0d1a71dea339c2ef607592275f924d54be33eac713 WatchSource:0}: Error finding container 920d016f58e396ae5c71ca0d1a71dea339c2ef607592275f924d54be33eac713: Status 404 returned error can't find the container with id 920d016f58e396ae5c71ca0d1a71dea339c2ef607592275f924d54be33eac713
Apr 28 20:09:39.122298 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:39.122260 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs" event={"ID":"59778918-1296-4825-8193-240d53917f27","Type":"ContainerStarted","Data":"ca2ddd2b2e13c2cd7ffd95c469dfd2e7e978e7b4628fc665769f9f2dde91ca64"}
Apr 28 20:09:39.122298 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:39.122299 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs" event={"ID":"59778918-1296-4825-8193-240d53917f27","Type":"ContainerStarted","Data":"920d016f58e396ae5c71ca0d1a71dea339c2ef607592275f924d54be33eac713"}
Apr 28 20:09:39.124307 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:39.124281 2571 generic.go:358] "Generic (PLEG): container finished" podID="bc70ef47-0fc1-4677-a17a-e92cd4c8016a" containerID="022ab50c18ad3f2b0f32ef8a151f710373b3d4a6ae09d2b434264d1ea426dfff" exitCode=2
Apr 28 20:09:39.124421 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:39.124332 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2" event={"ID":"bc70ef47-0fc1-4677-a17a-e92cd4c8016a","Type":"ContainerDied","Data":"022ab50c18ad3f2b0f32ef8a151f710373b3d4a6ae09d2b434264d1ea426dfff"}
Apr 28 20:09:39.989747 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:39.989703 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2" podUID="bc70ef47-0fc1-4677-a17a-e92cd4c8016a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.48:8643/healthz\": dial tcp 10.134.0.48:8643: connect: connection refused"
Apr 28 20:09:43.138633 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:43.138541 2571 generic.go:358] "Generic (PLEG): container finished" podID="59778918-1296-4825-8193-240d53917f27" containerID="ca2ddd2b2e13c2cd7ffd95c469dfd2e7e978e7b4628fc665769f9f2dde91ca64" exitCode=0
Apr 28 20:09:43.138633 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:43.138603 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs" event={"ID":"59778918-1296-4825-8193-240d53917f27","Type":"ContainerDied","Data":"ca2ddd2b2e13c2cd7ffd95c469dfd2e7e978e7b4628fc665769f9f2dde91ca64"}
Apr 28 20:09:44.143297 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:44.143259 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs" event={"ID":"59778918-1296-4825-8193-240d53917f27","Type":"ContainerStarted","Data":"2ce4bb9142de2bce99450b5d057e19301123ea56a900c04b5dd108d78c15169a"}
Apr 28 20:09:44.143297 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:44.143304 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs" event={"ID":"59778918-1296-4825-8193-240d53917f27","Type":"ContainerStarted","Data":"8d6f65e4348076fd30a7f037e6dd0f85ba8ed77f27cdc8b2e1b6af9e4e397f4b"}
Apr 28 20:09:44.143812 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:44.143527 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"
Apr 28 20:09:44.143812 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:44.143557 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"
Apr 28 20:09:44.164027 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:44.163969 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs" podStartSLOduration=6.163951466 podStartE2EDuration="6.163951466s" podCreationTimestamp="2026-04-28 20:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:09:44.160994945 +0000 UTC m=+3200.791895890" watchObservedRunningTime="2026-04-28 20:09:44.163951466 +0000 UTC m=+3200.794852412"
Apr 28 20:09:44.989711 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:44.989662 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2" podUID="bc70ef47-0fc1-4677-a17a-e92cd4c8016a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.48:8643/healthz\": dial tcp 10.134.0.48:8643: connect: connection refused"
Apr 28 20:09:45.313683 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:45.313657 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:09:45.410953 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:45.410920 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-kserve-provision-location\") pod \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\" (UID: \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\") "
Apr 28 20:09:45.411130 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:45.410961 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-proxy-tls\") pod \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\" (UID: \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\") "
Apr 28 20:09:45.411130 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:45.410989 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j58m8\" (UniqueName: \"kubernetes.io/projected/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-kube-api-access-j58m8\") pod \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\" (UID: \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\") "
Apr 28 20:09:45.411130 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:45.411036 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\" (UID: \"bc70ef47-0fc1-4677-a17a-e92cd4c8016a\") "
Apr 28 20:09:45.411314 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:45.411258 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bc70ef47-0fc1-4677-a17a-e92cd4c8016a" (UID: "bc70ef47-0fc1-4677-a17a-e92cd4c8016a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:09:45.411505 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:45.411452 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "bc70ef47-0fc1-4677-a17a-e92cd4c8016a" (UID: "bc70ef47-0fc1-4677-a17a-e92cd4c8016a"). InnerVolumeSpecName "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 20:09:45.413122 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:45.413097 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-kube-api-access-j58m8" (OuterVolumeSpecName: "kube-api-access-j58m8") pod "bc70ef47-0fc1-4677-a17a-e92cd4c8016a" (UID: "bc70ef47-0fc1-4677-a17a-e92cd4c8016a"). InnerVolumeSpecName "kube-api-access-j58m8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 20:09:45.413222 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:45.413172 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bc70ef47-0fc1-4677-a17a-e92cd4c8016a" (UID: "bc70ef47-0fc1-4677-a17a-e92cd4c8016a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 20:09:45.512363 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:45.512274 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 20:09:45.512363 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:45.512304 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 20:09:45.512363 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:45.512318 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j58m8\" (UniqueName: \"kubernetes.io/projected/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-kube-api-access-j58m8\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 20:09:45.512363 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:45.512331 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bc70ef47-0fc1-4677-a17a-e92cd4c8016a-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\""
Apr 28 20:09:46.150548 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:46.150512 2571 generic.go:358] "Generic (PLEG): container finished" podID="bc70ef47-0fc1-4677-a17a-e92cd4c8016a" containerID="5f6c314cd3bec63e4b28a02739a88adfa0ec470c6b2acd1a593a6c9742b1770b" exitCode=0
Apr 28 20:09:46.150739 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:46.150641 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"
Apr 28 20:09:46.150739 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:46.150634 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2" event={"ID":"bc70ef47-0fc1-4677-a17a-e92cd4c8016a","Type":"ContainerDied","Data":"5f6c314cd3bec63e4b28a02739a88adfa0ec470c6b2acd1a593a6c9742b1770b"}
Apr 28 20:09:46.150828 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:46.150746 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2" event={"ID":"bc70ef47-0fc1-4677-a17a-e92cd4c8016a","Type":"ContainerDied","Data":"fc4661ee6ef750f88794e20641ba48979d240fed1afae3005bb8c9c9d4cd3b54"}
Apr 28 20:09:46.150828 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:46.150763 2571 scope.go:117] "RemoveContainer" containerID="022ab50c18ad3f2b0f32ef8a151f710373b3d4a6ae09d2b434264d1ea426dfff"
Apr 28 20:09:46.158670 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:46.158654 2571 scope.go:117] "RemoveContainer" containerID="5f6c314cd3bec63e4b28a02739a88adfa0ec470c6b2acd1a593a6c9742b1770b"
Apr 28 20:09:46.166051 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:46.166026 2571 scope.go:117] "RemoveContainer" containerID="03c3abfa1dce7e6022179fb2bc9e6f24842c171f04f22ff28bee370670cecfe0"
Apr 28 20:09:46.167079 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:46.167060 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"]
Apr 28 20:09:46.170707 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:46.170684 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-cq8m2"]
Apr 28 20:09:46.173726 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:46.173708 2571 scope.go:117] "RemoveContainer" containerID="022ab50c18ad3f2b0f32ef8a151f710373b3d4a6ae09d2b434264d1ea426dfff"
Apr 28 20:09:46.173968 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:09:46.173949 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"022ab50c18ad3f2b0f32ef8a151f710373b3d4a6ae09d2b434264d1ea426dfff\": container with ID starting with 022ab50c18ad3f2b0f32ef8a151f710373b3d4a6ae09d2b434264d1ea426dfff not found: ID does not exist" containerID="022ab50c18ad3f2b0f32ef8a151f710373b3d4a6ae09d2b434264d1ea426dfff"
Apr 28 20:09:46.174024 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:46.173984 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"022ab50c18ad3f2b0f32ef8a151f710373b3d4a6ae09d2b434264d1ea426dfff"} err="failed to get container status \"022ab50c18ad3f2b0f32ef8a151f710373b3d4a6ae09d2b434264d1ea426dfff\": rpc error: code = NotFound desc = could not find container \"022ab50c18ad3f2b0f32ef8a151f710373b3d4a6ae09d2b434264d1ea426dfff\": container with ID starting with 022ab50c18ad3f2b0f32ef8a151f710373b3d4a6ae09d2b434264d1ea426dfff not found: ID does not exist"
Apr 28 20:09:46.174024 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:46.174003 2571 scope.go:117] "RemoveContainer" containerID="5f6c314cd3bec63e4b28a02739a88adfa0ec470c6b2acd1a593a6c9742b1770b"
Apr 28 20:09:46.174253 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:09:46.174234 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f6c314cd3bec63e4b28a02739a88adfa0ec470c6b2acd1a593a6c9742b1770b\": container with ID starting with 5f6c314cd3bec63e4b28a02739a88adfa0ec470c6b2acd1a593a6c9742b1770b not
found: ID does not exist" containerID="5f6c314cd3bec63e4b28a02739a88adfa0ec470c6b2acd1a593a6c9742b1770b" Apr 28 20:09:46.174303 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:46.174259 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f6c314cd3bec63e4b28a02739a88adfa0ec470c6b2acd1a593a6c9742b1770b"} err="failed to get container status \"5f6c314cd3bec63e4b28a02739a88adfa0ec470c6b2acd1a593a6c9742b1770b\": rpc error: code = NotFound desc = could not find container \"5f6c314cd3bec63e4b28a02739a88adfa0ec470c6b2acd1a593a6c9742b1770b\": container with ID starting with 5f6c314cd3bec63e4b28a02739a88adfa0ec470c6b2acd1a593a6c9742b1770b not found: ID does not exist" Apr 28 20:09:46.174303 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:46.174276 2571 scope.go:117] "RemoveContainer" containerID="03c3abfa1dce7e6022179fb2bc9e6f24842c171f04f22ff28bee370670cecfe0" Apr 28 20:09:46.174504 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:09:46.174473 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03c3abfa1dce7e6022179fb2bc9e6f24842c171f04f22ff28bee370670cecfe0\": container with ID starting with 03c3abfa1dce7e6022179fb2bc9e6f24842c171f04f22ff28bee370670cecfe0 not found: ID does not exist" containerID="03c3abfa1dce7e6022179fb2bc9e6f24842c171f04f22ff28bee370670cecfe0" Apr 28 20:09:46.174556 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:46.174507 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03c3abfa1dce7e6022179fb2bc9e6f24842c171f04f22ff28bee370670cecfe0"} err="failed to get container status \"03c3abfa1dce7e6022179fb2bc9e6f24842c171f04f22ff28bee370670cecfe0\": rpc error: code = NotFound desc = could not find container \"03c3abfa1dce7e6022179fb2bc9e6f24842c171f04f22ff28bee370670cecfe0\": container with ID starting with 03c3abfa1dce7e6022179fb2bc9e6f24842c171f04f22ff28bee370670cecfe0 not found: ID does 
not exist" Apr 28 20:09:47.946644 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:47.946609 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc70ef47-0fc1-4677-a17a-e92cd4c8016a" path="/var/lib/kubelet/pods/bc70ef47-0fc1-4677-a17a-e92cd4c8016a/volumes" Apr 28 20:09:50.152541 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:09:50.152510 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs" Apr 28 20:10:20.157130 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:20.157053 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs" Apr 28 20:10:28.527910 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:28.527874 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"] Apr 28 20:10:28.528404 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:28.528207 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs" podUID="59778918-1296-4825-8193-240d53917f27" containerName="kserve-container" containerID="cri-o://8d6f65e4348076fd30a7f037e6dd0f85ba8ed77f27cdc8b2e1b6af9e4e397f4b" gracePeriod=30 Apr 28 20:10:28.528404 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:28.528259 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs" podUID="59778918-1296-4825-8193-240d53917f27" containerName="kube-rbac-proxy" containerID="cri-o://2ce4bb9142de2bce99450b5d057e19301123ea56a900c04b5dd108d78c15169a" gracePeriod=30 Apr 28 20:10:29.280665 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:29.280625 2571 generic.go:358] "Generic (PLEG): container finished" podID="59778918-1296-4825-8193-240d53917f27" 
containerID="2ce4bb9142de2bce99450b5d057e19301123ea56a900c04b5dd108d78c15169a" exitCode=2 Apr 28 20:10:29.280842 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:29.280683 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs" event={"ID":"59778918-1296-4825-8193-240d53917f27","Type":"ContainerDied","Data":"2ce4bb9142de2bce99450b5d057e19301123ea56a900c04b5dd108d78c15169a"} Apr 28 20:10:30.147719 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:30.147680 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs" podUID="59778918-1296-4825-8193-240d53917f27" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.49:8643/healthz\": dial tcp 10.134.0.49:8643: connect: connection refused" Apr 28 20:10:35.147702 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.147661 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs" podUID="59778918-1296-4825-8193-240d53917f27" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.49:8643/healthz\": dial tcp 10.134.0.49:8643: connect: connection refused" Apr 28 20:10:35.261453 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.261430 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs" Apr 28 20:10:35.300424 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.300389 2571 generic.go:358] "Generic (PLEG): container finished" podID="59778918-1296-4825-8193-240d53917f27" containerID="8d6f65e4348076fd30a7f037e6dd0f85ba8ed77f27cdc8b2e1b6af9e4e397f4b" exitCode=0 Apr 28 20:10:35.300607 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.300468 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs" Apr 28 20:10:35.300607 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.300475 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs" event={"ID":"59778918-1296-4825-8193-240d53917f27","Type":"ContainerDied","Data":"8d6f65e4348076fd30a7f037e6dd0f85ba8ed77f27cdc8b2e1b6af9e4e397f4b"} Apr 28 20:10:35.300607 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.300532 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs" event={"ID":"59778918-1296-4825-8193-240d53917f27","Type":"ContainerDied","Data":"920d016f58e396ae5c71ca0d1a71dea339c2ef607592275f924d54be33eac713"} Apr 28 20:10:35.300607 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.300556 2571 scope.go:117] "RemoveContainer" containerID="2ce4bb9142de2bce99450b5d057e19301123ea56a900c04b5dd108d78c15169a" Apr 28 20:10:35.308803 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.308787 2571 scope.go:117] "RemoveContainer" containerID="8d6f65e4348076fd30a7f037e6dd0f85ba8ed77f27cdc8b2e1b6af9e4e397f4b" Apr 28 20:10:35.315911 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.315895 2571 scope.go:117] "RemoveContainer" containerID="ca2ddd2b2e13c2cd7ffd95c469dfd2e7e978e7b4628fc665769f9f2dde91ca64" Apr 28 20:10:35.322767 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.322750 2571 scope.go:117] "RemoveContainer" containerID="2ce4bb9142de2bce99450b5d057e19301123ea56a900c04b5dd108d78c15169a" Apr 28 20:10:35.323027 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:10:35.323004 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ce4bb9142de2bce99450b5d057e19301123ea56a900c04b5dd108d78c15169a\": container with ID starting with 2ce4bb9142de2bce99450b5d057e19301123ea56a900c04b5dd108d78c15169a not found: ID does 
not exist" containerID="2ce4bb9142de2bce99450b5d057e19301123ea56a900c04b5dd108d78c15169a" Apr 28 20:10:35.323075 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.323036 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce4bb9142de2bce99450b5d057e19301123ea56a900c04b5dd108d78c15169a"} err="failed to get container status \"2ce4bb9142de2bce99450b5d057e19301123ea56a900c04b5dd108d78c15169a\": rpc error: code = NotFound desc = could not find container \"2ce4bb9142de2bce99450b5d057e19301123ea56a900c04b5dd108d78c15169a\": container with ID starting with 2ce4bb9142de2bce99450b5d057e19301123ea56a900c04b5dd108d78c15169a not found: ID does not exist" Apr 28 20:10:35.323075 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.323055 2571 scope.go:117] "RemoveContainer" containerID="8d6f65e4348076fd30a7f037e6dd0f85ba8ed77f27cdc8b2e1b6af9e4e397f4b" Apr 28 20:10:35.323263 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:10:35.323249 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d6f65e4348076fd30a7f037e6dd0f85ba8ed77f27cdc8b2e1b6af9e4e397f4b\": container with ID starting with 8d6f65e4348076fd30a7f037e6dd0f85ba8ed77f27cdc8b2e1b6af9e4e397f4b not found: ID does not exist" containerID="8d6f65e4348076fd30a7f037e6dd0f85ba8ed77f27cdc8b2e1b6af9e4e397f4b" Apr 28 20:10:35.323311 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.323267 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d6f65e4348076fd30a7f037e6dd0f85ba8ed77f27cdc8b2e1b6af9e4e397f4b"} err="failed to get container status \"8d6f65e4348076fd30a7f037e6dd0f85ba8ed77f27cdc8b2e1b6af9e4e397f4b\": rpc error: code = NotFound desc = could not find container \"8d6f65e4348076fd30a7f037e6dd0f85ba8ed77f27cdc8b2e1b6af9e4e397f4b\": container with ID starting with 8d6f65e4348076fd30a7f037e6dd0f85ba8ed77f27cdc8b2e1b6af9e4e397f4b not found: ID does not exist" Apr 
28 20:10:35.323311 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.323280 2571 scope.go:117] "RemoveContainer" containerID="ca2ddd2b2e13c2cd7ffd95c469dfd2e7e978e7b4628fc665769f9f2dde91ca64" Apr 28 20:10:35.323463 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:10:35.323449 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca2ddd2b2e13c2cd7ffd95c469dfd2e7e978e7b4628fc665769f9f2dde91ca64\": container with ID starting with ca2ddd2b2e13c2cd7ffd95c469dfd2e7e978e7b4628fc665769f9f2dde91ca64 not found: ID does not exist" containerID="ca2ddd2b2e13c2cd7ffd95c469dfd2e7e978e7b4628fc665769f9f2dde91ca64" Apr 28 20:10:35.323544 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.323467 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca2ddd2b2e13c2cd7ffd95c469dfd2e7e978e7b4628fc665769f9f2dde91ca64"} err="failed to get container status \"ca2ddd2b2e13c2cd7ffd95c469dfd2e7e978e7b4628fc665769f9f2dde91ca64\": rpc error: code = NotFound desc = could not find container \"ca2ddd2b2e13c2cd7ffd95c469dfd2e7e978e7b4628fc665769f9f2dde91ca64\": container with ID starting with ca2ddd2b2e13c2cd7ffd95c469dfd2e7e978e7b4628fc665769f9f2dde91ca64 not found: ID does not exist" Apr 28 20:10:35.326686 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.326668 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59778918-1296-4825-8193-240d53917f27-kserve-provision-location\") pod \"59778918-1296-4825-8193-240d53917f27\" (UID: \"59778918-1296-4825-8193-240d53917f27\") " Apr 28 20:10:35.326772 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.326738 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5n9q\" (UniqueName: \"kubernetes.io/projected/59778918-1296-4825-8193-240d53917f27-kube-api-access-s5n9q\") pod 
\"59778918-1296-4825-8193-240d53917f27\" (UID: \"59778918-1296-4825-8193-240d53917f27\") " Apr 28 20:10:35.326835 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.326779 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59778918-1296-4825-8193-240d53917f27-proxy-tls\") pod \"59778918-1296-4825-8193-240d53917f27\" (UID: \"59778918-1296-4825-8193-240d53917f27\") " Apr 28 20:10:35.326891 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.326829 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59778918-1296-4825-8193-240d53917f27-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"59778918-1296-4825-8193-240d53917f27\" (UID: \"59778918-1296-4825-8193-240d53917f27\") " Apr 28 20:10:35.326979 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.326955 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59778918-1296-4825-8193-240d53917f27-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "59778918-1296-4825-8193-240d53917f27" (UID: "59778918-1296-4825-8193-240d53917f27"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:10:35.327157 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.327070 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59778918-1296-4825-8193-240d53917f27-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:10:35.327232 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.327206 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59778918-1296-4825-8193-240d53917f27-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "59778918-1296-4825-8193-240d53917f27" (UID: "59778918-1296-4825-8193-240d53917f27"). InnerVolumeSpecName "xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:10:35.328707 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.328685 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59778918-1296-4825-8193-240d53917f27-kube-api-access-s5n9q" (OuterVolumeSpecName: "kube-api-access-s5n9q") pod "59778918-1296-4825-8193-240d53917f27" (UID: "59778918-1296-4825-8193-240d53917f27"). InnerVolumeSpecName "kube-api-access-s5n9q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:10:35.328840 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.328817 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59778918-1296-4825-8193-240d53917f27-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "59778918-1296-4825-8193-240d53917f27" (UID: "59778918-1296-4825-8193-240d53917f27"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:10:35.428350 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.428320 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s5n9q\" (UniqueName: \"kubernetes.io/projected/59778918-1296-4825-8193-240d53917f27-kube-api-access-s5n9q\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:10:35.428350 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.428349 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59778918-1296-4825-8193-240d53917f27-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:10:35.428568 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.428360 2571 reconciler_common.go:299] "Volume detached for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59778918-1296-4825-8193-240d53917f27-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:10:35.622662 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.622628 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"] Apr 28 20:10:35.628189 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.628168 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4hvs"] Apr 28 20:10:35.946406 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:10:35.946372 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59778918-1296-4825-8193-240d53917f27" path="/var/lib/kubelet/pods/59778918-1296-4825-8193-240d53917f27/volumes" Apr 28 20:11:24.075561 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:24.075530 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 
20:11:24.082938 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:24.082911 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 20:11:48.941307 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.941216 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2"] Apr 28 20:11:48.941789 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.941687 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc70ef47-0fc1-4677-a17a-e92cd4c8016a" containerName="storage-initializer" Apr 28 20:11:48.941789 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.941707 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc70ef47-0fc1-4677-a17a-e92cd4c8016a" containerName="storage-initializer" Apr 28 20:11:48.941789 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.941727 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59778918-1296-4825-8193-240d53917f27" containerName="kube-rbac-proxy" Apr 28 20:11:48.941789 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.941735 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="59778918-1296-4825-8193-240d53917f27" containerName="kube-rbac-proxy" Apr 28 20:11:48.941789 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.941767 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59778918-1296-4825-8193-240d53917f27" containerName="storage-initializer" Apr 28 20:11:48.941789 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.941776 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="59778918-1296-4825-8193-240d53917f27" containerName="storage-initializer" Apr 28 20:11:48.941789 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.941787 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="bc70ef47-0fc1-4677-a17a-e92cd4c8016a" containerName="kserve-container" Apr 28 20:11:48.942164 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.941796 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc70ef47-0fc1-4677-a17a-e92cd4c8016a" containerName="kserve-container" Apr 28 20:11:48.942164 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.941805 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc70ef47-0fc1-4677-a17a-e92cd4c8016a" containerName="kube-rbac-proxy" Apr 28 20:11:48.942164 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.941813 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc70ef47-0fc1-4677-a17a-e92cd4c8016a" containerName="kube-rbac-proxy" Apr 28 20:11:48.942164 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.941822 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59778918-1296-4825-8193-240d53917f27" containerName="kserve-container" Apr 28 20:11:48.942164 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.941830 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="59778918-1296-4825-8193-240d53917f27" containerName="kserve-container" Apr 28 20:11:48.942164 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.941905 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="59778918-1296-4825-8193-240d53917f27" containerName="kube-rbac-proxy" Apr 28 20:11:48.942164 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.941920 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc70ef47-0fc1-4677-a17a-e92cd4c8016a" containerName="kserve-container" Apr 28 20:11:48.942164 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.941934 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc70ef47-0fc1-4677-a17a-e92cd4c8016a" containerName="kube-rbac-proxy" Apr 28 20:11:48.942164 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.941954 2571 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="59778918-1296-4825-8193-240d53917f27" containerName="kserve-container" Apr 28 20:11:48.945682 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.945655 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:11:48.948864 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.948844 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-44jch\"" Apr 28 20:11:48.948996 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.948977 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 28 20:11:48.949124 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.949108 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-predictor-serving-cert\"" Apr 28 20:11:48.949189 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.949177 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 28 20:11:48.949575 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.949547 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 28 20:11:48.955517 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:48.955494 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2"] Apr 28 20:11:49.017000 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:49.016966 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5667fb7-b49d-4574-99e7-586b6db40876-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2\" (UID: 
\"d5667fb7-b49d-4574-99e7-586b6db40876\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:11:49.017229 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:49.017200 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d5667fb7-b49d-4574-99e7-586b6db40876-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2\" (UID: \"d5667fb7-b49d-4574-99e7-586b6db40876\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:11:49.017323 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:49.017250 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xvml\" (UniqueName: \"kubernetes.io/projected/d5667fb7-b49d-4574-99e7-586b6db40876-kube-api-access-9xvml\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2\" (UID: \"d5667fb7-b49d-4574-99e7-586b6db40876\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:11:49.017323 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:49.017280 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5667fb7-b49d-4574-99e7-586b6db40876-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2\" (UID: \"d5667fb7-b49d-4574-99e7-586b6db40876\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:11:49.118669 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:49.118631 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5667fb7-b49d-4574-99e7-586b6db40876-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2\" (UID: 
\"d5667fb7-b49d-4574-99e7-586b6db40876\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:11:49.118848 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:49.118719 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d5667fb7-b49d-4574-99e7-586b6db40876-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2\" (UID: \"d5667fb7-b49d-4574-99e7-586b6db40876\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:11:49.118848 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:49.118744 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xvml\" (UniqueName: \"kubernetes.io/projected/d5667fb7-b49d-4574-99e7-586b6db40876-kube-api-access-9xvml\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2\" (UID: \"d5667fb7-b49d-4574-99e7-586b6db40876\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:11:49.118848 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:49.118762 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5667fb7-b49d-4574-99e7-586b6db40876-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2\" (UID: \"d5667fb7-b49d-4574-99e7-586b6db40876\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:11:49.119226 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:49.119201 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5667fb7-b49d-4574-99e7-586b6db40876-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2\" (UID: \"d5667fb7-b49d-4574-99e7-586b6db40876\") 
" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:11:49.119442 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:49.119419 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d5667fb7-b49d-4574-99e7-586b6db40876-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2\" (UID: \"d5667fb7-b49d-4574-99e7-586b6db40876\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:11:49.121025 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:49.121008 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5667fb7-b49d-4574-99e7-586b6db40876-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2\" (UID: \"d5667fb7-b49d-4574-99e7-586b6db40876\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:11:49.126526 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:49.126502 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xvml\" (UniqueName: \"kubernetes.io/projected/d5667fb7-b49d-4574-99e7-586b6db40876-kube-api-access-9xvml\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2\" (UID: \"d5667fb7-b49d-4574-99e7-586b6db40876\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:11:49.256777 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:49.256683 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:11:49.380753 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:49.380622 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2"] Apr 28 20:11:49.383402 ip-10-0-139-128 kubenswrapper[2571]: W0428 20:11:49.383373 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5667fb7_b49d_4574_99e7_586b6db40876.slice/crio-fa69c7db9365598ba4c6a91ec92a6d92cb9fb3e8f59a31d55c1026955268adb3 WatchSource:0}: Error finding container fa69c7db9365598ba4c6a91ec92a6d92cb9fb3e8f59a31d55c1026955268adb3: Status 404 returned error can't find the container with id fa69c7db9365598ba4c6a91ec92a6d92cb9fb3e8f59a31d55c1026955268adb3 Apr 28 20:11:49.522785 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:49.522696 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" event={"ID":"d5667fb7-b49d-4574-99e7-586b6db40876","Type":"ContainerStarted","Data":"963346cc963563727811a03375f74b26238c8c99f3f80228e5c470e61f7bc6f5"} Apr 28 20:11:49.522785 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:49.522735 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" event={"ID":"d5667fb7-b49d-4574-99e7-586b6db40876","Type":"ContainerStarted","Data":"fa69c7db9365598ba4c6a91ec92a6d92cb9fb3e8f59a31d55c1026955268adb3"} Apr 28 20:11:53.535829 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:53.535779 2571 generic.go:358] "Generic (PLEG): container finished" podID="d5667fb7-b49d-4574-99e7-586b6db40876" containerID="963346cc963563727811a03375f74b26238c8c99f3f80228e5c470e61f7bc6f5" exitCode=0 Apr 28 20:11:53.536202 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:53.535849 2571 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" event={"ID":"d5667fb7-b49d-4574-99e7-586b6db40876","Type":"ContainerDied","Data":"963346cc963563727811a03375f74b26238c8c99f3f80228e5c470e61f7bc6f5"} Apr 28 20:11:54.540783 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:54.540746 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" event={"ID":"d5667fb7-b49d-4574-99e7-586b6db40876","Type":"ContainerStarted","Data":"18b0b1743a6bb68c7c70c453f0810dd566bd2add599c7c494ce0273c46d26378"} Apr 28 20:11:54.540783 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:54.540782 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" event={"ID":"d5667fb7-b49d-4574-99e7-586b6db40876","Type":"ContainerStarted","Data":"d2422ebae243a49612d6d8e435e2110af9b3a65a927826490da5d68acd0bb949"} Apr 28 20:11:54.541306 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:54.540992 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:11:54.541306 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:54.541021 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:11:54.560552 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:11:54.560507 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" podStartSLOduration=6.5604753129999995 podStartE2EDuration="6.560475313s" podCreationTimestamp="2026-04-28 20:11:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:11:54.559720234 +0000 UTC m=+3331.190621179" 
watchObservedRunningTime="2026-04-28 20:11:54.560475313 +0000 UTC m=+3331.191376258" Apr 28 20:12:00.548684 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:00.548655 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:12:30.587200 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:30.587150 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" podUID="d5667fb7-b49d-4574-99e7-586b6db40876" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 28 20:12:40.551611 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:40.551577 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:12:48.928757 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:48.928725 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2"] Apr 28 20:12:48.929136 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:48.929037 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" podUID="d5667fb7-b49d-4574-99e7-586b6db40876" containerName="kserve-container" containerID="cri-o://d2422ebae243a49612d6d8e435e2110af9b3a65a927826490da5d68acd0bb949" gracePeriod=30 Apr 28 20:12:48.929136 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:48.929065 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" podUID="d5667fb7-b49d-4574-99e7-586b6db40876" containerName="kube-rbac-proxy" containerID="cri-o://18b0b1743a6bb68c7c70c453f0810dd566bd2add599c7c494ce0273c46d26378" gracePeriod=30 Apr 28 20:12:49.708116 ip-10-0-139-128 
kubenswrapper[2571]: I0428 20:12:49.708081 2571 generic.go:358] "Generic (PLEG): container finished" podID="d5667fb7-b49d-4574-99e7-586b6db40876" containerID="18b0b1743a6bb68c7c70c453f0810dd566bd2add599c7c494ce0273c46d26378" exitCode=2 Apr 28 20:12:49.708301 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:49.708156 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" event={"ID":"d5667fb7-b49d-4574-99e7-586b6db40876","Type":"ContainerDied","Data":"18b0b1743a6bb68c7c70c453f0810dd566bd2add599c7c494ce0273c46d26378"} Apr 28 20:12:50.545075 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:50.545021 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" podUID="d5667fb7-b49d-4574-99e7-586b6db40876" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.50:8643/healthz\": dial tcp 10.134.0.50:8643: connect: connection refused" Apr 28 20:12:51.589793 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:51.589683 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" podUID="d5667fb7-b49d-4574-99e7-586b6db40876" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.50:8080/v2/models/isvc-xgboost-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 28 20:12:55.545112 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:55.545063 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" podUID="d5667fb7-b49d-4574-99e7-586b6db40876" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.50:8643/healthz\": dial tcp 10.134.0.50:8643: connect: connection refused" Apr 28 20:12:56.577437 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.577413 2571 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:12:56.683897 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.683865 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d5667fb7-b49d-4574-99e7-586b6db40876-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"d5667fb7-b49d-4574-99e7-586b6db40876\" (UID: \"d5667fb7-b49d-4574-99e7-586b6db40876\") " Apr 28 20:12:56.684086 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.683936 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xvml\" (UniqueName: \"kubernetes.io/projected/d5667fb7-b49d-4574-99e7-586b6db40876-kube-api-access-9xvml\") pod \"d5667fb7-b49d-4574-99e7-586b6db40876\" (UID: \"d5667fb7-b49d-4574-99e7-586b6db40876\") " Apr 28 20:12:56.684086 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.683977 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5667fb7-b49d-4574-99e7-586b6db40876-proxy-tls\") pod \"d5667fb7-b49d-4574-99e7-586b6db40876\" (UID: \"d5667fb7-b49d-4574-99e7-586b6db40876\") " Apr 28 20:12:56.684086 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.684013 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5667fb7-b49d-4574-99e7-586b6db40876-kserve-provision-location\") pod \"d5667fb7-b49d-4574-99e7-586b6db40876\" (UID: \"d5667fb7-b49d-4574-99e7-586b6db40876\") " Apr 28 20:12:56.684321 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.684215 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d5667fb7-b49d-4574-99e7-586b6db40876-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config") pod "d5667fb7-b49d-4574-99e7-586b6db40876" (UID: "d5667fb7-b49d-4574-99e7-586b6db40876"). InnerVolumeSpecName "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:12:56.684377 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.684327 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5667fb7-b49d-4574-99e7-586b6db40876-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d5667fb7-b49d-4574-99e7-586b6db40876" (UID: "d5667fb7-b49d-4574-99e7-586b6db40876"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:12:56.686211 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.686186 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5667fb7-b49d-4574-99e7-586b6db40876-kube-api-access-9xvml" (OuterVolumeSpecName: "kube-api-access-9xvml") pod "d5667fb7-b49d-4574-99e7-586b6db40876" (UID: "d5667fb7-b49d-4574-99e7-586b6db40876"). InnerVolumeSpecName "kube-api-access-9xvml". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:12:56.686302 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.686188 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5667fb7-b49d-4574-99e7-586b6db40876-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d5667fb7-b49d-4574-99e7-586b6db40876" (UID: "d5667fb7-b49d-4574-99e7-586b6db40876"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:12:56.729807 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.729772 2571 generic.go:358] "Generic (PLEG): container finished" podID="d5667fb7-b49d-4574-99e7-586b6db40876" containerID="d2422ebae243a49612d6d8e435e2110af9b3a65a927826490da5d68acd0bb949" exitCode=0 Apr 28 20:12:56.729987 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.729861 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" event={"ID":"d5667fb7-b49d-4574-99e7-586b6db40876","Type":"ContainerDied","Data":"d2422ebae243a49612d6d8e435e2110af9b3a65a927826490da5d68acd0bb949"} Apr 28 20:12:56.729987 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.729894 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" Apr 28 20:12:56.729987 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.729902 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2" event={"ID":"d5667fb7-b49d-4574-99e7-586b6db40876","Type":"ContainerDied","Data":"fa69c7db9365598ba4c6a91ec92a6d92cb9fb3e8f59a31d55c1026955268adb3"} Apr 28 20:12:56.729987 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.729919 2571 scope.go:117] "RemoveContainer" containerID="18b0b1743a6bb68c7c70c453f0810dd566bd2add599c7c494ce0273c46d26378" Apr 28 20:12:56.738715 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.738695 2571 scope.go:117] "RemoveContainer" containerID="d2422ebae243a49612d6d8e435e2110af9b3a65a927826490da5d68acd0bb949" Apr 28 20:12:56.745857 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.745841 2571 scope.go:117] "RemoveContainer" containerID="963346cc963563727811a03375f74b26238c8c99f3f80228e5c470e61f7bc6f5" Apr 28 20:12:56.751606 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.751581 2571 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2"] Apr 28 20:12:56.753885 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.753868 2571 scope.go:117] "RemoveContainer" containerID="18b0b1743a6bb68c7c70c453f0810dd566bd2add599c7c494ce0273c46d26378" Apr 28 20:12:56.754182 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:12:56.754145 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18b0b1743a6bb68c7c70c453f0810dd566bd2add599c7c494ce0273c46d26378\": container with ID starting with 18b0b1743a6bb68c7c70c453f0810dd566bd2add599c7c494ce0273c46d26378 not found: ID does not exist" containerID="18b0b1743a6bb68c7c70c453f0810dd566bd2add599c7c494ce0273c46d26378" Apr 28 20:12:56.754255 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.754183 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18b0b1743a6bb68c7c70c453f0810dd566bd2add599c7c494ce0273c46d26378"} err="failed to get container status \"18b0b1743a6bb68c7c70c453f0810dd566bd2add599c7c494ce0273c46d26378\": rpc error: code = NotFound desc = could not find container \"18b0b1743a6bb68c7c70c453f0810dd566bd2add599c7c494ce0273c46d26378\": container with ID starting with 18b0b1743a6bb68c7c70c453f0810dd566bd2add599c7c494ce0273c46d26378 not found: ID does not exist" Apr 28 20:12:56.754255 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.754201 2571 scope.go:117] "RemoveContainer" containerID="d2422ebae243a49612d6d8e435e2110af9b3a65a927826490da5d68acd0bb949" Apr 28 20:12:56.754455 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:12:56.754436 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2422ebae243a49612d6d8e435e2110af9b3a65a927826490da5d68acd0bb949\": container with ID starting with d2422ebae243a49612d6d8e435e2110af9b3a65a927826490da5d68acd0bb949 not found: ID does not exist" 
containerID="d2422ebae243a49612d6d8e435e2110af9b3a65a927826490da5d68acd0bb949" Apr 28 20:12:56.754535 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.754462 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2422ebae243a49612d6d8e435e2110af9b3a65a927826490da5d68acd0bb949"} err="failed to get container status \"d2422ebae243a49612d6d8e435e2110af9b3a65a927826490da5d68acd0bb949\": rpc error: code = NotFound desc = could not find container \"d2422ebae243a49612d6d8e435e2110af9b3a65a927826490da5d68acd0bb949\": container with ID starting with d2422ebae243a49612d6d8e435e2110af9b3a65a927826490da5d68acd0bb949 not found: ID does not exist" Apr 28 20:12:56.754535 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.754499 2571 scope.go:117] "RemoveContainer" containerID="963346cc963563727811a03375f74b26238c8c99f3f80228e5c470e61f7bc6f5" Apr 28 20:12:56.754769 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:12:56.754752 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"963346cc963563727811a03375f74b26238c8c99f3f80228e5c470e61f7bc6f5\": container with ID starting with 963346cc963563727811a03375f74b26238c8c99f3f80228e5c470e61f7bc6f5 not found: ID does not exist" containerID="963346cc963563727811a03375f74b26238c8c99f3f80228e5c470e61f7bc6f5" Apr 28 20:12:56.754843 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.754771 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963346cc963563727811a03375f74b26238c8c99f3f80228e5c470e61f7bc6f5"} err="failed to get container status \"963346cc963563727811a03375f74b26238c8c99f3f80228e5c470e61f7bc6f5\": rpc error: code = NotFound desc = could not find container \"963346cc963563727811a03375f74b26238c8c99f3f80228e5c470e61f7bc6f5\": container with ID starting with 963346cc963563727811a03375f74b26238c8c99f3f80228e5c470e61f7bc6f5 not found: ID does not exist" Apr 28 
20:12:56.754937 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.754921 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-sp5z2"] Apr 28 20:12:56.785203 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.785174 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d5667fb7-b49d-4574-99e7-586b6db40876-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:12:56.785203 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.785198 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9xvml\" (UniqueName: \"kubernetes.io/projected/d5667fb7-b49d-4574-99e7-586b6db40876-kube-api-access-9xvml\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:12:56.785351 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.785209 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5667fb7-b49d-4574-99e7-586b6db40876-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:12:56.785351 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:56.785221 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5667fb7-b49d-4574-99e7-586b6db40876-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:12:57.946749 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:12:57.946715 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5667fb7-b49d-4574-99e7-586b6db40876" path="/var/lib/kubelet/pods/d5667fb7-b49d-4574-99e7-586b6db40876/volumes" Apr 28 20:14:09.186737 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.186703 2571 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj"] Apr 28 20:14:09.187179 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.187036 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5667fb7-b49d-4574-99e7-586b6db40876" containerName="kube-rbac-proxy" Apr 28 20:14:09.187179 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.187048 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5667fb7-b49d-4574-99e7-586b6db40876" containerName="kube-rbac-proxy" Apr 28 20:14:09.187179 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.187057 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5667fb7-b49d-4574-99e7-586b6db40876" containerName="kserve-container" Apr 28 20:14:09.187179 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.187063 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5667fb7-b49d-4574-99e7-586b6db40876" containerName="kserve-container" Apr 28 20:14:09.187179 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.187079 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5667fb7-b49d-4574-99e7-586b6db40876" containerName="storage-initializer" Apr 28 20:14:09.187179 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.187084 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5667fb7-b49d-4574-99e7-586b6db40876" containerName="storage-initializer" Apr 28 20:14:09.187179 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.187128 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5667fb7-b49d-4574-99e7-586b6db40876" containerName="kserve-container" Apr 28 20:14:09.187179 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.187138 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5667fb7-b49d-4574-99e7-586b6db40876" containerName="kube-rbac-proxy" Apr 28 20:14:09.189570 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.189552 2571 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:14:09.191620 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.191596 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 28 20:14:09.192404 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.192384 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 28 20:14:09.192517 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.192470 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-predictor-serving-cert\"" Apr 28 20:14:09.192517 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.192504 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 28 20:14:09.192607 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.192532 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-44jch\"" Apr 28 20:14:09.192647 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.192603 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-kube-rbac-proxy-sar-config\"" Apr 28 20:14:09.198186 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.198166 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj"] Apr 28 20:14:09.383875 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.383834 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/854a7aa2-bb56-41fd-b223-fa5c257929f2-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-584b446894-dhtlj\" (UID: 
\"854a7aa2-bb56-41fd-b223-fa5c257929f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:14:09.383875 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.383878 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/854a7aa2-bb56-41fd-b223-fa5c257929f2-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-584b446894-dhtlj\" (UID: \"854a7aa2-bb56-41fd-b223-fa5c257929f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:14:09.384092 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.384002 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp2ft\" (UniqueName: \"kubernetes.io/projected/854a7aa2-bb56-41fd-b223-fa5c257929f2-kube-api-access-hp2ft\") pod \"isvc-sklearn-s3-predictor-584b446894-dhtlj\" (UID: \"854a7aa2-bb56-41fd-b223-fa5c257929f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:14:09.384092 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.384039 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/854a7aa2-bb56-41fd-b223-fa5c257929f2-proxy-tls\") pod \"isvc-sklearn-s3-predictor-584b446894-dhtlj\" (UID: \"854a7aa2-bb56-41fd-b223-fa5c257929f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:14:09.484698 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.484614 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/854a7aa2-bb56-41fd-b223-fa5c257929f2-proxy-tls\") pod \"isvc-sklearn-s3-predictor-584b446894-dhtlj\" (UID: \"854a7aa2-bb56-41fd-b223-fa5c257929f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 
20:14:09.484698 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.484661 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/854a7aa2-bb56-41fd-b223-fa5c257929f2-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-584b446894-dhtlj\" (UID: \"854a7aa2-bb56-41fd-b223-fa5c257929f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:14:09.484886 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.484836 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/854a7aa2-bb56-41fd-b223-fa5c257929f2-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-584b446894-dhtlj\" (UID: \"854a7aa2-bb56-41fd-b223-fa5c257929f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:14:09.484943 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.484934 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hp2ft\" (UniqueName: \"kubernetes.io/projected/854a7aa2-bb56-41fd-b223-fa5c257929f2-kube-api-access-hp2ft\") pod \"isvc-sklearn-s3-predictor-584b446894-dhtlj\" (UID: \"854a7aa2-bb56-41fd-b223-fa5c257929f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:14:09.485087 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.485070 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/854a7aa2-bb56-41fd-b223-fa5c257929f2-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-584b446894-dhtlj\" (UID: \"854a7aa2-bb56-41fd-b223-fa5c257929f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:14:09.485455 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.485434 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/854a7aa2-bb56-41fd-b223-fa5c257929f2-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-584b446894-dhtlj\" (UID: \"854a7aa2-bb56-41fd-b223-fa5c257929f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:14:09.487083 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.487055 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/854a7aa2-bb56-41fd-b223-fa5c257929f2-proxy-tls\") pod \"isvc-sklearn-s3-predictor-584b446894-dhtlj\" (UID: \"854a7aa2-bb56-41fd-b223-fa5c257929f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:14:09.492633 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.492610 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp2ft\" (UniqueName: \"kubernetes.io/projected/854a7aa2-bb56-41fd-b223-fa5c257929f2-kube-api-access-hp2ft\") pod \"isvc-sklearn-s3-predictor-584b446894-dhtlj\" (UID: \"854a7aa2-bb56-41fd-b223-fa5c257929f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:14:09.500886 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.500869 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:14:09.623978 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.623950 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj"] Apr 28 20:14:09.626662 ip-10-0-139-128 kubenswrapper[2571]: W0428 20:14:09.626632 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod854a7aa2_bb56_41fd_b223_fa5c257929f2.slice/crio-91027cf727801fe50318a41eb180cbadba002a10ee6616f252dbb6ef985efef0 WatchSource:0}: Error finding container 91027cf727801fe50318a41eb180cbadba002a10ee6616f252dbb6ef985efef0: Status 404 returned error can't find the container with id 91027cf727801fe50318a41eb180cbadba002a10ee6616f252dbb6ef985efef0 Apr 28 20:14:09.628897 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.628879 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 20:14:09.938202 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.938161 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" event={"ID":"854a7aa2-bb56-41fd-b223-fa5c257929f2","Type":"ContainerStarted","Data":"64bdec30fe44c79e50a3f0bd5090093ebed3b8841ffd09dca55456c33e91026c"} Apr 28 20:14:09.938202 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:09.938198 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" event={"ID":"854a7aa2-bb56-41fd-b223-fa5c257929f2","Type":"ContainerStarted","Data":"91027cf727801fe50318a41eb180cbadba002a10ee6616f252dbb6ef985efef0"} Apr 28 20:14:10.943571 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:10.943538 2571 generic.go:358] "Generic (PLEG): container finished" podID="854a7aa2-bb56-41fd-b223-fa5c257929f2" 
containerID="64bdec30fe44c79e50a3f0bd5090093ebed3b8841ffd09dca55456c33e91026c" exitCode=0 Apr 28 20:14:10.943951 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:10.943620 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" event={"ID":"854a7aa2-bb56-41fd-b223-fa5c257929f2","Type":"ContainerDied","Data":"64bdec30fe44c79e50a3f0bd5090093ebed3b8841ffd09dca55456c33e91026c"} Apr 28 20:14:11.948491 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:11.948452 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" event={"ID":"854a7aa2-bb56-41fd-b223-fa5c257929f2","Type":"ContainerStarted","Data":"4663274548fcbfd6938f2aac91bc33f04d8a4bbb6fd86cb00c09fa7458a0adeb"} Apr 28 20:14:11.948896 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:11.948507 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" event={"ID":"854a7aa2-bb56-41fd-b223-fa5c257929f2","Type":"ContainerStarted","Data":"320259f94cf3818e6fbcd9b1f41098bf980e26a96c5c11fff5d672c854d7b188"} Apr 28 20:14:11.948896 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:11.948764 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:14:11.968385 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:11.968332 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" podStartSLOduration=2.968318193 podStartE2EDuration="2.968318193s" podCreationTimestamp="2026-04-28 20:14:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:14:11.96635686 +0000 UTC m=+3468.597257805" watchObservedRunningTime="2026-04-28 20:14:11.968318193 +0000 UTC m=+3468.599219138" 
Apr 28 20:14:12.951922 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:12.951890 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:14:12.953046 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:12.953020 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 28 20:14:13.954329 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:13.954291 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 28 20:14:18.958255 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:18.958222 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:14:18.958816 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:18.958789 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 28 20:14:28.959657 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:28.959617 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 28 20:14:38.959376 ip-10-0-139-128 kubenswrapper[2571]: 
I0428 20:14:38.959333 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 28 20:14:48.958973 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:48.958926 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 28 20:14:58.958811 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:14:58.958772 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 28 20:15:08.959423 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:08.959381 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 28 20:15:18.960212 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:18.960172 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:15:28.769547 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:28.769512 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj"] Apr 28 20:15:28.769956 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:28.769840 2571 kuberuntime_container.go:864] "Killing container with 
a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kserve-container" containerID="cri-o://320259f94cf3818e6fbcd9b1f41098bf980e26a96c5c11fff5d672c854d7b188" gracePeriod=30 Apr 28 20:15:28.769956 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:28.769887 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kube-rbac-proxy" containerID="cri-o://4663274548fcbfd6938f2aac91bc33f04d8a4bbb6fd86cb00c09fa7458a0adeb" gracePeriod=30 Apr 28 20:15:28.954719 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:28.954677 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.51:8643/healthz\": dial tcp 10.134.0.51:8643: connect: connection refused" Apr 28 20:15:28.959121 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:28.959087 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 28 20:15:29.182682 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.182647 2571 generic.go:358] "Generic (PLEG): container finished" podID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerID="4663274548fcbfd6938f2aac91bc33f04d8a4bbb6fd86cb00c09fa7458a0adeb" exitCode=2 Apr 28 20:15:29.182853 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.182722 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" 
event={"ID":"854a7aa2-bb56-41fd-b223-fa5c257929f2","Type":"ContainerDied","Data":"4663274548fcbfd6938f2aac91bc33f04d8a4bbb6fd86cb00c09fa7458a0adeb"} Apr 28 20:15:29.389792 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.389754 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4"] Apr 28 20:15:29.393117 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.393098 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:29.394994 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.394962 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\"" Apr 28 20:15:29.394994 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.394985 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 28 20:15:29.394994 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.394962 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-predictor-serving-cert\"" Apr 28 20:15:29.403838 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.403812 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4"] Apr 28 20:15:29.442736 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.442636 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:29.442736 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.442714 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wdcq\" (UniqueName: \"kubernetes.io/projected/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-kube-api-access-4wdcq\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:29.442935 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.442743 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:29.442935 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.442780 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:29.442935 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.442857 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:29.544114 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.544073 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wdcq\" (UniqueName: \"kubernetes.io/projected/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-kube-api-access-4wdcq\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:29.544114 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.544117 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:29.544377 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.544145 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:29.544377 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.544172 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:29.544377 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.544229 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:29.544660 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.544632 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:29.544919 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.544896 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:29.544992 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.544951 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:29.546615 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.546594 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:29.551738 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.551710 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wdcq\" (UniqueName: \"kubernetes.io/projected/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-kube-api-access-4wdcq\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:29.703389 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.703293 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:29.832897 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:29.832641 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4"] Apr 28 20:15:29.835560 ip-10-0-139-128 kubenswrapper[2571]: W0428 20:15:29.835523 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5decf8d_6dcd_4c7c_9a7e_70651aa9dd5b.slice/crio-0589e1a5a639e263e37f8fa372c385f4476fd064e443dc8b5c76a5fa03095667 WatchSource:0}: Error finding container 0589e1a5a639e263e37f8fa372c385f4476fd064e443dc8b5c76a5fa03095667: Status 404 returned error can't find the container with id 0589e1a5a639e263e37f8fa372c385f4476fd064e443dc8b5c76a5fa03095667 Apr 28 20:15:30.188065 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:30.188013 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" event={"ID":"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b","Type":"ContainerStarted","Data":"d3242fe961cd1b98c5b69213f522e8288b86f33d272d59903744ad0ab77d1553"} Apr 28 20:15:30.188065 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:30.188064 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" event={"ID":"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b","Type":"ContainerStarted","Data":"0589e1a5a639e263e37f8fa372c385f4476fd064e443dc8b5c76a5fa03095667"} Apr 28 20:15:31.192847 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:31.192814 2571 generic.go:358] "Generic (PLEG): container finished" podID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerID="d3242fe961cd1b98c5b69213f522e8288b86f33d272d59903744ad0ab77d1553" exitCode=0 Apr 28 20:15:31.193239 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:31.192897 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" event={"ID":"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b","Type":"ContainerDied","Data":"d3242fe961cd1b98c5b69213f522e8288b86f33d272d59903744ad0ab77d1553"} Apr 28 20:15:32.203109 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:32.203069 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" event={"ID":"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b","Type":"ContainerStarted","Data":"899be02d4e2fbd4e4e9e17a6b5371389409b27c3aac0cf4e77e6ffb5e3b209f8"} Apr 28 20:15:32.203109 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:32.203111 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" event={"ID":"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b","Type":"ContainerStarted","Data":"5a325988850e4626cef68a4d2230cf13d94a0e84ac0a668d2921b56b65e16d85"} Apr 28 20:15:32.203660 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:32.203274 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:32.203660 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:32.203292 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:32.204525 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:32.204474 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 28 20:15:32.222314 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:32.222247 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" podStartSLOduration=3.222228409 podStartE2EDuration="3.222228409s" podCreationTimestamp="2026-04-28 20:15:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:15:32.220338581 +0000 UTC m=+3548.851239526" watchObservedRunningTime="2026-04-28 20:15:32.222228409 +0000 UTC m=+3548.853129355" Apr 28 20:15:33.206400 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:33.206353 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 28 20:15:33.615583 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:33.615557 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:15:33.681136 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:33.681095 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/854a7aa2-bb56-41fd-b223-fa5c257929f2-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"854a7aa2-bb56-41fd-b223-fa5c257929f2\" (UID: \"854a7aa2-bb56-41fd-b223-fa5c257929f2\") " Apr 28 20:15:33.681136 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:33.681146 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/854a7aa2-bb56-41fd-b223-fa5c257929f2-proxy-tls\") pod \"854a7aa2-bb56-41fd-b223-fa5c257929f2\" (UID: \"854a7aa2-bb56-41fd-b223-fa5c257929f2\") " Apr 28 20:15:33.681461 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:33.681183 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp2ft\" (UniqueName: \"kubernetes.io/projected/854a7aa2-bb56-41fd-b223-fa5c257929f2-kube-api-access-hp2ft\") pod \"854a7aa2-bb56-41fd-b223-fa5c257929f2\" (UID: \"854a7aa2-bb56-41fd-b223-fa5c257929f2\") " Apr 28 20:15:33.681461 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:33.681221 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/854a7aa2-bb56-41fd-b223-fa5c257929f2-kserve-provision-location\") pod \"854a7aa2-bb56-41fd-b223-fa5c257929f2\" (UID: \"854a7aa2-bb56-41fd-b223-fa5c257929f2\") " Apr 28 20:15:33.681674 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:33.681649 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/854a7aa2-bb56-41fd-b223-fa5c257929f2-isvc-sklearn-s3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-kube-rbac-proxy-sar-config") 
pod "854a7aa2-bb56-41fd-b223-fa5c257929f2" (UID: "854a7aa2-bb56-41fd-b223-fa5c257929f2"). InnerVolumeSpecName "isvc-sklearn-s3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:15:33.681733 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:33.681657 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/854a7aa2-bb56-41fd-b223-fa5c257929f2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "854a7aa2-bb56-41fd-b223-fa5c257929f2" (UID: "854a7aa2-bb56-41fd-b223-fa5c257929f2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:15:33.683345 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:33.683324 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/854a7aa2-bb56-41fd-b223-fa5c257929f2-kube-api-access-hp2ft" (OuterVolumeSpecName: "kube-api-access-hp2ft") pod "854a7aa2-bb56-41fd-b223-fa5c257929f2" (UID: "854a7aa2-bb56-41fd-b223-fa5c257929f2"). InnerVolumeSpecName "kube-api-access-hp2ft". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:15:33.683433 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:33.683354 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/854a7aa2-bb56-41fd-b223-fa5c257929f2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "854a7aa2-bb56-41fd-b223-fa5c257929f2" (UID: "854a7aa2-bb56-41fd-b223-fa5c257929f2"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:15:33.782189 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:33.782100 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hp2ft\" (UniqueName: \"kubernetes.io/projected/854a7aa2-bb56-41fd-b223-fa5c257929f2-kube-api-access-hp2ft\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:15:33.782189 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:33.782132 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/854a7aa2-bb56-41fd-b223-fa5c257929f2-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:15:33.782189 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:33.782143 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/854a7aa2-bb56-41fd-b223-fa5c257929f2-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:15:33.782189 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:33.782154 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/854a7aa2-bb56-41fd-b223-fa5c257929f2-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:15:34.210931 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:34.210897 2571 generic.go:358] "Generic (PLEG): container finished" podID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerID="320259f94cf3818e6fbcd9b1f41098bf980e26a96c5c11fff5d672c854d7b188" exitCode=0 Apr 28 20:15:34.211422 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:34.210980 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" 
event={"ID":"854a7aa2-bb56-41fd-b223-fa5c257929f2","Type":"ContainerDied","Data":"320259f94cf3818e6fbcd9b1f41098bf980e26a96c5c11fff5d672c854d7b188"} Apr 28 20:15:34.211422 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:34.210985 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" Apr 28 20:15:34.211422 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:34.211012 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj" event={"ID":"854a7aa2-bb56-41fd-b223-fa5c257929f2","Type":"ContainerDied","Data":"91027cf727801fe50318a41eb180cbadba002a10ee6616f252dbb6ef985efef0"} Apr 28 20:15:34.211422 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:34.211030 2571 scope.go:117] "RemoveContainer" containerID="4663274548fcbfd6938f2aac91bc33f04d8a4bbb6fd86cb00c09fa7458a0adeb" Apr 28 20:15:34.219169 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:34.219151 2571 scope.go:117] "RemoveContainer" containerID="320259f94cf3818e6fbcd9b1f41098bf980e26a96c5c11fff5d672c854d7b188" Apr 28 20:15:34.225859 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:34.225832 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj"] Apr 28 20:15:34.227831 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:34.227659 2571 scope.go:117] "RemoveContainer" containerID="64bdec30fe44c79e50a3f0bd5090093ebed3b8841ffd09dca55456c33e91026c" Apr 28 20:15:34.229626 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:34.229604 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-584b446894-dhtlj"] Apr 28 20:15:34.235605 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:34.235587 2571 scope.go:117] "RemoveContainer" containerID="4663274548fcbfd6938f2aac91bc33f04d8a4bbb6fd86cb00c09fa7458a0adeb" Apr 28 20:15:34.235892 ip-10-0-139-128 kubenswrapper[2571]: 
E0428 20:15:34.235870 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4663274548fcbfd6938f2aac91bc33f04d8a4bbb6fd86cb00c09fa7458a0adeb\": container with ID starting with 4663274548fcbfd6938f2aac91bc33f04d8a4bbb6fd86cb00c09fa7458a0adeb not found: ID does not exist" containerID="4663274548fcbfd6938f2aac91bc33f04d8a4bbb6fd86cb00c09fa7458a0adeb" Apr 28 20:15:34.235936 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:34.235911 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4663274548fcbfd6938f2aac91bc33f04d8a4bbb6fd86cb00c09fa7458a0adeb"} err="failed to get container status \"4663274548fcbfd6938f2aac91bc33f04d8a4bbb6fd86cb00c09fa7458a0adeb\": rpc error: code = NotFound desc = could not find container \"4663274548fcbfd6938f2aac91bc33f04d8a4bbb6fd86cb00c09fa7458a0adeb\": container with ID starting with 4663274548fcbfd6938f2aac91bc33f04d8a4bbb6fd86cb00c09fa7458a0adeb not found: ID does not exist" Apr 28 20:15:34.235936 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:34.235931 2571 scope.go:117] "RemoveContainer" containerID="320259f94cf3818e6fbcd9b1f41098bf980e26a96c5c11fff5d672c854d7b188" Apr 28 20:15:34.236195 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:15:34.236175 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"320259f94cf3818e6fbcd9b1f41098bf980e26a96c5c11fff5d672c854d7b188\": container with ID starting with 320259f94cf3818e6fbcd9b1f41098bf980e26a96c5c11fff5d672c854d7b188 not found: ID does not exist" containerID="320259f94cf3818e6fbcd9b1f41098bf980e26a96c5c11fff5d672c854d7b188" Apr 28 20:15:34.236229 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:34.236204 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320259f94cf3818e6fbcd9b1f41098bf980e26a96c5c11fff5d672c854d7b188"} err="failed to get container 
status \"320259f94cf3818e6fbcd9b1f41098bf980e26a96c5c11fff5d672c854d7b188\": rpc error: code = NotFound desc = could not find container \"320259f94cf3818e6fbcd9b1f41098bf980e26a96c5c11fff5d672c854d7b188\": container with ID starting with 320259f94cf3818e6fbcd9b1f41098bf980e26a96c5c11fff5d672c854d7b188 not found: ID does not exist" Apr 28 20:15:34.236229 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:34.236221 2571 scope.go:117] "RemoveContainer" containerID="64bdec30fe44c79e50a3f0bd5090093ebed3b8841ffd09dca55456c33e91026c" Apr 28 20:15:34.236413 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:15:34.236399 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64bdec30fe44c79e50a3f0bd5090093ebed3b8841ffd09dca55456c33e91026c\": container with ID starting with 64bdec30fe44c79e50a3f0bd5090093ebed3b8841ffd09dca55456c33e91026c not found: ID does not exist" containerID="64bdec30fe44c79e50a3f0bd5090093ebed3b8841ffd09dca55456c33e91026c" Apr 28 20:15:34.236451 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:34.236418 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64bdec30fe44c79e50a3f0bd5090093ebed3b8841ffd09dca55456c33e91026c"} err="failed to get container status \"64bdec30fe44c79e50a3f0bd5090093ebed3b8841ffd09dca55456c33e91026c\": rpc error: code = NotFound desc = could not find container \"64bdec30fe44c79e50a3f0bd5090093ebed3b8841ffd09dca55456c33e91026c\": container with ID starting with 64bdec30fe44c79e50a3f0bd5090093ebed3b8841ffd09dca55456c33e91026c not found: ID does not exist" Apr 28 20:15:35.946854 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:35.946820 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" path="/var/lib/kubelet/pods/854a7aa2-bb56-41fd-b223-fa5c257929f2/volumes" Apr 28 20:15:38.211354 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:38.211316 2571 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:15:38.211880 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:38.211848 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 28 20:15:48.212201 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:48.212154 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 28 20:15:58.212752 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:15:58.212707 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 28 20:16:08.212387 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:08.212323 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 28 20:16:18.211917 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:18.211875 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 28 20:16:24.101119 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:24.101089 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 20:16:24.108564 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:24.108535 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log" Apr 28 20:16:28.212194 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:28.212153 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 28 20:16:38.212772 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:38.212738 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:16:39.018273 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.018228 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4"] Apr 28 20:16:39.018618 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.018560 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="kserve-container" containerID="cri-o://5a325988850e4626cef68a4d2230cf13d94a0e84ac0a668d2921b56b65e16d85" gracePeriod=30 Apr 28 20:16:39.018779 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.018613 2571 kuberuntime_container.go:864] "Killing container with 
a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="kube-rbac-proxy" containerID="cri-o://899be02d4e2fbd4e4e9e17a6b5371389409b27c3aac0cf4e77e6ffb5e3b209f8" gracePeriod=30 Apr 28 20:16:39.412239 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.412208 2571 generic.go:358] "Generic (PLEG): container finished" podID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerID="899be02d4e2fbd4e4e9e17a6b5371389409b27c3aac0cf4e77e6ffb5e3b209f8" exitCode=2 Apr 28 20:16:39.412676 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.412292 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" event={"ID":"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b","Type":"ContainerDied","Data":"899be02d4e2fbd4e4e9e17a6b5371389409b27c3aac0cf4e77e6ffb5e3b209f8"} Apr 28 20:16:39.952257 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.952224 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll"] Apr 28 20:16:39.952572 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.952558 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kserve-container" Apr 28 20:16:39.952627 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.952574 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kserve-container" Apr 28 20:16:39.952627 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.952587 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="storage-initializer" Apr 28 20:16:39.952627 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.952593 2571 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="storage-initializer" Apr 28 20:16:39.952627 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.952600 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kube-rbac-proxy" Apr 28 20:16:39.952627 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.952605 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kube-rbac-proxy" Apr 28 20:16:39.952791 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.952662 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kube-rbac-proxy" Apr 28 20:16:39.952791 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.952670 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="854a7aa2-bb56-41fd-b223-fa5c257929f2" containerName="kserve-container" Apr 28 20:16:39.955714 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.955695 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" Apr 28 20:16:39.957550 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.957527 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-predictor-serving-cert\"" Apr 28 20:16:39.957682 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.957527 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\"" Apr 28 20:16:39.965072 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:39.965046 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll"] Apr 28 20:16:40.042017 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:40.041969 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxkl5\" (UniqueName: \"kubernetes.io/projected/0cec68ac-098a-4035-973d-4ae9e7f46cce-kube-api-access-zxkl5\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll\" (UID: \"0cec68ac-098a-4035-973d-4ae9e7f46cce\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" Apr 28 20:16:40.042206 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:40.042109 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cec68ac-098a-4035-973d-4ae9e7f46cce-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll\" (UID: \"0cec68ac-098a-4035-973d-4ae9e7f46cce\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" Apr 28 20:16:40.042206 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:40.042175 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0cec68ac-098a-4035-973d-4ae9e7f46cce-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll\" (UID: \"0cec68ac-098a-4035-973d-4ae9e7f46cce\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" Apr 28 20:16:40.042293 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:40.042232 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0cec68ac-098a-4035-973d-4ae9e7f46cce-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll\" (UID: \"0cec68ac-098a-4035-973d-4ae9e7f46cce\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" Apr 28 20:16:40.142818 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:40.142779 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxkl5\" (UniqueName: \"kubernetes.io/projected/0cec68ac-098a-4035-973d-4ae9e7f46cce-kube-api-access-zxkl5\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll\" (UID: \"0cec68ac-098a-4035-973d-4ae9e7f46cce\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" Apr 28 20:16:40.143027 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:40.142842 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cec68ac-098a-4035-973d-4ae9e7f46cce-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll\" (UID: \"0cec68ac-098a-4035-973d-4ae9e7f46cce\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" Apr 28 20:16:40.143027 ip-10-0-139-128 kubenswrapper[2571]: 
I0428 20:16:40.142867 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0cec68ac-098a-4035-973d-4ae9e7f46cce-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll\" (UID: \"0cec68ac-098a-4035-973d-4ae9e7f46cce\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" Apr 28 20:16:40.143027 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:40.142894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0cec68ac-098a-4035-973d-4ae9e7f46cce-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll\" (UID: \"0cec68ac-098a-4035-973d-4ae9e7f46cce\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" Apr 28 20:16:40.143027 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:16:40.142984 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-global-fail-predictor-serving-cert" not found Apr 28 20:16:40.143232 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:16:40.143044 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cec68ac-098a-4035-973d-4ae9e7f46cce-proxy-tls podName:0cec68ac-098a-4035-973d-4ae9e7f46cce nodeName:}" failed. No retries permitted until 2026-04-28 20:16:40.64302909 +0000 UTC m=+3617.273930012 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0cec68ac-098a-4035-973d-4ae9e7f46cce-proxy-tls") pod "isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" (UID: "0cec68ac-098a-4035-973d-4ae9e7f46cce") : secret "isvc-sklearn-s3-tls-global-fail-predictor-serving-cert" not found Apr 28 20:16:40.143327 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:40.143302 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cec68ac-098a-4035-973d-4ae9e7f46cce-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll\" (UID: \"0cec68ac-098a-4035-973d-4ae9e7f46cce\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" Apr 28 20:16:40.143625 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:40.143605 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0cec68ac-098a-4035-973d-4ae9e7f46cce-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll\" (UID: \"0cec68ac-098a-4035-973d-4ae9e7f46cce\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" Apr 28 20:16:40.162363 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:40.162332 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxkl5\" (UniqueName: \"kubernetes.io/projected/0cec68ac-098a-4035-973d-4ae9e7f46cce-kube-api-access-zxkl5\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll\" (UID: \"0cec68ac-098a-4035-973d-4ae9e7f46cce\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" Apr 28 20:16:40.648346 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:40.648303 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0cec68ac-098a-4035-973d-4ae9e7f46cce-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll\" (UID: \"0cec68ac-098a-4035-973d-4ae9e7f46cce\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" Apr 28 20:16:40.650895 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:40.650869 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0cec68ac-098a-4035-973d-4ae9e7f46cce-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll\" (UID: \"0cec68ac-098a-4035-973d-4ae9e7f46cce\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" Apr 28 20:16:40.866882 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:40.866834 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" Apr 28 20:16:40.993061 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:40.993023 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll"] Apr 28 20:16:40.996147 ip-10-0-139-128 kubenswrapper[2571]: W0428 20:16:40.996114 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cec68ac_098a_4035_973d_4ae9e7f46cce.slice/crio-f69c438681766078d65d0e851662929a10b803146529701733d4ae6cf9b8d56c WatchSource:0}: Error finding container f69c438681766078d65d0e851662929a10b803146529701733d4ae6cf9b8d56c: Status 404 returned error can't find the container with id f69c438681766078d65d0e851662929a10b803146529701733d4ae6cf9b8d56c Apr 28 20:16:41.419944 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:41.419908 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" event={"ID":"0cec68ac-098a-4035-973d-4ae9e7f46cce","Type":"ContainerStarted","Data":"e52f1e0a7f1d3fa4b42f52b5f6fcb4ab20884c85ea1efb9547f69f0bbb2b7121"} Apr 28 20:16:41.419944 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:41.419942 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" event={"ID":"0cec68ac-098a-4035-973d-4ae9e7f46cce","Type":"ContainerStarted","Data":"f69c438681766078d65d0e851662929a10b803146529701733d4ae6cf9b8d56c"} Apr 28 20:16:43.207131 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:43.207082 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.52:8643/healthz\": dial tcp 10.134.0.52:8643: connect: connection refused" Apr 28 20:16:43.869729 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:43.869703 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:16:43.975958 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:43.975873 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-proxy-tls\") pod \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " Apr 28 20:16:43.975958 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:43.975912 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " Apr 28 20:16:43.976197 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:43.975970 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-cabundle-cert\") pod \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " Apr 28 20:16:43.976197 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:43.975986 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wdcq\" (UniqueName: \"kubernetes.io/projected/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-kube-api-access-4wdcq\") pod \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " Apr 28 20:16:43.976197 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:43.976029 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-kserve-provision-location\") pod 
\"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\" (UID: \"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b\") " Apr 28 20:16:43.976400 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:43.976376 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" (UID: "f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:16:43.976450 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:43.976392 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config") pod "f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" (UID: "f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:16:43.976450 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:43.976434 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" (UID: "f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:16:43.978143 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:43.978124 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" (UID: "f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:16:43.978211 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:43.978136 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-kube-api-access-4wdcq" (OuterVolumeSpecName: "kube-api-access-4wdcq") pod "f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" (UID: "f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b"). InnerVolumeSpecName "kube-api-access-4wdcq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:16:44.077010 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.076969 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:16:44.077010 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.077003 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:16:44.077010 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.077015 2571 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-cabundle-cert\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:16:44.077246 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.077025 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4wdcq\" (UniqueName: \"kubernetes.io/projected/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-kube-api-access-4wdcq\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:16:44.077246 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.077037 2571 
reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:16:44.429940 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.429902 2571 generic.go:358] "Generic (PLEG): container finished" podID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerID="5a325988850e4626cef68a4d2230cf13d94a0e84ac0a668d2921b56b65e16d85" exitCode=0 Apr 28 20:16:44.430404 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.429953 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" event={"ID":"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b","Type":"ContainerDied","Data":"5a325988850e4626cef68a4d2230cf13d94a0e84ac0a668d2921b56b65e16d85"} Apr 28 20:16:44.430404 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.429984 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" Apr 28 20:16:44.430404 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.429997 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4" event={"ID":"f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b","Type":"ContainerDied","Data":"0589e1a5a639e263e37f8fa372c385f4476fd064e443dc8b5c76a5fa03095667"} Apr 28 20:16:44.430404 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.430018 2571 scope.go:117] "RemoveContainer" containerID="899be02d4e2fbd4e4e9e17a6b5371389409b27c3aac0cf4e77e6ffb5e3b209f8" Apr 28 20:16:44.438716 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.438696 2571 scope.go:117] "RemoveContainer" containerID="5a325988850e4626cef68a4d2230cf13d94a0e84ac0a668d2921b56b65e16d85" Apr 28 20:16:44.445662 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.445642 2571 scope.go:117] "RemoveContainer" containerID="d3242fe961cd1b98c5b69213f522e8288b86f33d272d59903744ad0ab77d1553" Apr 28 20:16:44.450725 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.450697 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4"] Apr 28 20:16:44.453119 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.453102 2571 scope.go:117] "RemoveContainer" containerID="899be02d4e2fbd4e4e9e17a6b5371389409b27c3aac0cf4e77e6ffb5e3b209f8" Apr 28 20:16:44.453383 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:16:44.453363 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"899be02d4e2fbd4e4e9e17a6b5371389409b27c3aac0cf4e77e6ffb5e3b209f8\": container with ID starting with 899be02d4e2fbd4e4e9e17a6b5371389409b27c3aac0cf4e77e6ffb5e3b209f8 not found: ID does not exist" containerID="899be02d4e2fbd4e4e9e17a6b5371389409b27c3aac0cf4e77e6ffb5e3b209f8" Apr 28 20:16:44.453429 
ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.453393 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"899be02d4e2fbd4e4e9e17a6b5371389409b27c3aac0cf4e77e6ffb5e3b209f8"} err="failed to get container status \"899be02d4e2fbd4e4e9e17a6b5371389409b27c3aac0cf4e77e6ffb5e3b209f8\": rpc error: code = NotFound desc = could not find container \"899be02d4e2fbd4e4e9e17a6b5371389409b27c3aac0cf4e77e6ffb5e3b209f8\": container with ID starting with 899be02d4e2fbd4e4e9e17a6b5371389409b27c3aac0cf4e77e6ffb5e3b209f8 not found: ID does not exist" Apr 28 20:16:44.453429 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.453411 2571 scope.go:117] "RemoveContainer" containerID="5a325988850e4626cef68a4d2230cf13d94a0e84ac0a668d2921b56b65e16d85" Apr 28 20:16:44.453698 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:16:44.453679 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a325988850e4626cef68a4d2230cf13d94a0e84ac0a668d2921b56b65e16d85\": container with ID starting with 5a325988850e4626cef68a4d2230cf13d94a0e84ac0a668d2921b56b65e16d85 not found: ID does not exist" containerID="5a325988850e4626cef68a4d2230cf13d94a0e84ac0a668d2921b56b65e16d85" Apr 28 20:16:44.453758 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.453705 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a325988850e4626cef68a4d2230cf13d94a0e84ac0a668d2921b56b65e16d85"} err="failed to get container status \"5a325988850e4626cef68a4d2230cf13d94a0e84ac0a668d2921b56b65e16d85\": rpc error: code = NotFound desc = could not find container \"5a325988850e4626cef68a4d2230cf13d94a0e84ac0a668d2921b56b65e16d85\": container with ID starting with 5a325988850e4626cef68a4d2230cf13d94a0e84ac0a668d2921b56b65e16d85 not found: ID does not exist" Apr 28 20:16:44.453758 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.453721 2571 scope.go:117] 
"RemoveContainer" containerID="d3242fe961cd1b98c5b69213f522e8288b86f33d272d59903744ad0ab77d1553" Apr 28 20:16:44.453969 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:16:44.453936 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3242fe961cd1b98c5b69213f522e8288b86f33d272d59903744ad0ab77d1553\": container with ID starting with d3242fe961cd1b98c5b69213f522e8288b86f33d272d59903744ad0ab77d1553 not found: ID does not exist" containerID="d3242fe961cd1b98c5b69213f522e8288b86f33d272d59903744ad0ab77d1553" Apr 28 20:16:44.454082 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.453965 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3242fe961cd1b98c5b69213f522e8288b86f33d272d59903744ad0ab77d1553"} err="failed to get container status \"d3242fe961cd1b98c5b69213f522e8288b86f33d272d59903744ad0ab77d1553\": rpc error: code = NotFound desc = could not find container \"d3242fe961cd1b98c5b69213f522e8288b86f33d272d59903744ad0ab77d1553\": container with ID starting with d3242fe961cd1b98c5b69213f522e8288b86f33d272d59903744ad0ab77d1553 not found: ID does not exist" Apr 28 20:16:44.456147 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:44.456126 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-764bc9bb86-5qmr4"] Apr 28 20:16:45.434241 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:45.434210 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll_0cec68ac-098a-4035-973d-4ae9e7f46cce/storage-initializer/0.log" Apr 28 20:16:45.434689 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:45.434260 2571 generic.go:358] "Generic (PLEG): container finished" podID="0cec68ac-098a-4035-973d-4ae9e7f46cce" containerID="e52f1e0a7f1d3fa4b42f52b5f6fcb4ab20884c85ea1efb9547f69f0bbb2b7121" exitCode=1 Apr 28 20:16:45.434689 
ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:45.434334 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" event={"ID":"0cec68ac-098a-4035-973d-4ae9e7f46cce","Type":"ContainerDied","Data":"e52f1e0a7f1d3fa4b42f52b5f6fcb4ab20884c85ea1efb9547f69f0bbb2b7121"} Apr 28 20:16:45.946686 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:45.946649 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" path="/var/lib/kubelet/pods/f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b/volumes" Apr 28 20:16:46.439797 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:46.439769 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll_0cec68ac-098a-4035-973d-4ae9e7f46cce/storage-initializer/0.log" Apr 28 20:16:46.440241 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:46.439884 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" event={"ID":"0cec68ac-098a-4035-973d-4ae9e7f46cce","Type":"ContainerStarted","Data":"23917576c5f172d6009f8a91ca96a4a7e49edd72a1e4f15f7a420d3abf96be9d"} Apr 28 20:16:49.451339 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:49.451307 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll_0cec68ac-098a-4035-973d-4ae9e7f46cce/storage-initializer/1.log" Apr 28 20:16:49.451760 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:49.451684 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll_0cec68ac-098a-4035-973d-4ae9e7f46cce/storage-initializer/0.log" Apr 28 20:16:49.451760 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:49.451718 2571 generic.go:358] "Generic (PLEG): container finished" 
podID="0cec68ac-098a-4035-973d-4ae9e7f46cce" containerID="23917576c5f172d6009f8a91ca96a4a7e49edd72a1e4f15f7a420d3abf96be9d" exitCode=1 Apr 28 20:16:49.451848 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:49.451798 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" event={"ID":"0cec68ac-098a-4035-973d-4ae9e7f46cce","Type":"ContainerDied","Data":"23917576c5f172d6009f8a91ca96a4a7e49edd72a1e4f15f7a420d3abf96be9d"} Apr 28 20:16:49.451886 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:49.451845 2571 scope.go:117] "RemoveContainer" containerID="e52f1e0a7f1d3fa4b42f52b5f6fcb4ab20884c85ea1efb9547f69f0bbb2b7121" Apr 28 20:16:49.452218 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:49.452203 2571 scope.go:117] "RemoveContainer" containerID="e52f1e0a7f1d3fa4b42f52b5f6fcb4ab20884c85ea1efb9547f69f0bbb2b7121" Apr 28 20:16:49.462450 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:16:49.462422 2571 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll_kserve-ci-e2e-test_0cec68ac-098a-4035-973d-4ae9e7f46cce_0 in pod sandbox f69c438681766078d65d0e851662929a10b803146529701733d4ae6cf9b8d56c from index: no such id: 'e52f1e0a7f1d3fa4b42f52b5f6fcb4ab20884c85ea1efb9547f69f0bbb2b7121'" containerID="e52f1e0a7f1d3fa4b42f52b5f6fcb4ab20884c85ea1efb9547f69f0bbb2b7121" Apr 28 20:16:49.462537 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:16:49.462469 2571 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll_kserve-ci-e2e-test_0cec68ac-098a-4035-973d-4ae9e7f46cce_0 in pod sandbox f69c438681766078d65d0e851662929a10b803146529701733d4ae6cf9b8d56c from index: no 
such id: 'e52f1e0a7f1d3fa4b42f52b5f6fcb4ab20884c85ea1efb9547f69f0bbb2b7121'; Skipping pod \"isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll_kserve-ci-e2e-test(0cec68ac-098a-4035-973d-4ae9e7f46cce)\"" logger="UnhandledError" Apr 28 20:16:49.463842 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:16:49.463822 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll_kserve-ci-e2e-test(0cec68ac-098a-4035-973d-4ae9e7f46cce)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" podUID="0cec68ac-098a-4035-973d-4ae9e7f46cce" Apr 28 20:16:50.068544 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:50.068511 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll"] Apr 28 20:16:50.456731 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:50.456700 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll_0cec68ac-098a-4035-973d-4ae9e7f46cce/storage-initializer/1.log" Apr 28 20:16:50.581642 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:50.581616 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll_0cec68ac-098a-4035-973d-4ae9e7f46cce/storage-initializer/1.log" Apr 28 20:16:50.581793 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:50.581683 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" Apr 28 20:16:50.732643 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:50.732536 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cec68ac-098a-4035-973d-4ae9e7f46cce-kserve-provision-location\") pod \"0cec68ac-098a-4035-973d-4ae9e7f46cce\" (UID: \"0cec68ac-098a-4035-973d-4ae9e7f46cce\") " Apr 28 20:16:50.732643 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:50.732610 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxkl5\" (UniqueName: \"kubernetes.io/projected/0cec68ac-098a-4035-973d-4ae9e7f46cce-kube-api-access-zxkl5\") pod \"0cec68ac-098a-4035-973d-4ae9e7f46cce\" (UID: \"0cec68ac-098a-4035-973d-4ae9e7f46cce\") " Apr 28 20:16:50.732643 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:50.732632 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0cec68ac-098a-4035-973d-4ae9e7f46cce-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"0cec68ac-098a-4035-973d-4ae9e7f46cce\" (UID: \"0cec68ac-098a-4035-973d-4ae9e7f46cce\") " Apr 28 20:16:50.732960 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:50.732657 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0cec68ac-098a-4035-973d-4ae9e7f46cce-proxy-tls\") pod \"0cec68ac-098a-4035-973d-4ae9e7f46cce\" (UID: \"0cec68ac-098a-4035-973d-4ae9e7f46cce\") " Apr 28 20:16:50.732960 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:50.732841 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cec68ac-098a-4035-973d-4ae9e7f46cce-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "0cec68ac-098a-4035-973d-4ae9e7f46cce" (UID: "0cec68ac-098a-4035-973d-4ae9e7f46cce"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:16:50.733062 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:50.733036 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cec68ac-098a-4035-973d-4ae9e7f46cce-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config") pod "0cec68ac-098a-4035-973d-4ae9e7f46cce" (UID: "0cec68ac-098a-4035-973d-4ae9e7f46cce"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:16:50.734897 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:50.734877 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cec68ac-098a-4035-973d-4ae9e7f46cce-kube-api-access-zxkl5" (OuterVolumeSpecName: "kube-api-access-zxkl5") pod "0cec68ac-098a-4035-973d-4ae9e7f46cce" (UID: "0cec68ac-098a-4035-973d-4ae9e7f46cce"). InnerVolumeSpecName "kube-api-access-zxkl5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:16:50.735000 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:50.734929 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cec68ac-098a-4035-973d-4ae9e7f46cce-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0cec68ac-098a-4035-973d-4ae9e7f46cce" (UID: "0cec68ac-098a-4035-973d-4ae9e7f46cce"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:16:50.833898 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:50.833861 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zxkl5\" (UniqueName: \"kubernetes.io/projected/0cec68ac-098a-4035-973d-4ae9e7f46cce-kube-api-access-zxkl5\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:16:50.833898 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:50.833891 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0cec68ac-098a-4035-973d-4ae9e7f46cce-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:16:50.833898 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:50.833904 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0cec68ac-098a-4035-973d-4ae9e7f46cce-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:16:50.834151 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:50.833914 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cec68ac-098a-4035-973d-4ae9e7f46cce-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:16:51.020087 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.019994 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765"] Apr 28 20:16:51.020322 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.020310 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="storage-initializer" Apr 28 20:16:51.020383 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.020323 2571 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="storage-initializer" Apr 28 20:16:51.020383 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.020333 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cec68ac-098a-4035-973d-4ae9e7f46cce" containerName="storage-initializer" Apr 28 20:16:51.020383 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.020338 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cec68ac-098a-4035-973d-4ae9e7f46cce" containerName="storage-initializer" Apr 28 20:16:51.020383 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.020346 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="kube-rbac-proxy" Apr 28 20:16:51.020383 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.020351 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="kube-rbac-proxy" Apr 28 20:16:51.020383 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.020360 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="kserve-container" Apr 28 20:16:51.020383 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.020365 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="kserve-container" Apr 28 20:16:51.020383 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.020380 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cec68ac-098a-4035-973d-4ae9e7f46cce" containerName="storage-initializer" Apr 28 20:16:51.020383 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.020385 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cec68ac-098a-4035-973d-4ae9e7f46cce" containerName="storage-initializer" Apr 28 20:16:51.020710 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.020444 2571 
memory_manager.go:356] "RemoveStaleState removing state" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="kserve-container" Apr 28 20:16:51.020710 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.020456 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0cec68ac-098a-4035-973d-4ae9e7f46cce" containerName="storage-initializer" Apr 28 20:16:51.020710 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.020463 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0cec68ac-098a-4035-973d-4ae9e7f46cce" containerName="storage-initializer" Apr 28 20:16:51.020710 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.020469 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5decf8d-6dcd-4c7c-9a7e-70651aa9dd5b" containerName="kube-rbac-proxy" Apr 28 20:16:51.024863 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.024844 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.026966 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.026938 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert\"" Apr 28 20:16:51.026966 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.026957 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 28 20:16:51.027162 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.026998 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\"" Apr 28 20:16:51.035958 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.035933 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765"] Apr 28 
20:16:51.136412 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.136369 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38d97f1d-324f-407b-a782-bd77ae9c5d35-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.136621 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.136422 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38d97f1d-324f-407b-a782-bd77ae9c5d35-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.136621 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.136551 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/38d97f1d-324f-407b-a782-bd77ae9c5d35-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.136621 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.136588 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jtnl\" (UniqueName: \"kubernetes.io/projected/38d97f1d-324f-407b-a782-bd77ae9c5d35-kube-api-access-8jtnl\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765\" (UID: 
\"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.136734 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.136634 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38d97f1d-324f-407b-a782-bd77ae9c5d35-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.237311 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.237267 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/38d97f1d-324f-407b-a782-bd77ae9c5d35-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.237311 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.237315 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jtnl\" (UniqueName: \"kubernetes.io/projected/38d97f1d-324f-407b-a782-bd77ae9c5d35-kube-api-access-8jtnl\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.237571 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.237358 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38d97f1d-324f-407b-a782-bd77ae9c5d35-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.237571 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.237439 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38d97f1d-324f-407b-a782-bd77ae9c5d35-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.237571 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.237471 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38d97f1d-324f-407b-a782-bd77ae9c5d35-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.237741 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:16:51.237660 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert: secret "isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert" not found Apr 28 20:16:51.237798 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:16:51.237760 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d97f1d-324f-407b-a782-bd77ae9c5d35-proxy-tls podName:38d97f1d-324f-407b-a782-bd77ae9c5d35 nodeName:}" failed. No retries permitted until 2026-04-28 20:16:51.737738759 +0000 UTC m=+3628.368639684 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/38d97f1d-324f-407b-a782-bd77ae9c5d35-proxy-tls") pod "isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" (UID: "38d97f1d-324f-407b-a782-bd77ae9c5d35") : secret "isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert" not found Apr 28 20:16:51.237867 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.237817 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38d97f1d-324f-407b-a782-bd77ae9c5d35-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.238073 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.238053 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/38d97f1d-324f-407b-a782-bd77ae9c5d35-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.238200 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.238180 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38d97f1d-324f-407b-a782-bd77ae9c5d35-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.246388 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.246361 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8jtnl\" (UniqueName: \"kubernetes.io/projected/38d97f1d-324f-407b-a782-bd77ae9c5d35-kube-api-access-8jtnl\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.460772 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.460745 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll_0cec68ac-098a-4035-973d-4ae9e7f46cce/storage-initializer/1.log" Apr 28 20:16:51.461163 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.460863 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" Apr 28 20:16:51.461163 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.460884 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll" event={"ID":"0cec68ac-098a-4035-973d-4ae9e7f46cce","Type":"ContainerDied","Data":"f69c438681766078d65d0e851662929a10b803146529701733d4ae6cf9b8d56c"} Apr 28 20:16:51.461163 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.460933 2571 scope.go:117] "RemoveContainer" containerID="23917576c5f172d6009f8a91ca96a4a7e49edd72a1e4f15f7a420d3abf96be9d" Apr 28 20:16:51.521144 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.521115 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll"] Apr 28 20:16:51.524282 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.524258 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8cfdd8b8d-nx7ll"] Apr 28 20:16:51.741132 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.741033 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38d97f1d-324f-407b-a782-bd77ae9c5d35-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.743654 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.743627 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38d97f1d-324f-407b-a782-bd77ae9c5d35-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.935651 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.935603 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:51.947412 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:51.947370 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cec68ac-098a-4035-973d-4ae9e7f46cce" path="/var/lib/kubelet/pods/0cec68ac-098a-4035-973d-4ae9e7f46cce/volumes" Apr 28 20:16:52.065819 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:52.063050 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765"] Apr 28 20:16:52.466024 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:52.465986 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" event={"ID":"38d97f1d-324f-407b-a782-bd77ae9c5d35","Type":"ContainerStarted","Data":"01b7efc723f48d01419ffffb96d3ea1dff0429d049f7f133944596fa4459063c"} Apr 28 20:16:52.466024 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:52.466030 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" event={"ID":"38d97f1d-324f-407b-a782-bd77ae9c5d35","Type":"ContainerStarted","Data":"44e5d5d8e76d97ecd8ce9e484cd05be12af39742ab3e56fbfe46c85c9bc3b42c"} Apr 28 20:16:53.471035 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:53.470998 2571 generic.go:358] "Generic (PLEG): container finished" podID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerID="01b7efc723f48d01419ffffb96d3ea1dff0429d049f7f133944596fa4459063c" exitCode=0 Apr 28 20:16:53.471420 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:53.471087 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" event={"ID":"38d97f1d-324f-407b-a782-bd77ae9c5d35","Type":"ContainerDied","Data":"01b7efc723f48d01419ffffb96d3ea1dff0429d049f7f133944596fa4459063c"} Apr 28 20:16:54.475650 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:54.475613 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" event={"ID":"38d97f1d-324f-407b-a782-bd77ae9c5d35","Type":"ContainerStarted","Data":"3074a215e5653d66cbf04a60a1344c86ef3d6bad51b771980d9ffa462a5e2968"} Apr 28 20:16:54.476142 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:54.475655 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" event={"ID":"38d97f1d-324f-407b-a782-bd77ae9c5d35","Type":"ContainerStarted","Data":"f474f3eb97248be8f4aede9502d19f8b8f44111b12ca037779545759ceaca4bb"} Apr 28 20:16:54.476142 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:54.475803 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:54.494446 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:54.494386 
2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" podStartSLOduration=3.494370033 podStartE2EDuration="3.494370033s" podCreationTimestamp="2026-04-28 20:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:16:54.492424113 +0000 UTC m=+3631.123325057" watchObservedRunningTime="2026-04-28 20:16:54.494370033 +0000 UTC m=+3631.125270977" Apr 28 20:16:55.479140 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:55.479109 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:16:55.480221 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:55.480195 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 28 20:16:56.487098 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:16:56.487044 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 28 20:17:01.488023 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:17:01.487996 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:17:01.488516 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:17:01.488467 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 28 20:17:11.488453 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:17:11.488403 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 28 20:17:21.488864 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:17:21.488825 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 28 20:17:31.488979 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:17:31.488937 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 28 20:17:41.488826 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:17:41.488745 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 28 20:17:51.489437 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:17:51.489317 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 28 20:18:01.489611 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:01.489579 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:18:11.126052 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:11.126014 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765"] Apr 28 20:18:11.126568 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:11.126333 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="kserve-container" containerID="cri-o://f474f3eb97248be8f4aede9502d19f8b8f44111b12ca037779545759ceaca4bb" gracePeriod=30 Apr 28 20:18:11.126568 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:11.126367 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="kube-rbac-proxy" containerID="cri-o://3074a215e5653d66cbf04a60a1344c86ef3d6bad51b771980d9ffa462a5e2968" gracePeriod=30 Apr 28 20:18:11.485075 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:11.485033 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.54:8643/healthz\": dial tcp 10.134.0.54:8643: connect: connection refused" Apr 28 20:18:11.489015 
ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:11.488980 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 28 20:18:11.711783 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:11.711747 2571 generic.go:358] "Generic (PLEG): container finished" podID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerID="3074a215e5653d66cbf04a60a1344c86ef3d6bad51b771980d9ffa462a5e2968" exitCode=2 Apr 28 20:18:11.711962 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:11.711822 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" event={"ID":"38d97f1d-324f-407b-a782-bd77ae9c5d35","Type":"ContainerDied","Data":"3074a215e5653d66cbf04a60a1344c86ef3d6bad51b771980d9ffa462a5e2968"} Apr 28 20:18:12.108676 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.108642 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc"] Apr 28 20:18:12.112138 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.112118 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" Apr 28 20:18:12.114058 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.114034 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert\"" Apr 28 20:18:12.114169 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.114064 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\"" Apr 28 20:18:12.122202 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.122181 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc"] Apr 28 20:18:12.199930 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.199888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5a615da-1d37-4fb3-a6ee-fc66321e9008-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc\" (UID: \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" Apr 28 20:18:12.200305 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.199929 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5a615da-1d37-4fb3-a6ee-fc66321e9008-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc\" (UID: \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" Apr 28 20:18:12.200305 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.199987 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-ss5sp\" (UniqueName: \"kubernetes.io/projected/e5a615da-1d37-4fb3-a6ee-fc66321e9008-kube-api-access-ss5sp\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc\" (UID: \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" Apr 28 20:18:12.200305 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.200011 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e5a615da-1d37-4fb3-a6ee-fc66321e9008-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc\" (UID: \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" Apr 28 20:18:12.300832 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.300791 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5a615da-1d37-4fb3-a6ee-fc66321e9008-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc\" (UID: \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" Apr 28 20:18:12.300832 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.300834 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5a615da-1d37-4fb3-a6ee-fc66321e9008-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc\" (UID: \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" Apr 28 20:18:12.301103 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.300865 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ss5sp\" (UniqueName: \"kubernetes.io/projected/e5a615da-1d37-4fb3-a6ee-fc66321e9008-kube-api-access-ss5sp\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc\" (UID: \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" Apr 28 20:18:12.301103 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.300891 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e5a615da-1d37-4fb3-a6ee-fc66321e9008-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc\" (UID: \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" Apr 28 20:18:12.301103 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:18:12.300958 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert" not found Apr 28 20:18:12.301103 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:18:12.301037 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5a615da-1d37-4fb3-a6ee-fc66321e9008-proxy-tls podName:e5a615da-1d37-4fb3-a6ee-fc66321e9008 nodeName:}" failed. No retries permitted until 2026-04-28 20:18:12.801015881 +0000 UTC m=+3709.431916808 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e5a615da-1d37-4fb3-a6ee-fc66321e9008-proxy-tls") pod "isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" (UID: "e5a615da-1d37-4fb3-a6ee-fc66321e9008") : secret "isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert" not found Apr 28 20:18:12.301338 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.301217 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5a615da-1d37-4fb3-a6ee-fc66321e9008-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc\" (UID: \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" Apr 28 20:18:12.301635 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.301612 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e5a615da-1d37-4fb3-a6ee-fc66321e9008-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc\" (UID: \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" Apr 28 20:18:12.309224 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.309202 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss5sp\" (UniqueName: \"kubernetes.io/projected/e5a615da-1d37-4fb3-a6ee-fc66321e9008-kube-api-access-ss5sp\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc\" (UID: \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" Apr 28 20:18:12.805632 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.805591 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5a615da-1d37-4fb3-a6ee-fc66321e9008-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc\" (UID: \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" Apr 28 20:18:12.808000 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:12.807975 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5a615da-1d37-4fb3-a6ee-fc66321e9008-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc\" (UID: \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" Apr 28 20:18:13.022857 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:13.022818 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" Apr 28 20:18:13.147670 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:13.147637 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc"] Apr 28 20:18:13.150921 ip-10-0-139-128 kubenswrapper[2571]: W0428 20:18:13.150892 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5a615da_1d37_4fb3_a6ee_fc66321e9008.slice/crio-895e123bd6ffca51c755ccb5d1a8449d4ad7602266821c2f9bc0dd28df0a83d2 WatchSource:0}: Error finding container 895e123bd6ffca51c755ccb5d1a8449d4ad7602266821c2f9bc0dd28df0a83d2: Status 404 returned error can't find the container with id 895e123bd6ffca51c755ccb5d1a8449d4ad7602266821c2f9bc0dd28df0a83d2 Apr 28 20:18:13.719892 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:13.719850 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" event={"ID":"e5a615da-1d37-4fb3-a6ee-fc66321e9008","Type":"ContainerStarted","Data":"8585da41b03c6d6f47540c6c852e98408c681d1a97d3778c82412c26e2e3942f"} Apr 28 20:18:13.719892 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:13.719893 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" event={"ID":"e5a615da-1d37-4fb3-a6ee-fc66321e9008","Type":"ContainerStarted","Data":"895e123bd6ffca51c755ccb5d1a8449d4ad7602266821c2f9bc0dd28df0a83d2"} Apr 28 20:18:15.729224 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:15.729191 2571 generic.go:358] "Generic (PLEG): container finished" podID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerID="f474f3eb97248be8f4aede9502d19f8b8f44111b12ca037779545759ceaca4bb" exitCode=0 Apr 28 20:18:15.729600 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:15.729236 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" event={"ID":"38d97f1d-324f-407b-a782-bd77ae9c5d35","Type":"ContainerDied","Data":"f474f3eb97248be8f4aede9502d19f8b8f44111b12ca037779545759ceaca4bb"} Apr 28 20:18:15.770414 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:15.770390 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:18:15.832979 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:15.832952 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38d97f1d-324f-407b-a782-bd77ae9c5d35-kserve-provision-location\") pod \"38d97f1d-324f-407b-a782-bd77ae9c5d35\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " Apr 28 20:18:15.833135 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:15.833000 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38d97f1d-324f-407b-a782-bd77ae9c5d35-proxy-tls\") pod \"38d97f1d-324f-407b-a782-bd77ae9c5d35\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " Apr 28 20:18:15.833135 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:15.833034 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/38d97f1d-324f-407b-a782-bd77ae9c5d35-cabundle-cert\") pod \"38d97f1d-324f-407b-a782-bd77ae9c5d35\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " Apr 28 20:18:15.833135 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:15.833067 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38d97f1d-324f-407b-a782-bd77ae9c5d35-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"38d97f1d-324f-407b-a782-bd77ae9c5d35\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " Apr 28 20:18:15.833135 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:15.833088 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jtnl\" (UniqueName: \"kubernetes.io/projected/38d97f1d-324f-407b-a782-bd77ae9c5d35-kube-api-access-8jtnl\") pod 
\"38d97f1d-324f-407b-a782-bd77ae9c5d35\" (UID: \"38d97f1d-324f-407b-a782-bd77ae9c5d35\") " Apr 28 20:18:15.833368 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:15.833341 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d97f1d-324f-407b-a782-bd77ae9c5d35-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "38d97f1d-324f-407b-a782-bd77ae9c5d35" (UID: "38d97f1d-324f-407b-a782-bd77ae9c5d35"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:18:15.833472 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:15.833444 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38d97f1d-324f-407b-a782-bd77ae9c5d35-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "38d97f1d-324f-407b-a782-bd77ae9c5d35" (UID: "38d97f1d-324f-407b-a782-bd77ae9c5d35"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:18:15.833603 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:15.833509 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38d97f1d-324f-407b-a782-bd77ae9c5d35-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config") pod "38d97f1d-324f-407b-a782-bd77ae9c5d35" (UID: "38d97f1d-324f-407b-a782-bd77ae9c5d35"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:18:15.835267 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:15.835246 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38d97f1d-324f-407b-a782-bd77ae9c5d35-kube-api-access-8jtnl" (OuterVolumeSpecName: "kube-api-access-8jtnl") pod "38d97f1d-324f-407b-a782-bd77ae9c5d35" (UID: "38d97f1d-324f-407b-a782-bd77ae9c5d35"). InnerVolumeSpecName "kube-api-access-8jtnl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:18:15.835267 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:15.835256 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d97f1d-324f-407b-a782-bd77ae9c5d35-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "38d97f1d-324f-407b-a782-bd77ae9c5d35" (UID: "38d97f1d-324f-407b-a782-bd77ae9c5d35"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:18:15.934576 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:15.934532 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38d97f1d-324f-407b-a782-bd77ae9c5d35-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:18:15.934576 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:15.934567 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38d97f1d-324f-407b-a782-bd77ae9c5d35-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:18:15.934576 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:15.934578 2571 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/38d97f1d-324f-407b-a782-bd77ae9c5d35-cabundle-cert\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:18:15.934817 ip-10-0-139-128 kubenswrapper[2571]: I0428 
20:18:15.934588 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38d97f1d-324f-407b-a782-bd77ae9c5d35-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:18:15.934817 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:15.934599 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8jtnl\" (UniqueName: \"kubernetes.io/projected/38d97f1d-324f-407b-a782-bd77ae9c5d35-kube-api-access-8jtnl\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:18:16.733182 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:16.733153 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc_e5a615da-1d37-4fb3-a6ee-fc66321e9008/storage-initializer/0.log" Apr 28 20:18:16.733635 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:16.733190 2571 generic.go:358] "Generic (PLEG): container finished" podID="e5a615da-1d37-4fb3-a6ee-fc66321e9008" containerID="8585da41b03c6d6f47540c6c852e98408c681d1a97d3778c82412c26e2e3942f" exitCode=1 Apr 28 20:18:16.733635 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:16.733254 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" event={"ID":"e5a615da-1d37-4fb3-a6ee-fc66321e9008","Type":"ContainerDied","Data":"8585da41b03c6d6f47540c6c852e98408c681d1a97d3778c82412c26e2e3942f"} Apr 28 20:18:16.735088 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:16.735065 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" event={"ID":"38d97f1d-324f-407b-a782-bd77ae9c5d35","Type":"ContainerDied","Data":"44e5d5d8e76d97ecd8ce9e484cd05be12af39742ab3e56fbfe46c85c9bc3b42c"} Apr 28 20:18:16.735172 ip-10-0-139-128 
kubenswrapper[2571]: I0428 20:18:16.735104 2571 scope.go:117] "RemoveContainer" containerID="3074a215e5653d66cbf04a60a1344c86ef3d6bad51b771980d9ffa462a5e2968" Apr 28 20:18:16.735172 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:16.735125 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765" Apr 28 20:18:16.746592 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:16.746568 2571 scope.go:117] "RemoveContainer" containerID="f474f3eb97248be8f4aede9502d19f8b8f44111b12ca037779545759ceaca4bb" Apr 28 20:18:16.758717 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:16.758696 2571 scope.go:117] "RemoveContainer" containerID="01b7efc723f48d01419ffffb96d3ea1dff0429d049f7f133944596fa4459063c" Apr 28 20:18:16.761136 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:16.761117 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765"] Apr 28 20:18:16.764246 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:16.764225 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-848ff9798-tm765"] Apr 28 20:18:17.739662 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:17.739634 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc_e5a615da-1d37-4fb3-a6ee-fc66321e9008/storage-initializer/0.log" Apr 28 20:18:17.740067 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:17.739723 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" event={"ID":"e5a615da-1d37-4fb3-a6ee-fc66321e9008","Type":"ContainerStarted","Data":"013e646eadfe2fed6fd9696dd771db31ba073b1c0891b06993212602246cf3ea"} Apr 28 20:18:17.947845 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:17.947814 2571 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" path="/var/lib/kubelet/pods/38d97f1d-324f-407b-a782-bd77ae9c5d35/volumes" Apr 28 20:18:22.099013 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.098985 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc"] Apr 28 20:18:22.099344 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.099233 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" podUID="e5a615da-1d37-4fb3-a6ee-fc66321e9008" containerName="storage-initializer" containerID="cri-o://013e646eadfe2fed6fd9696dd771db31ba073b1c0891b06993212602246cf3ea" gracePeriod=30 Apr 28 20:18:22.220993 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.220967 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc_e5a615da-1d37-4fb3-a6ee-fc66321e9008/storage-initializer/1.log" Apr 28 20:18:22.221297 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.221281 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc_e5a615da-1d37-4fb3-a6ee-fc66321e9008/storage-initializer/0.log" Apr 28 20:18:22.221388 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.221358 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" Apr 28 20:18:22.285141 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.285113 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5a615da-1d37-4fb3-a6ee-fc66321e9008-proxy-tls\") pod \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\" (UID: \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\") " Apr 28 20:18:22.285322 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.285158 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss5sp\" (UniqueName: \"kubernetes.io/projected/e5a615da-1d37-4fb3-a6ee-fc66321e9008-kube-api-access-ss5sp\") pod \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\" (UID: \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\") " Apr 28 20:18:22.285322 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.285218 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e5a615da-1d37-4fb3-a6ee-fc66321e9008-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\" (UID: \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\") " Apr 28 20:18:22.285322 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.285242 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5a615da-1d37-4fb3-a6ee-fc66321e9008-kserve-provision-location\") pod \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\" (UID: \"e5a615da-1d37-4fb3-a6ee-fc66321e9008\") " Apr 28 20:18:22.285579 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.285551 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a615da-1d37-4fb3-a6ee-fc66321e9008-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "e5a615da-1d37-4fb3-a6ee-fc66321e9008" (UID: "e5a615da-1d37-4fb3-a6ee-fc66321e9008"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:18:22.285647 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.285608 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a615da-1d37-4fb3-a6ee-fc66321e9008-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config") pod "e5a615da-1d37-4fb3-a6ee-fc66321e9008" (UID: "e5a615da-1d37-4fb3-a6ee-fc66321e9008"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:18:22.287298 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.287226 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a615da-1d37-4fb3-a6ee-fc66321e9008-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e5a615da-1d37-4fb3-a6ee-fc66321e9008" (UID: "e5a615da-1d37-4fb3-a6ee-fc66321e9008"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:18:22.287403 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.287322 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a615da-1d37-4fb3-a6ee-fc66321e9008-kube-api-access-ss5sp" (OuterVolumeSpecName: "kube-api-access-ss5sp") pod "e5a615da-1d37-4fb3-a6ee-fc66321e9008" (UID: "e5a615da-1d37-4fb3-a6ee-fc66321e9008"). InnerVolumeSpecName "kube-api-access-ss5sp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:18:22.386132 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.386098 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e5a615da-1d37-4fb3-a6ee-fc66321e9008-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:18:22.386132 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.386127 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5a615da-1d37-4fb3-a6ee-fc66321e9008-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:18:22.386309 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.386140 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5a615da-1d37-4fb3-a6ee-fc66321e9008-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:18:22.386309 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.386154 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ss5sp\" (UniqueName: \"kubernetes.io/projected/e5a615da-1d37-4fb3-a6ee-fc66321e9008-kube-api-access-ss5sp\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:18:22.756202 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.756171 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc_e5a615da-1d37-4fb3-a6ee-fc66321e9008/storage-initializer/1.log" Apr 28 20:18:22.756563 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.756545 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc_e5a615da-1d37-4fb3-a6ee-fc66321e9008/storage-initializer/0.log" Apr 28 
20:18:22.756635 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.756581 2571 generic.go:358] "Generic (PLEG): container finished" podID="e5a615da-1d37-4fb3-a6ee-fc66321e9008" containerID="013e646eadfe2fed6fd9696dd771db31ba073b1c0891b06993212602246cf3ea" exitCode=1 Apr 28 20:18:22.756635 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.756608 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" event={"ID":"e5a615da-1d37-4fb3-a6ee-fc66321e9008","Type":"ContainerDied","Data":"013e646eadfe2fed6fd9696dd771db31ba073b1c0891b06993212602246cf3ea"} Apr 28 20:18:22.756635 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.756630 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" event={"ID":"e5a615da-1d37-4fb3-a6ee-fc66321e9008","Type":"ContainerDied","Data":"895e123bd6ffca51c755ccb5d1a8449d4ad7602266821c2f9bc0dd28df0a83d2"} Apr 28 20:18:22.756737 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.756646 2571 scope.go:117] "RemoveContainer" containerID="013e646eadfe2fed6fd9696dd771db31ba073b1c0891b06993212602246cf3ea" Apr 28 20:18:22.756737 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.756678 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc" Apr 28 20:18:22.764797 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.764774 2571 scope.go:117] "RemoveContainer" containerID="8585da41b03c6d6f47540c6c852e98408c681d1a97d3778c82412c26e2e3942f" Apr 28 20:18:22.771639 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.771620 2571 scope.go:117] "RemoveContainer" containerID="013e646eadfe2fed6fd9696dd771db31ba073b1c0891b06993212602246cf3ea" Apr 28 20:18:22.771873 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:18:22.771854 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"013e646eadfe2fed6fd9696dd771db31ba073b1c0891b06993212602246cf3ea\": container with ID starting with 013e646eadfe2fed6fd9696dd771db31ba073b1c0891b06993212602246cf3ea not found: ID does not exist" containerID="013e646eadfe2fed6fd9696dd771db31ba073b1c0891b06993212602246cf3ea" Apr 28 20:18:22.771936 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.771882 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013e646eadfe2fed6fd9696dd771db31ba073b1c0891b06993212602246cf3ea"} err="failed to get container status \"013e646eadfe2fed6fd9696dd771db31ba073b1c0891b06993212602246cf3ea\": rpc error: code = NotFound desc = could not find container \"013e646eadfe2fed6fd9696dd771db31ba073b1c0891b06993212602246cf3ea\": container with ID starting with 013e646eadfe2fed6fd9696dd771db31ba073b1c0891b06993212602246cf3ea not found: ID does not exist" Apr 28 20:18:22.771936 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.771899 2571 scope.go:117] "RemoveContainer" containerID="8585da41b03c6d6f47540c6c852e98408c681d1a97d3778c82412c26e2e3942f" Apr 28 20:18:22.772112 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:18:22.772092 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"8585da41b03c6d6f47540c6c852e98408c681d1a97d3778c82412c26e2e3942f\": container with ID starting with 8585da41b03c6d6f47540c6c852e98408c681d1a97d3778c82412c26e2e3942f not found: ID does not exist" containerID="8585da41b03c6d6f47540c6c852e98408c681d1a97d3778c82412c26e2e3942f" Apr 28 20:18:22.772156 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.772119 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8585da41b03c6d6f47540c6c852e98408c681d1a97d3778c82412c26e2e3942f"} err="failed to get container status \"8585da41b03c6d6f47540c6c852e98408c681d1a97d3778c82412c26e2e3942f\": rpc error: code = NotFound desc = could not find container \"8585da41b03c6d6f47540c6c852e98408c681d1a97d3778c82412c26e2e3942f\": container with ID starting with 8585da41b03c6d6f47540c6c852e98408c681d1a97d3778c82412c26e2e3942f not found: ID does not exist" Apr 28 20:18:22.791111 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.791087 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc"] Apr 28 20:18:22.793703 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:22.793679 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-c7d959f56-zkxbc"] Apr 28 20:18:23.169226 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.169201 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x"] Apr 28 20:18:23.169630 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.169554 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="kube-rbac-proxy" Apr 28 20:18:23.169630 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.169568 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" 
containerName="kube-rbac-proxy" Apr 28 20:18:23.169630 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.169578 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5a615da-1d37-4fb3-a6ee-fc66321e9008" containerName="storage-initializer" Apr 28 20:18:23.169630 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.169584 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a615da-1d37-4fb3-a6ee-fc66321e9008" containerName="storage-initializer" Apr 28 20:18:23.169630 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.169597 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="storage-initializer" Apr 28 20:18:23.169630 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.169602 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="storage-initializer" Apr 28 20:18:23.169630 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.169608 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="kserve-container" Apr 28 20:18:23.169630 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.169613 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="kserve-container" Apr 28 20:18:23.169630 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.169620 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5a615da-1d37-4fb3-a6ee-fc66321e9008" containerName="storage-initializer" Apr 28 20:18:23.169630 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.169625 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a615da-1d37-4fb3-a6ee-fc66321e9008" containerName="storage-initializer" Apr 28 20:18:23.169951 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.169670 2571 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="e5a615da-1d37-4fb3-a6ee-fc66321e9008" containerName="storage-initializer" Apr 28 20:18:23.169951 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.169682 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="kube-rbac-proxy" Apr 28 20:18:23.169951 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.169690 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="38d97f1d-324f-407b-a782-bd77ae9c5d35" containerName="kserve-container" Apr 28 20:18:23.169951 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.169778 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5a615da-1d37-4fb3-a6ee-fc66321e9008" containerName="storage-initializer" Apr 28 20:18:23.174443 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.174426 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:23.176627 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.176605 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 28 20:18:23.176784 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.176768 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert\"" Apr 28 20:18:23.176866 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.176851 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 28 20:18:23.177066 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.177051 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 28 20:18:23.177121 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.177105 2571 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-44jch\"" Apr 28 20:18:23.177339 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.177325 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 28 20:18:23.177379 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.177353 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\"" Apr 28 20:18:23.184025 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.184004 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x"] Apr 28 20:18:23.293186 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.293158 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c7a5202-6c5d-4819-806d-245f9a244a0e-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:23.293374 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.293205 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c7a5202-6c5d-4819-806d-245f9a244a0e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:23.293374 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.293287 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbtxf\" 
(UniqueName: \"kubernetes.io/projected/9c7a5202-6c5d-4819-806d-245f9a244a0e-kube-api-access-hbtxf\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:23.293374 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.293325 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c7a5202-6c5d-4819-806d-245f9a244a0e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:23.293374 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.293357 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9c7a5202-6c5d-4819-806d-245f9a244a0e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:23.393864 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.393830 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c7a5202-6c5d-4819-806d-245f9a244a0e-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:23.394046 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.393872 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c7a5202-6c5d-4819-806d-245f9a244a0e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:23.394046 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.393915 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbtxf\" (UniqueName: \"kubernetes.io/projected/9c7a5202-6c5d-4819-806d-245f9a244a0e-kube-api-access-hbtxf\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:23.394046 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.393942 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c7a5202-6c5d-4819-806d-245f9a244a0e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:23.394046 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.393964 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9c7a5202-6c5d-4819-806d-245f9a244a0e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:23.394343 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.394323 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c7a5202-6c5d-4819-806d-245f9a244a0e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:23.394746 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.394724 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c7a5202-6c5d-4819-806d-245f9a244a0e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:23.394795 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.394724 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9c7a5202-6c5d-4819-806d-245f9a244a0e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:23.396273 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.396247 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c7a5202-6c5d-4819-806d-245f9a244a0e-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:23.401815 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.401788 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbtxf\" (UniqueName: \"kubernetes.io/projected/9c7a5202-6c5d-4819-806d-245f9a244a0e-kube-api-access-hbtxf\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:23.486357 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.486281 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:23.605843 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.605811 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x"] Apr 28 20:18:23.608807 ip-10-0-139-128 kubenswrapper[2571]: W0428 20:18:23.608774 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c7a5202_6c5d_4819_806d_245f9a244a0e.slice/crio-b7b117cfbecc8372f7e452348c196fbff2c32ef40eb22490c6e548036f2f80eb WatchSource:0}: Error finding container b7b117cfbecc8372f7e452348c196fbff2c32ef40eb22490c6e548036f2f80eb: Status 404 returned error can't find the container with id b7b117cfbecc8372f7e452348c196fbff2c32ef40eb22490c6e548036f2f80eb Apr 28 20:18:23.761441 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.761350 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" event={"ID":"9c7a5202-6c5d-4819-806d-245f9a244a0e","Type":"ContainerStarted","Data":"b4787fcbe2922aeb70978ae8ac77f39c0aa57aac324e9699b6bcefe411fccf81"} Apr 28 20:18:23.761441 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.761391 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" event={"ID":"9c7a5202-6c5d-4819-806d-245f9a244a0e","Type":"ContainerStarted","Data":"b7b117cfbecc8372f7e452348c196fbff2c32ef40eb22490c6e548036f2f80eb"} Apr 28 20:18:23.946978 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:23.946943 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a615da-1d37-4fb3-a6ee-fc66321e9008" path="/var/lib/kubelet/pods/e5a615da-1d37-4fb3-a6ee-fc66321e9008/volumes" Apr 28 20:18:24.765539 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:24.765506 2571 generic.go:358] "Generic (PLEG): container finished" podID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerID="b4787fcbe2922aeb70978ae8ac77f39c0aa57aac324e9699b6bcefe411fccf81" exitCode=0 Apr 28 20:18:24.765883 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:24.765563 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" event={"ID":"9c7a5202-6c5d-4819-806d-245f9a244a0e","Type":"ContainerDied","Data":"b4787fcbe2922aeb70978ae8ac77f39c0aa57aac324e9699b6bcefe411fccf81"} Apr 28 20:18:25.771441 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:25.771403 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" event={"ID":"9c7a5202-6c5d-4819-806d-245f9a244a0e","Type":"ContainerStarted","Data":"db9cf61e6a9676730c65f9ce1627743b12b2d65fc5dcd2c9bca4fada8c8c5ea9"} Apr 28 20:18:25.771441 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:25.771446 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" event={"ID":"9c7a5202-6c5d-4819-806d-245f9a244a0e","Type":"ContainerStarted","Data":"29c5bc94108797685257f6af00e3ba45f2012dad7332d5f89e7332250c37faca"} Apr 28 20:18:25.771977 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:25.771539 2571 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:25.790427 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:25.790374 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" podStartSLOduration=2.790359354 podStartE2EDuration="2.790359354s" podCreationTimestamp="2026-04-28 20:18:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:18:25.787997925 +0000 UTC m=+3722.418898869" watchObservedRunningTime="2026-04-28 20:18:25.790359354 +0000 UTC m=+3722.421260300" Apr 28 20:18:26.774176 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:26.774138 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:26.775315 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:26.775289 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 28 20:18:27.777277 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:27.777242 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 28 20:18:32.781791 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:32.781760 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:18:32.782349 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:32.782319 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 28 20:18:42.782540 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:42.782503 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 28 20:18:52.782887 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:18:52.782842 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 28 20:19:02.783329 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:02.783288 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 28 20:19:12.782377 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:12.782294 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.56:8080: connect: connection refused" Apr 28 20:19:22.782854 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:22.782812 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 28 20:19:32.782660 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:32.782626 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:19:33.337758 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:33.337723 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x"] Apr 28 20:19:33.338200 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:33.338145 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="kserve-container" containerID="cri-o://29c5bc94108797685257f6af00e3ba45f2012dad7332d5f89e7332250c37faca" gracePeriod=30 Apr 28 20:19:33.338294 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:33.338167 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="kube-rbac-proxy" containerID="cri-o://db9cf61e6a9676730c65f9ce1627743b12b2d65fc5dcd2c9bca4fada8c8c5ea9" gracePeriod=30 Apr 28 20:19:33.970029 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:33.969993 2571 generic.go:358] "Generic (PLEG): container finished" podID="9c7a5202-6c5d-4819-806d-245f9a244a0e" 
containerID="db9cf61e6a9676730c65f9ce1627743b12b2d65fc5dcd2c9bca4fada8c8c5ea9" exitCode=2 Apr 28 20:19:33.970399 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:33.970069 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" event={"ID":"9c7a5202-6c5d-4819-806d-245f9a244a0e","Type":"ContainerDied","Data":"db9cf61e6a9676730c65f9ce1627743b12b2d65fc5dcd2c9bca4fada8c8c5ea9"} Apr 28 20:19:34.269200 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:34.269109 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d"] Apr 28 20:19:34.272720 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:34.272694 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" Apr 28 20:19:34.274680 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:34.274654 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\"" Apr 28 20:19:34.274786 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:34.274701 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert\"" Apr 28 20:19:34.282794 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:34.282772 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d"] Apr 28 20:19:34.371814 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:34.371778 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8de176b-be7c-4666-b35b-ef444271904d-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d\" (UID: 
\"e8de176b-be7c-4666-b35b-ef444271904d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" Apr 28 20:19:34.371814 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:34.371811 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8de176b-be7c-4666-b35b-ef444271904d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d\" (UID: \"e8de176b-be7c-4666-b35b-ef444271904d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" Apr 28 20:19:34.372021 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:34.371894 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e8de176b-be7c-4666-b35b-ef444271904d-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d\" (UID: \"e8de176b-be7c-4666-b35b-ef444271904d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" Apr 28 20:19:34.372021 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:34.371939 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8dj4\" (UniqueName: \"kubernetes.io/projected/e8de176b-be7c-4666-b35b-ef444271904d-kube-api-access-r8dj4\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d\" (UID: \"e8de176b-be7c-4666-b35b-ef444271904d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" Apr 28 20:19:34.472730 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:34.472694 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8dj4\" (UniqueName: 
\"kubernetes.io/projected/e8de176b-be7c-4666-b35b-ef444271904d-kube-api-access-r8dj4\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d\" (UID: \"e8de176b-be7c-4666-b35b-ef444271904d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" Apr 28 20:19:34.472917 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:34.472748 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8de176b-be7c-4666-b35b-ef444271904d-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d\" (UID: \"e8de176b-be7c-4666-b35b-ef444271904d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" Apr 28 20:19:34.472917 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:34.472769 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8de176b-be7c-4666-b35b-ef444271904d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d\" (UID: \"e8de176b-be7c-4666-b35b-ef444271904d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" Apr 28 20:19:34.472917 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:34.472806 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e8de176b-be7c-4666-b35b-ef444271904d-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d\" (UID: \"e8de176b-be7c-4666-b35b-ef444271904d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" Apr 28 20:19:34.472917 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:19:34.472899 2571 secret.go:189] Couldn't get secret 
kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found Apr 28 20:19:34.473131 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:19:34.472968 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8de176b-be7c-4666-b35b-ef444271904d-proxy-tls podName:e8de176b-be7c-4666-b35b-ef444271904d nodeName:}" failed. No retries permitted until 2026-04-28 20:19:34.972946417 +0000 UTC m=+3791.603847348 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e8de176b-be7c-4666-b35b-ef444271904d-proxy-tls") pod "isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" (UID: "e8de176b-be7c-4666-b35b-ef444271904d") : secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found Apr 28 20:19:34.473241 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:34.473217 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8de176b-be7c-4666-b35b-ef444271904d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d\" (UID: \"e8de176b-be7c-4666-b35b-ef444271904d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" Apr 28 20:19:34.473389 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:34.473373 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e8de176b-be7c-4666-b35b-ef444271904d-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d\" (UID: \"e8de176b-be7c-4666-b35b-ef444271904d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" Apr 28 20:19:34.480964 ip-10-0-139-128 kubenswrapper[2571]: I0428 
20:19:34.480943 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8dj4\" (UniqueName: \"kubernetes.io/projected/e8de176b-be7c-4666-b35b-ef444271904d-kube-api-access-r8dj4\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d\" (UID: \"e8de176b-be7c-4666-b35b-ef444271904d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" Apr 28 20:19:34.977874 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:34.977841 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8de176b-be7c-4666-b35b-ef444271904d-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d\" (UID: \"e8de176b-be7c-4666-b35b-ef444271904d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" Apr 28 20:19:34.980262 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:34.980240 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8de176b-be7c-4666-b35b-ef444271904d-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d\" (UID: \"e8de176b-be7c-4666-b35b-ef444271904d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" Apr 28 20:19:35.183559 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:35.183525 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" Apr 28 20:19:35.307703 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:35.307669 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d"] Apr 28 20:19:35.311933 ip-10-0-139-128 kubenswrapper[2571]: W0428 20:19:35.311908 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8de176b_be7c_4666_b35b_ef444271904d.slice/crio-d3072486d9fd575182a67852925738ad52c0e50cd96648aa05b6d790ed95b895 WatchSource:0}: Error finding container d3072486d9fd575182a67852925738ad52c0e50cd96648aa05b6d790ed95b895: Status 404 returned error can't find the container with id d3072486d9fd575182a67852925738ad52c0e50cd96648aa05b6d790ed95b895 Apr 28 20:19:35.313802 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:35.313783 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 20:19:35.977575 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:35.977535 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" event={"ID":"e8de176b-be7c-4666-b35b-ef444271904d","Type":"ContainerStarted","Data":"cbd39ef59fac7cbdc21d15050af598040bf096ca84fa0d22a987c4fab7707745"} Apr 28 20:19:35.977575 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:35.977574 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" event={"ID":"e8de176b-be7c-4666-b35b-ef444271904d","Type":"ContainerStarted","Data":"d3072486d9fd575182a67852925738ad52c0e50cd96648aa05b6d790ed95b895"} Apr 28 20:19:37.777520 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:37.777463 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.56:8643/healthz\": dial tcp 10.134.0.56:8643: connect: connection refused" Apr 28 20:19:37.880073 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:37.880039 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:19:37.984843 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:37.984767 2571 generic.go:358] "Generic (PLEG): container finished" podID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerID="29c5bc94108797685257f6af00e3ba45f2012dad7332d5f89e7332250c37faca" exitCode=0 Apr 28 20:19:37.985088 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:37.984842 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" event={"ID":"9c7a5202-6c5d-4819-806d-245f9a244a0e","Type":"ContainerDied","Data":"29c5bc94108797685257f6af00e3ba45f2012dad7332d5f89e7332250c37faca"} Apr 28 20:19:37.985088 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:37.984855 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" Apr 28 20:19:37.985088 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:37.984869 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x" event={"ID":"9c7a5202-6c5d-4819-806d-245f9a244a0e","Type":"ContainerDied","Data":"b7b117cfbecc8372f7e452348c196fbff2c32ef40eb22490c6e548036f2f80eb"} Apr 28 20:19:37.985088 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:37.984883 2571 scope.go:117] "RemoveContainer" containerID="db9cf61e6a9676730c65f9ce1627743b12b2d65fc5dcd2c9bca4fada8c8c5ea9" Apr 28 20:19:37.992563 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:37.992545 2571 scope.go:117] "RemoveContainer" containerID="29c5bc94108797685257f6af00e3ba45f2012dad7332d5f89e7332250c37faca" Apr 28 20:19:37.999509 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:37.999471 2571 scope.go:117] "RemoveContainer" containerID="b4787fcbe2922aeb70978ae8ac77f39c0aa57aac324e9699b6bcefe411fccf81" Apr 28 20:19:38.004577 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.004550 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbtxf\" (UniqueName: \"kubernetes.io/projected/9c7a5202-6c5d-4819-806d-245f9a244a0e-kube-api-access-hbtxf\") pod \"9c7a5202-6c5d-4819-806d-245f9a244a0e\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " Apr 28 20:19:38.004660 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.004603 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c7a5202-6c5d-4819-806d-245f9a244a0e-kserve-provision-location\") pod \"9c7a5202-6c5d-4819-806d-245f9a244a0e\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " Apr 28 20:19:38.004702 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.004659 2571 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c7a5202-6c5d-4819-806d-245f9a244a0e-proxy-tls\") pod \"9c7a5202-6c5d-4819-806d-245f9a244a0e\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " Apr 28 20:19:38.004737 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.004705 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9c7a5202-6c5d-4819-806d-245f9a244a0e-cabundle-cert\") pod \"9c7a5202-6c5d-4819-806d-245f9a244a0e\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " Apr 28 20:19:38.004781 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.004745 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c7a5202-6c5d-4819-806d-245f9a244a0e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"9c7a5202-6c5d-4819-806d-245f9a244a0e\" (UID: \"9c7a5202-6c5d-4819-806d-245f9a244a0e\") " Apr 28 20:19:38.005025 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.004953 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c7a5202-6c5d-4819-806d-245f9a244a0e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9c7a5202-6c5d-4819-806d-245f9a244a0e" (UID: "9c7a5202-6c5d-4819-806d-245f9a244a0e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:19:38.005130 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.005081 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c7a5202-6c5d-4819-806d-245f9a244a0e-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "9c7a5202-6c5d-4819-806d-245f9a244a0e" (UID: "9c7a5202-6c5d-4819-806d-245f9a244a0e"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:19:38.005196 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.005135 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c7a5202-6c5d-4819-806d-245f9a244a0e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config") pod "9c7a5202-6c5d-4819-806d-245f9a244a0e" (UID: "9c7a5202-6c5d-4819-806d-245f9a244a0e"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:19:38.006583 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.006554 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c7a5202-6c5d-4819-806d-245f9a244a0e-kube-api-access-hbtxf" (OuterVolumeSpecName: "kube-api-access-hbtxf") pod "9c7a5202-6c5d-4819-806d-245f9a244a0e" (UID: "9c7a5202-6c5d-4819-806d-245f9a244a0e"). InnerVolumeSpecName "kube-api-access-hbtxf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:19:38.006661 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.006612 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c7a5202-6c5d-4819-806d-245f9a244a0e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9c7a5202-6c5d-4819-806d-245f9a244a0e" (UID: "9c7a5202-6c5d-4819-806d-245f9a244a0e"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:19:38.006842 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.006825 2571 scope.go:117] "RemoveContainer" containerID="db9cf61e6a9676730c65f9ce1627743b12b2d65fc5dcd2c9bca4fada8c8c5ea9" Apr 28 20:19:38.007058 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:19:38.007042 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db9cf61e6a9676730c65f9ce1627743b12b2d65fc5dcd2c9bca4fada8c8c5ea9\": container with ID starting with db9cf61e6a9676730c65f9ce1627743b12b2d65fc5dcd2c9bca4fada8c8c5ea9 not found: ID does not exist" containerID="db9cf61e6a9676730c65f9ce1627743b12b2d65fc5dcd2c9bca4fada8c8c5ea9" Apr 28 20:19:38.007125 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.007066 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db9cf61e6a9676730c65f9ce1627743b12b2d65fc5dcd2c9bca4fada8c8c5ea9"} err="failed to get container status \"db9cf61e6a9676730c65f9ce1627743b12b2d65fc5dcd2c9bca4fada8c8c5ea9\": rpc error: code = NotFound desc = could not find container \"db9cf61e6a9676730c65f9ce1627743b12b2d65fc5dcd2c9bca4fada8c8c5ea9\": container with ID starting with db9cf61e6a9676730c65f9ce1627743b12b2d65fc5dcd2c9bca4fada8c8c5ea9 not found: ID does not exist" Apr 28 20:19:38.007125 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.007082 2571 scope.go:117] "RemoveContainer" containerID="29c5bc94108797685257f6af00e3ba45f2012dad7332d5f89e7332250c37faca" Apr 28 20:19:38.007306 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:19:38.007287 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29c5bc94108797685257f6af00e3ba45f2012dad7332d5f89e7332250c37faca\": container with ID starting with 29c5bc94108797685257f6af00e3ba45f2012dad7332d5f89e7332250c37faca not found: ID does not exist" 
containerID="29c5bc94108797685257f6af00e3ba45f2012dad7332d5f89e7332250c37faca" Apr 28 20:19:38.007367 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.007312 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29c5bc94108797685257f6af00e3ba45f2012dad7332d5f89e7332250c37faca"} err="failed to get container status \"29c5bc94108797685257f6af00e3ba45f2012dad7332d5f89e7332250c37faca\": rpc error: code = NotFound desc = could not find container \"29c5bc94108797685257f6af00e3ba45f2012dad7332d5f89e7332250c37faca\": container with ID starting with 29c5bc94108797685257f6af00e3ba45f2012dad7332d5f89e7332250c37faca not found: ID does not exist" Apr 28 20:19:38.007367 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.007332 2571 scope.go:117] "RemoveContainer" containerID="b4787fcbe2922aeb70978ae8ac77f39c0aa57aac324e9699b6bcefe411fccf81" Apr 28 20:19:38.007592 ip-10-0-139-128 kubenswrapper[2571]: E0428 20:19:38.007557 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4787fcbe2922aeb70978ae8ac77f39c0aa57aac324e9699b6bcefe411fccf81\": container with ID starting with b4787fcbe2922aeb70978ae8ac77f39c0aa57aac324e9699b6bcefe411fccf81 not found: ID does not exist" containerID="b4787fcbe2922aeb70978ae8ac77f39c0aa57aac324e9699b6bcefe411fccf81" Apr 28 20:19:38.007658 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.007594 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4787fcbe2922aeb70978ae8ac77f39c0aa57aac324e9699b6bcefe411fccf81"} err="failed to get container status \"b4787fcbe2922aeb70978ae8ac77f39c0aa57aac324e9699b6bcefe411fccf81\": rpc error: code = NotFound desc = could not find container \"b4787fcbe2922aeb70978ae8ac77f39c0aa57aac324e9699b6bcefe411fccf81\": container with ID starting with b4787fcbe2922aeb70978ae8ac77f39c0aa57aac324e9699b6bcefe411fccf81 not found: ID does not exist" Apr 28 
20:19:38.106239 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.106117 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c7a5202-6c5d-4819-806d-245f9a244a0e-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:19:38.106239 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.106154 2571 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9c7a5202-6c5d-4819-806d-245f9a244a0e-cabundle-cert\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:19:38.106239 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.106173 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c7a5202-6c5d-4819-806d-245f9a244a0e-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:19:38.106239 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.106192 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hbtxf\" (UniqueName: \"kubernetes.io/projected/9c7a5202-6c5d-4819-806d-245f9a244a0e-kube-api-access-hbtxf\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:19:38.106239 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.106210 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c7a5202-6c5d-4819-806d-245f9a244a0e-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:19:38.305709 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.305677 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x"] Apr 28 20:19:38.309145 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.309112 2571 kubelet.go:2547] 
"SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-545f7c7957-66n4x"] Apr 28 20:19:38.989018 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.988939 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d_e8de176b-be7c-4666-b35b-ef444271904d/storage-initializer/0.log" Apr 28 20:19:38.989018 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.988978 2571 generic.go:358] "Generic (PLEG): container finished" podID="e8de176b-be7c-4666-b35b-ef444271904d" containerID="cbd39ef59fac7cbdc21d15050af598040bf096ca84fa0d22a987c4fab7707745" exitCode=1 Apr 28 20:19:38.989517 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:38.989053 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" event={"ID":"e8de176b-be7c-4666-b35b-ef444271904d","Type":"ContainerDied","Data":"cbd39ef59fac7cbdc21d15050af598040bf096ca84fa0d22a987c4fab7707745"} Apr 28 20:19:39.947208 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:39.947174 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" path="/var/lib/kubelet/pods/9c7a5202-6c5d-4819-806d-245f9a244a0e/volumes" Apr 28 20:19:39.994355 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:39.994327 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d_e8de176b-be7c-4666-b35b-ef444271904d/storage-initializer/0.log" Apr 28 20:19:39.994745 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:39.994447 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" event={"ID":"e8de176b-be7c-4666-b35b-ef444271904d","Type":"ContainerStarted","Data":"4f9c98676acc1cdaa542c112a6fa48875ed8d48728736a5a13db99549a691c18"} Apr 28 
20:19:44.259070 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:44.259035 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d"] Apr 28 20:19:44.259435 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:44.259398 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" podUID="e8de176b-be7c-4666-b35b-ef444271904d" containerName="storage-initializer" containerID="cri-o://4f9c98676acc1cdaa542c112a6fa48875ed8d48728736a5a13db99549a691c18" gracePeriod=30 Apr 28 20:19:46.013807 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.013777 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d_e8de176b-be7c-4666-b35b-ef444271904d/storage-initializer/1.log" Apr 28 20:19:46.014282 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.014173 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d_e8de176b-be7c-4666-b35b-ef444271904d/storage-initializer/0.log" Apr 28 20:19:46.014282 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.014207 2571 generic.go:358] "Generic (PLEG): container finished" podID="e8de176b-be7c-4666-b35b-ef444271904d" containerID="4f9c98676acc1cdaa542c112a6fa48875ed8d48728736a5a13db99549a691c18" exitCode=1 Apr 28 20:19:46.014282 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.014236 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" event={"ID":"e8de176b-be7c-4666-b35b-ef444271904d","Type":"ContainerDied","Data":"4f9c98676acc1cdaa542c112a6fa48875ed8d48728736a5a13db99549a691c18"} Apr 28 20:19:46.014282 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.014267 2571 scope.go:117] "RemoveContainer" 
containerID="cbd39ef59fac7cbdc21d15050af598040bf096ca84fa0d22a987c4fab7707745" Apr 28 20:19:46.105415 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.105394 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d_e8de176b-be7c-4666-b35b-ef444271904d/storage-initializer/1.log" Apr 28 20:19:46.105550 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.105461 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" Apr 28 20:19:46.171289 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.171264 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e8de176b-be7c-4666-b35b-ef444271904d-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"e8de176b-be7c-4666-b35b-ef444271904d\" (UID: \"e8de176b-be7c-4666-b35b-ef444271904d\") " Apr 28 20:19:46.171426 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.171307 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8dj4\" (UniqueName: \"kubernetes.io/projected/e8de176b-be7c-4666-b35b-ef444271904d-kube-api-access-r8dj4\") pod \"e8de176b-be7c-4666-b35b-ef444271904d\" (UID: \"e8de176b-be7c-4666-b35b-ef444271904d\") " Apr 28 20:19:46.171426 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.171335 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8de176b-be7c-4666-b35b-ef444271904d-proxy-tls\") pod \"e8de176b-be7c-4666-b35b-ef444271904d\" (UID: \"e8de176b-be7c-4666-b35b-ef444271904d\") " Apr 28 20:19:46.171426 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.171392 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8de176b-be7c-4666-b35b-ef444271904d-kserve-provision-location\") pod \"e8de176b-be7c-4666-b35b-ef444271904d\" (UID: \"e8de176b-be7c-4666-b35b-ef444271904d\") " Apr 28 20:19:46.171713 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.171686 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8de176b-be7c-4666-b35b-ef444271904d-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config") pod "e8de176b-be7c-4666-b35b-ef444271904d" (UID: "e8de176b-be7c-4666-b35b-ef444271904d"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:19:46.171780 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.171684 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8de176b-be7c-4666-b35b-ef444271904d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e8de176b-be7c-4666-b35b-ef444271904d" (UID: "e8de176b-be7c-4666-b35b-ef444271904d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:19:46.173422 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.173402 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8de176b-be7c-4666-b35b-ef444271904d-kube-api-access-r8dj4" (OuterVolumeSpecName: "kube-api-access-r8dj4") pod "e8de176b-be7c-4666-b35b-ef444271904d" (UID: "e8de176b-be7c-4666-b35b-ef444271904d"). InnerVolumeSpecName "kube-api-access-r8dj4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:19:46.173475 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.173407 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8de176b-be7c-4666-b35b-ef444271904d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e8de176b-be7c-4666-b35b-ef444271904d" (UID: "e8de176b-be7c-4666-b35b-ef444271904d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:19:46.272770 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.272733 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e8de176b-be7c-4666-b35b-ef444271904d-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:19:46.272770 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.272765 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r8dj4\" (UniqueName: \"kubernetes.io/projected/e8de176b-be7c-4666-b35b-ef444271904d-kube-api-access-r8dj4\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:19:46.272962 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.272779 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8de176b-be7c-4666-b35b-ef444271904d-proxy-tls\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:19:46.272962 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:46.272792 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8de176b-be7c-4666-b35b-ef444271904d-kserve-provision-location\") on node \"ip-10-0-139-128.ec2.internal\" DevicePath \"\"" Apr 28 20:19:47.018628 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:47.018600 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d_e8de176b-be7c-4666-b35b-ef444271904d/storage-initializer/1.log" Apr 28 20:19:47.019044 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:47.018676 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" event={"ID":"e8de176b-be7c-4666-b35b-ef444271904d","Type":"ContainerDied","Data":"d3072486d9fd575182a67852925738ad52c0e50cd96648aa05b6d790ed95b895"} Apr 28 20:19:47.019044 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:47.018702 2571 scope.go:117] "RemoveContainer" containerID="4f9c98676acc1cdaa542c112a6fa48875ed8d48728736a5a13db99549a691c18" Apr 28 20:19:47.019044 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:47.018725 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d" Apr 28 20:19:47.056217 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:47.056184 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d"] Apr 28 20:19:47.059336 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:47.059309 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7b6b7fcbd7-77h4d"] Apr 28 20:19:47.947192 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:19:47.947159 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8de176b-be7c-4666-b35b-ef444271904d" path="/var/lib/kubelet/pods/e8de176b-be7c-4666-b35b-ef444271904d/volumes" Apr 28 20:20:13.802894 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.802858 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r9lxr/must-gather-kslct"] Apr 28 20:20:13.803596 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.803165 2571 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="e8de176b-be7c-4666-b35b-ef444271904d" containerName="storage-initializer" Apr 28 20:20:13.803596 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.803177 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8de176b-be7c-4666-b35b-ef444271904d" containerName="storage-initializer" Apr 28 20:20:13.803596 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.803187 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8de176b-be7c-4666-b35b-ef444271904d" containerName="storage-initializer" Apr 28 20:20:13.803596 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.803192 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8de176b-be7c-4666-b35b-ef444271904d" containerName="storage-initializer" Apr 28 20:20:13.803596 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.803201 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="kserve-container" Apr 28 20:20:13.803596 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.803208 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="kserve-container" Apr 28 20:20:13.803596 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.803221 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="kube-rbac-proxy" Apr 28 20:20:13.803596 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.803226 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="kube-rbac-proxy" Apr 28 20:20:13.803596 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.803234 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="storage-initializer" Apr 28 20:20:13.803596 ip-10-0-139-128 
kubenswrapper[2571]: I0428 20:20:13.803239 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="storage-initializer" Apr 28 20:20:13.803596 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.803286 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="kube-rbac-proxy" Apr 28 20:20:13.803596 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.803295 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c7a5202-6c5d-4819-806d-245f9a244a0e" containerName="kserve-container" Apr 28 20:20:13.803596 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.803301 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8de176b-be7c-4666-b35b-ef444271904d" containerName="storage-initializer" Apr 28 20:20:13.803596 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.803386 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8de176b-be7c-4666-b35b-ef444271904d" containerName="storage-initializer" Apr 28 20:20:13.806133 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.806115 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r9lxr/must-gather-kslct" Apr 28 20:20:13.808123 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.808101 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-r9lxr\"/\"default-dockercfg-7h6qj\"" Apr 28 20:20:13.808123 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.808114 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r9lxr\"/\"openshift-service-ca.crt\"" Apr 28 20:20:13.808250 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.808101 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r9lxr\"/\"kube-root-ca.crt\"" Apr 28 20:20:13.816571 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.816550 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r9lxr/must-gather-kslct"] Apr 28 20:20:13.891086 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.891052 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6674dc93-5cc8-4be6-bbee-c08d8682e92f-must-gather-output\") pod \"must-gather-kslct\" (UID: \"6674dc93-5cc8-4be6-bbee-c08d8682e92f\") " pod="openshift-must-gather-r9lxr/must-gather-kslct" Apr 28 20:20:13.891250 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.891096 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xgp7\" (UniqueName: \"kubernetes.io/projected/6674dc93-5cc8-4be6-bbee-c08d8682e92f-kube-api-access-2xgp7\") pod \"must-gather-kslct\" (UID: \"6674dc93-5cc8-4be6-bbee-c08d8682e92f\") " pod="openshift-must-gather-r9lxr/must-gather-kslct" Apr 28 20:20:13.992411 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.992381 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/6674dc93-5cc8-4be6-bbee-c08d8682e92f-must-gather-output\") pod \"must-gather-kslct\" (UID: \"6674dc93-5cc8-4be6-bbee-c08d8682e92f\") " pod="openshift-must-gather-r9lxr/must-gather-kslct" Apr 28 20:20:13.992576 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.992423 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xgp7\" (UniqueName: \"kubernetes.io/projected/6674dc93-5cc8-4be6-bbee-c08d8682e92f-kube-api-access-2xgp7\") pod \"must-gather-kslct\" (UID: \"6674dc93-5cc8-4be6-bbee-c08d8682e92f\") " pod="openshift-must-gather-r9lxr/must-gather-kslct" Apr 28 20:20:13.992736 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.992718 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6674dc93-5cc8-4be6-bbee-c08d8682e92f-must-gather-output\") pod \"must-gather-kslct\" (UID: \"6674dc93-5cc8-4be6-bbee-c08d8682e92f\") " pod="openshift-must-gather-r9lxr/must-gather-kslct" Apr 28 20:20:13.999240 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:13.999218 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xgp7\" (UniqueName: \"kubernetes.io/projected/6674dc93-5cc8-4be6-bbee-c08d8682e92f-kube-api-access-2xgp7\") pod \"must-gather-kslct\" (UID: \"6674dc93-5cc8-4be6-bbee-c08d8682e92f\") " pod="openshift-must-gather-r9lxr/must-gather-kslct" Apr 28 20:20:14.115858 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:14.115785 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r9lxr/must-gather-kslct"
Apr 28 20:20:14.233935 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:14.233890 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r9lxr/must-gather-kslct"]
Apr 28 20:20:14.237596 ip-10-0-139-128 kubenswrapper[2571]: W0428 20:20:14.237563 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6674dc93_5cc8_4be6_bbee_c08d8682e92f.slice/crio-9a1f2349d26a96811e8ab13d09de3c667c92668b83f444ea6f493d50a74c55d1 WatchSource:0}: Error finding container 9a1f2349d26a96811e8ab13d09de3c667c92668b83f444ea6f493d50a74c55d1: Status 404 returned error can't find the container with id 9a1f2349d26a96811e8ab13d09de3c667c92668b83f444ea6f493d50a74c55d1
Apr 28 20:20:15.104733 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:15.104691 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r9lxr/must-gather-kslct" event={"ID":"6674dc93-5cc8-4be6-bbee-c08d8682e92f","Type":"ContainerStarted","Data":"9a1f2349d26a96811e8ab13d09de3c667c92668b83f444ea6f493d50a74c55d1"}
Apr 28 20:20:16.110356 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:16.110309 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r9lxr/must-gather-kslct" event={"ID":"6674dc93-5cc8-4be6-bbee-c08d8682e92f","Type":"ContainerStarted","Data":"fa58657ede10693df66dbfc1364be25ec11ade3c7231bea05426359fc3a2d83c"}
Apr 28 20:20:16.110356 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:16.110361 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r9lxr/must-gather-kslct" event={"ID":"6674dc93-5cc8-4be6-bbee-c08d8682e92f","Type":"ContainerStarted","Data":"5599d84db3c5187cf62140db16b0223f90e5b2c29e74ebaf8e697e624b452e63"}
Apr 28 20:20:16.127375 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:16.127314 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r9lxr/must-gather-kslct" podStartSLOduration=2.239526785 podStartE2EDuration="3.127296631s" podCreationTimestamp="2026-04-28 20:20:13 +0000 UTC" firstStartedPulling="2026-04-28 20:20:14.239674772 +0000 UTC m=+3830.870575694" lastFinishedPulling="2026-04-28 20:20:15.127444604 +0000 UTC m=+3831.758345540" observedRunningTime="2026-04-28 20:20:16.125233398 +0000 UTC m=+3832.756134343" watchObservedRunningTime="2026-04-28 20:20:16.127296631 +0000 UTC m=+3832.758197575"
Apr 28 20:20:16.574679 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:16.574649 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-jfmqj_7ea34c75-1b0f-4ec6-a805-8cf83f9d32fb/global-pull-secret-syncer/0.log"
Apr 28 20:20:16.736854 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:16.736821 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-76d57_9a11e608-5ea9-4123-8db2-08683b9e10b6/konnectivity-agent/0.log"
Apr 28 20:20:16.816194 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:16.816162 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-128.ec2.internal_3f217b631ac7267173c9067d07088610/haproxy/0.log"
Apr 28 20:20:19.806670 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:19.806632 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_24e133e7-af81-4b03-9995-6c8082eeaf83/alertmanager/0.log"
Apr 28 20:20:19.834016 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:19.833893 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_24e133e7-af81-4b03-9995-6c8082eeaf83/config-reloader/0.log"
Apr 28 20:20:19.872871 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:19.872832 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_24e133e7-af81-4b03-9995-6c8082eeaf83/kube-rbac-proxy-web/0.log"
Apr 28 20:20:19.905531 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:19.905453 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_24e133e7-af81-4b03-9995-6c8082eeaf83/kube-rbac-proxy/0.log"
Apr 28 20:20:19.950616 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:19.950092 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_24e133e7-af81-4b03-9995-6c8082eeaf83/kube-rbac-proxy-metric/0.log"
Apr 28 20:20:19.974698 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:19.974672 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_24e133e7-af81-4b03-9995-6c8082eeaf83/prom-label-proxy/0.log"
Apr 28 20:20:20.003472 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.003442 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_24e133e7-af81-4b03-9995-6c8082eeaf83/init-config-reloader/0.log"
Apr 28 20:20:20.099832 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.099751 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-zqwh5_f03f118d-0ede-40a8-a7cf-3c637824276d/kube-state-metrics/0.log"
Apr 28 20:20:20.126278 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.126226 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-zqwh5_f03f118d-0ede-40a8-a7cf-3c637824276d/kube-rbac-proxy-main/0.log"
Apr 28 20:20:20.154308 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.154274 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-zqwh5_f03f118d-0ede-40a8-a7cf-3c637824276d/kube-rbac-proxy-self/0.log"
Apr 28 20:20:20.213957 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.213913 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-wqhlw_10b46dea-b1bd-4fe9-b096-d027eda0809d/monitoring-plugin/0.log"
Apr 28 20:20:20.408548 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.408473 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tl8ms_6eb370a0-b2e9-42c0-acb9-92f39db33103/node-exporter/0.log"
Apr 28 20:20:20.431356 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.431325 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tl8ms_6eb370a0-b2e9-42c0-acb9-92f39db33103/kube-rbac-proxy/0.log"
Apr 28 20:20:20.456608 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.456581 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tl8ms_6eb370a0-b2e9-42c0-acb9-92f39db33103/init-textfile/0.log"
Apr 28 20:20:20.483093 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.483063 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-m7r5s_86939d9a-e349-408c-aad2-55e43a981aac/kube-rbac-proxy-main/0.log"
Apr 28 20:20:20.508604 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.508579 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-m7r5s_86939d9a-e349-408c-aad2-55e43a981aac/kube-rbac-proxy-self/0.log"
Apr 28 20:20:20.534280 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.534234 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-m7r5s_86939d9a-e349-408c-aad2-55e43a981aac/openshift-state-metrics/0.log"
Apr 28 20:20:20.805835 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.805751 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6475644cb-ndpb2_3807a95d-ac8a-42a9-95bf-87514836c9be/telemeter-client/0.log"
Apr 28 20:20:20.832292 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.832248 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6475644cb-ndpb2_3807a95d-ac8a-42a9-95bf-87514836c9be/reload/0.log"
Apr 28 20:20:20.857142 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.857114 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6475644cb-ndpb2_3807a95d-ac8a-42a9-95bf-87514836c9be/kube-rbac-proxy/0.log"
Apr 28 20:20:20.887663 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.887638 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67c8d86c4f-d6x8n_45277e70-f2c6-4e33-97af-9cb5f76dee0b/thanos-query/0.log"
Apr 28 20:20:20.910668 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.910641 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67c8d86c4f-d6x8n_45277e70-f2c6-4e33-97af-9cb5f76dee0b/kube-rbac-proxy-web/0.log"
Apr 28 20:20:20.939754 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.939707 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67c8d86c4f-d6x8n_45277e70-f2c6-4e33-97af-9cb5f76dee0b/kube-rbac-proxy/0.log"
Apr 28 20:20:20.964644 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.964606 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67c8d86c4f-d6x8n_45277e70-f2c6-4e33-97af-9cb5f76dee0b/prom-label-proxy/0.log"
Apr 28 20:20:20.990503 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:20.990452 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67c8d86c4f-d6x8n_45277e70-f2c6-4e33-97af-9cb5f76dee0b/kube-rbac-proxy-rules/0.log"
Apr 28 20:20:21.011916 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:21.011881 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67c8d86c4f-d6x8n_45277e70-f2c6-4e33-97af-9cb5f76dee0b/kube-rbac-proxy-metrics/0.log"
Apr 28 20:20:23.564719 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.564687 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"]
Apr 28 20:20:23.569454 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.569432 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:23.575367 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.575337 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"]
Apr 28 20:20:23.684290 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.684251 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9c392bc-45c4-4876-9f55-9e55713c666d-sys\") pod \"perf-node-gather-daemonset-5w9q4\" (UID: \"b9c392bc-45c4-4876-9f55-9e55713c666d\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:23.684290 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.684295 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b9c392bc-45c4-4876-9f55-9e55713c666d-proc\") pod \"perf-node-gather-daemonset-5w9q4\" (UID: \"b9c392bc-45c4-4876-9f55-9e55713c666d\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:23.684562 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.684378 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9c392bc-45c4-4876-9f55-9e55713c666d-lib-modules\") pod \"perf-node-gather-daemonset-5w9q4\" (UID: \"b9c392bc-45c4-4876-9f55-9e55713c666d\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:23.684562 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.684422 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdl66\" (UniqueName: \"kubernetes.io/projected/b9c392bc-45c4-4876-9f55-9e55713c666d-kube-api-access-xdl66\") pod \"perf-node-gather-daemonset-5w9q4\" (UID: \"b9c392bc-45c4-4876-9f55-9e55713c666d\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:23.684562 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.684451 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b9c392bc-45c4-4876-9f55-9e55713c666d-podres\") pod \"perf-node-gather-daemonset-5w9q4\" (UID: \"b9c392bc-45c4-4876-9f55-9e55713c666d\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:23.785547 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.785507 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b9c392bc-45c4-4876-9f55-9e55713c666d-proc\") pod \"perf-node-gather-daemonset-5w9q4\" (UID: \"b9c392bc-45c4-4876-9f55-9e55713c666d\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:23.785734 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.785572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9c392bc-45c4-4876-9f55-9e55713c666d-lib-modules\") pod \"perf-node-gather-daemonset-5w9q4\" (UID: \"b9c392bc-45c4-4876-9f55-9e55713c666d\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:23.785734 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.785609 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdl66\" (UniqueName: \"kubernetes.io/projected/b9c392bc-45c4-4876-9f55-9e55713c666d-kube-api-access-xdl66\") pod \"perf-node-gather-daemonset-5w9q4\" (UID: \"b9c392bc-45c4-4876-9f55-9e55713c666d\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:23.785734 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.785637 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b9c392bc-45c4-4876-9f55-9e55713c666d-podres\") pod \"perf-node-gather-daemonset-5w9q4\" (UID: \"b9c392bc-45c4-4876-9f55-9e55713c666d\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:23.785734 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.785655 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b9c392bc-45c4-4876-9f55-9e55713c666d-proc\") pod \"perf-node-gather-daemonset-5w9q4\" (UID: \"b9c392bc-45c4-4876-9f55-9e55713c666d\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:23.785734 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.785680 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9c392bc-45c4-4876-9f55-9e55713c666d-sys\") pod \"perf-node-gather-daemonset-5w9q4\" (UID: \"b9c392bc-45c4-4876-9f55-9e55713c666d\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:23.786016 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.785748 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9c392bc-45c4-4876-9f55-9e55713c666d-sys\") pod \"perf-node-gather-daemonset-5w9q4\" (UID: \"b9c392bc-45c4-4876-9f55-9e55713c666d\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:23.786016 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.785778 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9c392bc-45c4-4876-9f55-9e55713c666d-lib-modules\") pod \"perf-node-gather-daemonset-5w9q4\" (UID: \"b9c392bc-45c4-4876-9f55-9e55713c666d\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:23.786016 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.785799 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b9c392bc-45c4-4876-9f55-9e55713c666d-podres\") pod \"perf-node-gather-daemonset-5w9q4\" (UID: \"b9c392bc-45c4-4876-9f55-9e55713c666d\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:23.793793 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.793740 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdl66\" (UniqueName: \"kubernetes.io/projected/b9c392bc-45c4-4876-9f55-9e55713c666d-kube-api-access-xdl66\") pod \"perf-node-gather-daemonset-5w9q4\" (UID: \"b9c392bc-45c4-4876-9f55-9e55713c666d\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:23.882504 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:23.882399 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:24.006027 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:24.005995 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"]
Apr 28 20:20:24.010097 ip-10-0-139-128 kubenswrapper[2571]: W0428 20:20:24.010052 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb9c392bc_45c4_4876_9f55_9e55713c666d.slice/crio-6e03b14abb893629cd375151ec6e77fc3d44cc849239f43437bb3282fc5ecd6c WatchSource:0}: Error finding container 6e03b14abb893629cd375151ec6e77fc3d44cc849239f43437bb3282fc5ecd6c: Status 404 returned error can't find the container with id 6e03b14abb893629cd375151ec6e77fc3d44cc849239f43437bb3282fc5ecd6c
Apr 28 20:20:24.140507 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:24.140398 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4" event={"ID":"b9c392bc-45c4-4876-9f55-9e55713c666d","Type":"ContainerStarted","Data":"cd2457a2e66d2472da475cc85a0b0581632f57016ea2d3b8761929a0edc62754"}
Apr 28 20:20:24.140507 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:24.140438 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4" event={"ID":"b9c392bc-45c4-4876-9f55-9e55713c666d","Type":"ContainerStarted","Data":"6e03b14abb893629cd375151ec6e77fc3d44cc849239f43437bb3282fc5ecd6c"}
Apr 28 20:20:24.140507 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:24.140493 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:24.157643 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:24.157600 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4" podStartSLOduration=1.157585294 podStartE2EDuration="1.157585294s" podCreationTimestamp="2026-04-28 20:20:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:20:24.156702689 +0000 UTC m=+3840.787603658" watchObservedRunningTime="2026-04-28 20:20:24.157585294 +0000 UTC m=+3840.788486239"
Apr 28 20:20:24.161398 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:24.161376 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bczfz_0b1ddaed-b20b-4d05-9006-8d55a8bd05f8/dns/0.log"
Apr 28 20:20:24.187299 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:24.187277 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bczfz_0b1ddaed-b20b-4d05-9006-8d55a8bd05f8/kube-rbac-proxy/0.log"
Apr 28 20:20:24.362066 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:24.362035 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w5zst_dae64fa9-2628-461e-a0d3-e468450879cf/dns-node-resolver/0.log"
Apr 28 20:20:24.844762 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:24.844723 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jw8bb_5de918e6-f589-4708-869e-21232a3f0b2e/node-ca/0.log"
Apr 28 20:20:25.852672 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:25.852637 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8cwst_4b4de6e2-0f57-4508-837c-5b18d4524864/serve-healthcheck-canary/0.log"
Apr 28 20:20:26.399928 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:26.399892 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pd8ms_f525727f-5701-4a75-ae8d-ab2bea2bde16/kube-rbac-proxy/0.log"
Apr 28 20:20:26.421455 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:26.421425 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pd8ms_f525727f-5701-4a75-ae8d-ab2bea2bde16/exporter/0.log"
Apr 28 20:20:26.443464 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:26.443425 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pd8ms_f525727f-5701-4a75-ae8d-ab2bea2bde16/extractor/0.log"
Apr 28 20:20:30.158171 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:30.158140 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-5w9q4"
Apr 28 20:20:30.488914 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:30.488846 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-v2jtv_9739fd02-2f4c-4bcc-ac96-e5b981305b49/s3-init/0.log"
Apr 28 20:20:30.513431 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:30.513407 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-7hrw9_d3655183-0aab-4af9-8351-e667cb4da8b0/s3-tls-init-custom/0.log"
Apr 28 20:20:35.867249 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:35.867210 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cwbdf_d4a7a1d2-5229-47cb-b4b1-097846a273d7/kube-multus-additional-cni-plugins/0.log"
Apr 28 20:20:35.890042 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:35.890001 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cwbdf_d4a7a1d2-5229-47cb-b4b1-097846a273d7/egress-router-binary-copy/0.log"
Apr 28 20:20:35.913277 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:35.913244 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cwbdf_d4a7a1d2-5229-47cb-b4b1-097846a273d7/cni-plugins/0.log"
Apr 28 20:20:35.945287 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:35.945255 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cwbdf_d4a7a1d2-5229-47cb-b4b1-097846a273d7/bond-cni-plugin/0.log"
Apr 28 20:20:35.969187 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:35.969164 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cwbdf_d4a7a1d2-5229-47cb-b4b1-097846a273d7/routeoverride-cni/0.log"
Apr 28 20:20:35.993495 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:35.993418 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cwbdf_d4a7a1d2-5229-47cb-b4b1-097846a273d7/whereabouts-cni-bincopy/0.log"
Apr 28 20:20:36.014324 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:36.014294 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cwbdf_d4a7a1d2-5229-47cb-b4b1-097846a273d7/whereabouts-cni/0.log"
Apr 28 20:20:36.408468 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:36.408434 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kdzc2_3c006a6d-6170-4bf5-9f80-c7e10b5bf9dd/kube-multus/0.log"
Apr 28 20:20:36.668096 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:36.668025 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zlvsf_caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b/network-metrics-daemon/0.log"
Apr 28 20:20:36.709519 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:36.709473 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zlvsf_caa0f2a2-4dc9-4c44-b8b9-96f4cc8a695b/kube-rbac-proxy/0.log"
Apr 28 20:20:38.135212 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:38.135178 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-controller/0.log"
Apr 28 20:20:38.151986 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:38.151958 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/0.log"
Apr 28 20:20:38.169049 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:38.169010 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovn-acl-logging/1.log"
Apr 28 20:20:38.188000 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:38.187972 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/kube-rbac-proxy-node/0.log"
Apr 28 20:20:38.210140 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:38.210115 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/kube-rbac-proxy-ovn-metrics/0.log"
Apr 28 20:20:38.231358 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:38.231327 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/northd/0.log"
Apr 28 20:20:38.253050 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:38.253023 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/nbdb/0.log"
Apr 28 20:20:38.275000 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:38.274971 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/sbdb/0.log"
Apr 28 20:20:38.388398 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:38.388322 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ppk4t_238e19ca-102f-43b1-8aed-9322ca47bfc9/ovnkube-controller/0.log"
Apr 28 20:20:39.326781 ip-10-0-139-128 kubenswrapper[2571]: I0428 20:20:39.326679 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-5qtkh_e46a06a4-894f-4f3d-a446-b501af6e42eb/network-check-target-container/0.log"