Apr 24 14:21:34.303289 ip-10-0-129-231 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 14:21:34.303301 ip-10-0-129-231 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 14:21:34.303309 ip-10-0-129-231 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 14:21:34.303630 ip-10-0-129-231 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 14:21:44.534182 ip-10-0-129-231 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 14:21:44.534205 ip-10-0-129-231 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 4119a81c1f6d453faf4d69d3e03a14ec --
Apr 24 14:24:07.087238 ip-10-0-129-231 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 14:24:07.532538 ip-10-0-129-231 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:24:07.532538 ip-10-0-129-231 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 14:24:07.532538 ip-10-0-129-231 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:24:07.532538 ip-10-0-129-231 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 14:24:07.532538 ip-10-0-129-231 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
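The deprecation warnings above all point at the config file named by the kubelet's --config flag (/etc/kubernetes/kubelet.conf in the FLAG dump below). A minimal sketch of the equivalent KubeletConfiguration stanzas, assuming the flag values printed later in this log; the field names come from the upstream kubelet.config.k8s.io/v1beta1 schema, not from this log:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # assumed from FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
    # assumed from FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    # assumed from FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
    systemReserved:
      cpu: 500m
      ephemeral-storage: 1Gi
      memory: 1Gi

--minimum-container-ttl-duration has no direct config-file equivalent; per the warning above it is superseded by the evictionHard/evictionSoft settings.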
Apr 24 14:24:07.533714 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.533186 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 14:24:07.537914 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537879 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:07.537914 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537911 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:07.537914 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537917 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537920 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537923 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537926 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537929 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537932 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537936 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537938 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537942 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537944 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537952 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537955 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537958 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537961 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537964 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537966 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537970 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537974 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537977 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537980 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:24:07.538015 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537982 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537985 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537987 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537990 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537993 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537996 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.537998 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538001 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538004 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538007 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538010 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538013 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538016 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538019 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538023 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538026 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538029 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538032 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538034 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538038 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:07.538499 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538040 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538043 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538045 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538048 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538050 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538053 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538055 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538058 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538060 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538063 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538065 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538068 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538070 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538072 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538077 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538081 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538085 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538088 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538091 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:07.539050 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538094 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538097 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538100 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538103 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538107 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538110 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538112 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538115 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538118 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538120 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538123 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538125 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538128 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538131 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538134 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538136 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538139 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538141 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538144 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:07.539517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538147 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:07.539993 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538149 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:07.539993 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538151 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:07.539993 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538154 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:07.539993 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538156 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:24:07.539993 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.538159 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:07.540289 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540276 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:24:07.540289 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540287 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540291 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540294 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540297 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540300 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540303 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540306 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540308 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540311 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540314 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540317 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540320 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540323 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540325 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540328 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540331 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540333 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540336 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540339 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540344 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:24:07.540345 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540347 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540350 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540353 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540356 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540359 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540362 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540364 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540367 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540370 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540372 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540375 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540379 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540383 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540386 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540389 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540392 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540394 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540397 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540400 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540402 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:07.540817 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540405 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540407 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540409 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540412 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540414 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540417 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540420 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540423 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540426 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540428 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540431 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540433 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540436 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540438 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540442 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540445 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540448 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540450 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540454 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:24:07.541332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540458 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540461 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540464 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540467 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540470 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540473 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540475 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540478 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540481 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540484 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540486 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540489 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540492 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540494 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540497 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540500 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540502 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540505 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540507 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540510 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:24:07.541841 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540512 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540515 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540517 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540520 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540523 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.540526 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540600 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540608 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540616 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540620 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540625 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540629 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540633 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540639 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540642 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540645 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540649 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540652 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540655 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540658 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540661 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540664 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540667 2569 flags.go:64] FLAG: --cloud-config=""
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540670 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 14:24:07.542346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540673 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540677 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540680 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540683 2569 flags.go:64] FLAG: --config-dir=""
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540686 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540690 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540694 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540698 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540701 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540704 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540707 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540710 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540714 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540718 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540721 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540726 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540730 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540733 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540736 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540739 2569 flags.go:64] FLAG: --enable-server="true"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540742 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540747 2569 flags.go:64] FLAG: --event-burst="100"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540750 2569 flags.go:64] FLAG: --event-qps="50"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540754 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540757 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 14:24:07.542965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540760 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540764 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540767 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540770 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540773 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540776 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540779 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540782 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540785 2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540788 2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540791 2569 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540794 2569 flags.go:64] FLAG: --feature-gates=""
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540798 2569 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540801 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540804 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540808 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540811 2569 flags.go:64] FLAG: --healthz-port="10248"
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540814 2569 flags.go:64] FLAG: --help="false"
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540817 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-129-231.ec2.internal"
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540821 2569 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540824 2569 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540827 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540832 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 24 14:24:07.543574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540836 2569 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540839 2569 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540842 2569 flags.go:64] FLAG: --image-service-endpoint=""
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540845 2569 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540848 2569 flags.go:64] FLAG: --kube-api-burst="100"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540851 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540855 2569 flags.go:64] FLAG: --kube-api-qps="50"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540857 2569 flags.go:64] FLAG: --kube-reserved=""
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540860 2569 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540863 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540866 2569 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540869 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540872 2569 flags.go:64] FLAG: --lock-file=""
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540875 2569 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540878 2569 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540881 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540886 2569 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540904 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540908 2569 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540911 2569 flags.go:64] FLAG: --logging-format="text"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540914 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540918 2569 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540921 2569 flags.go:64] FLAG: --manifest-url=""
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540924 2569 flags.go:64] FLAG: --manifest-url-header=""
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540928 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 24 14:24:07.544160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540931 2569 flags.go:64] FLAG: --max-open-files="1000000"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540936 2569 flags.go:64] FLAG: --max-pods="110"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540939 2569 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540942 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540945 2569 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540949 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540953 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540956 2569 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540960 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540968 2569 flags.go:64] FLAG: --node-status-max-images="50"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540971 2569 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540974 2569 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540977 2569 flags.go:64] FLAG: --pod-cidr=""
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540980 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540987 2569 flags.go:64] FLAG: --pod-manifest-path=""
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540990 2569 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540994 2569 flags.go:64] FLAG: --pods-per-core="0"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.540997 2569 flags.go:64] FLAG: --port="10250"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541000 2569 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541003 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04ef8f6be42ea302b"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541006 2569 flags.go:64] FLAG: --qos-reserved=""
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541010 2569 flags.go:64] FLAG: --read-only-port="10255"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541013 2569 flags.go:64] FLAG: --register-node="true"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541016 2569 flags.go:64] FLAG: --register-schedulable="true"
Apr 24 14:24:07.544762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541019 2569 flags.go:64] FLAG: --register-with-taints=""
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541022 2569 flags.go:64] FLAG: --registry-burst="10"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541026 2569 flags.go:64] FLAG: --registry-qps="5"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541029 2569 flags.go:64] FLAG: --reserved-cpus=""
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541031 2569 flags.go:64] FLAG: --reserved-memory=""
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541035 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541038 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541042 2569 flags.go:64] FLAG: --rotate-certificates="false"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541044 2569 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541047 2569 flags.go:64] FLAG: --runonce="false"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541050 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541053 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541056 2569 flags.go:64] FLAG: --seccomp-default="false"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541059 2569 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541063 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541066 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541070 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541074 2569 flags.go:64] FLAG: --storage-driver-password="root"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541077 2569 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541080 2569 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541083 2569 flags.go:64] FLAG: --storage-driver-user="root"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541086 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541089 2569 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541092 2569 flags.go:64] FLAG: --system-cgroups=""
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541095 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 24 14:24:07.545371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541101 2569 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541104 2569 flags.go:64] FLAG: --tls-cert-file=""
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541107 2569 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541111 2569 flags.go:64] FLAG: --tls-min-version=""
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541114 2569 flags.go:64] FLAG: --tls-private-key-file=""
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541117 2569 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541120 2569 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541123 2569 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541126 2569 flags.go:64] FLAG: --v="2"
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541131 2569 flags.go:64] FLAG: --version="false"
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541135 2569 flags.go:64] FLAG: --vmodule=""
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541140 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.541143 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541242 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541246 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541249 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541252 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541256 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541259 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541262 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541265 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541268 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541271 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:24:07.545973 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541274 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541277 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541280 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541283 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541285 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541288 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541291 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541293 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541296 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541298 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541301 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541304 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541306 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541309 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541312 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541314 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541317 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541319 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541322 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541324 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:24:07.546529 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541327 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541330 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541334 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541337 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541340 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541343 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541346 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541349 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541351 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541354 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541357 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541359 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541362 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541365 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541367 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541370 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541373 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541375 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541378 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541380 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:24:07.547119 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541383 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541386 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541388 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541391 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541394 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541397 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541399 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541403 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541407 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541410 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541413 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541415 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541419 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541423 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541427 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541430 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541432 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541435 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541438 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 14:24:07.547649 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541441 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 14:24:07.548215 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541444 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 14:24:07.548215 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541447 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 14:24:07.548215 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541450 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 14:24:07.548215 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541452 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 14:24:07.548215 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541455 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 14:24:07.548215 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541458 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 14:24:07.548215 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541460 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 14:24:07.548215 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541463 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 14:24:07.548215 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541466 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 14:24:07.548215 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:07.541469 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 14:24:07.548215 ip-10-0-129-231 kubenswrapper[2569]: 
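The warning stream above is the kubelet rejecting gate names it does not register locally (they appear to be OpenShift cluster-level gates), and the same list is re-emitted every time the configuration is re-parsed during startup. To condense a dump like this, counting warnings per gate is usually enough; a minimal Python sketch, assuming the journal excerpt is saved to a file (`kubelet.log` is a hypothetical name):

```python
import re
from collections import Counter

# Matches the klog warning lines shown above, e.g.
#   W0424 14:24:07.541288 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
GATE_RE = re.compile(r"feature_gate\.go:\d+\] unrecognized feature gate: (\S+)")

def summarize(path: str) -> None:
    counts = Counter()
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            # findall copes with lines that hold several fused entries
            for gate in GATE_RE.findall(line):
                counts[gate] += 1
    # Each gate appears once per parsing pass, so the per-gate count is
    # roughly the number of times the config was re-parsed.
    for gate, n in counts.most_common():
        print(f"{n}x {gate}")

if __name__ == "__main__":
    summarize("kubelet.log")  # hypothetical filename
```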
Apr 24 14:24:07.548774 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.542203 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:24:07.549691 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.549668 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 14:24:07.549691 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.549689 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
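The `feature_gate.go:384` line above is the effective gate set, printed as a Go map literal rather than JSON. If structured data is needed, the shape is regular enough for a regex; an illustrative parser under that assumption, not an official format guarantee:

```python
import re

# Parses "feature gates: {map[Name:bool Name:bool ...]}" into a dict.
def parse_gate_map(line: str) -> dict[str, bool]:
    body = re.search(r"feature gates: \{map\[(.*?)\]\}", line)
    if body is None:
        raise ValueError("no feature-gate map found in line")
    return {
        name: value == "true"
        for name, value in (pair.split(":") for pair in body.group(1).split())
    }

line = ('feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false '
        'ImageVolume:true KMSv1:true]}')
print(parse_gate_map(line))
# {'DynamicResourceAllocation': False, 'EventedPLEG': False, 'ImageVolume': True, 'KMSv1': True}
```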
Apr 24 14:24:07.554212 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.551127 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 14:24:07.554212 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.554161 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 14:24:07.555088 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.555076 2569 server.go:1019] "Starting client certificate rotation"
Apr 24 14:24:07.555205 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.555184 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 14:24:07.555251 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.555239 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 14:24:07.583010 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.582986 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 14:24:07.587846 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.587829 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 14:24:07.602218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.602200 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 24 14:24:07.607391 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.607375 2569 log.go:25] "Validated CRI v1 image API"
Apr 24 14:24:07.608964 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.608949 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 14:24:07.614563 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.614538 2569 fs.go:135] Filesystem UUIDs: map[51b2b3aa-621c-4dba-ad84-5010ad7f0879:/dev/nvme0n1p3 579e08d2-89fe-454f-a5a8-2ebbc39d1c00:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 24 14:24:07.614657 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.614558 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 14:24:07.614657 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.614621 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 14:24:07.622370 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.622245 2569 manager.go:217] Machine: {Timestamp:2026-04-24 14:24:07.618944127 +0000 UTC m=+0.413001019 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101371 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec25e7141e8be072c2a5d328228c8787 SystemUUID:ec25e714-1e8b-e072-c2a5-d328228c8787 BootID:4119a81c-1f6d-453f-af4d-69d3e03a14ec Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:0c:c7:92:c8:df Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:0c:c7:92:c8:df Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3a:dc:f6:bd:29:30 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 14:24:07.622370 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.622358 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 14:24:07.622519 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.622462 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 14:24:07.623604 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.623579 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 14:24:07.623740 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.623606 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-231.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 14:24:07.623786 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.623750 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 14:24:07.623786 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.623759 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 14:24:07.623786 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.623772 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 14:24:07.624496 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.624485 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 14:24:07.625743 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.625732 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 14:24:07.625853 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.625844 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 14:24:07.628201 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.628192 2569 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 14:24:07.628239 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.628206 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 14:24:07.628239 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.628219 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 14:24:07.628239 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.628228 2569 kubelet.go:397] "Adding apiserver pod source"
Apr 24 14:24:07.628239 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.628238 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 14:24:07.629280 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.629268 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 14:24:07.629332 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.629286 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 14:24:07.631963 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.631944 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 14:24:07.632955 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.632939 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x68jh"
Apr 24 14:24:07.633276 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.633264 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 14:24:07.635023 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.635008 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 14:24:07.635137 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.635032 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 14:24:07.635137 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.635046 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 14:24:07.635137 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.635054 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 14:24:07.635137 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.635067 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 14:24:07.635137 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.635073 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 14:24:07.635137 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.635079 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 14:24:07.635137 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.635085 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 24 14:24:07.635137 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.635101 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 24 14:24:07.635137 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.635110 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 24 14:24:07.635137 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.635135 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 24 14:24:07.635390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.635144 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 24 14:24:07.636036 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.636027 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 24 14:24:07.636036 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.636037 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 24 14:24:07.639862 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.639847 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 24 14:24:07.639960 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.639909 2569 server.go:1295] "Started kubelet"
Apr 24 14:24:07.640016 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.639906 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-231.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 14:24:07.640065 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:07.640016 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 14:24:07.640065 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:07.640023 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-231.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 14:24:07.640065 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.640012 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 14:24:07.640171 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.640052 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 14:24:07.640171 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.640118 2569 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 14:24:07.640794 ip-10-0-129-231 systemd[1]: Started Kubernetes Kubelet.
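The `container_manager_linux.go:275` nodeConfig above, together with `MemoryCapacity` from the Machine line, fixes the inputs of the kubelet's node-allocatable computation: 1Gi of `SystemReserved` memory, no `KubeReserved`, and a 100Mi hard eviction threshold for `memory.available`. Plugging those values into the standard formula (allocatable = capacity - kube-reserved - system-reserved - hard eviction threshold) as a quick check:

```python
GIB = 1024 ** 3
MIB = 1024 ** 2

capacity        = 33_164_488_704  # MemoryCapacity from the manager.go:217 Machine line
system_reserved = 1 * GIB         # "memory":"1Gi" in SystemReserved above
kube_reserved   = 0               # KubeReserved is null above
eviction_hard   = 100 * MIB       # memory.available hard threshold "100Mi"

allocatable = capacity - kube_reserved - system_reserved - eviction_hard
print(f"{allocatable} bytes (~{allocatable / GIB:.2f} GiB)")
# 31985889280 bytes (~29.79 GiB)
```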
Apr 24 14:24:07.640949 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.640884 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x68jh"
Apr 24 14:24:07.641297 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.641277 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 14:24:07.642738 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.642722 2569 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 14:24:07.646416 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.646399 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 14:24:07.646913 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.646882 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 14:24:07.647556 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.647498 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 14:24:07.647707 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.647695 2569 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 14:24:07.647797 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.647781 2569 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 14:24:07.647931 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.647729 2569 factory.go:55] Registering systemd factory
Apr 24 14:24:07.647997 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.647932 2569 factory.go:223] Registration of the systemd container factory successfully
Apr 24 14:24:07.647997 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.647942 2569 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 14:24:07.647997 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.647949 2569 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 14:24:07.647997 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:07.647968 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-231.ec2.internal\" not found"
Apr 24 14:24:07.648180 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.648171 2569 factory.go:153] Registering CRI-O factory
Apr 24 14:24:07.648233 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.648185 2569 factory.go:223] Registration of the crio container factory successfully
Apr 24 14:24:07.648280 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.648255 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 14:24:07.648280 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.648276 2569 factory.go:103] Registering Raw factory
Apr 24 14:24:07.648376 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.648288 2569 manager.go:1196] Started watching for new ooms in manager
Apr 24 14:24:07.648847 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.648829 2569 manager.go:319] Starting recovery of all containers
Apr 24 14:24:07.650017 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:07.649991 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 14:24:07.650178 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.650163 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:24:07.653179 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:07.653151 2569 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-231.ec2.internal\" not found" node="ip-10-0-129-231.ec2.internal"
Apr 24 14:24:07.657386 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.657329 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 14:24:07.660579 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.660417 2569 manager.go:324] Recovery completed
Apr 24 14:24:07.665477 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.665463 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 14:24:07.668131 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.668113 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 14:24:07.668200 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.668148 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 14:24:07.668200 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.668161 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasSufficientPID"
Apr 24 14:24:07.668626 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.668613 2569 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 14:24:07.668668 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.668628 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 14:24:07.668668 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.668648 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 14:24:07.671819 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.671805 2569 policy_none.go:49] "None policy: Start"
Apr 24 14:24:07.671884 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.671823 2569 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 14:24:07.671884 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.671834 2569 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 14:24:07.714571 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.714547 2569 manager.go:341] "Starting Device Plugin manager"
Apr 24 14:24:07.734319 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:07.714589 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 14:24:07.734319 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.714602 2569 server.go:85] "Starting device plugin registration server"
Apr 24 14:24:07.734319 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.714918 2569 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 14:24:07.734319 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.714931 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 14:24:07.734319 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.715007 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 14:24:07.734319 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.715103 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 14:24:07.734319 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.715112 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 14:24:07.734319 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:07.715692 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 14:24:07.734319 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:07.715737 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-231.ec2.internal\" not found"
Apr 24 14:24:07.815886 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.815794 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 14:24:07.817363 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.817344 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 14:24:07.817481 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.817374 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 14:24:07.817481 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.817400 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasSufficientPID"
Apr 24 14:24:07.817481 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.817428 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-231.ec2.internal"
Apr 24 14:24:07.822858 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.822837 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 14:24:07.822951 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.822873 2569 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 14:24:07.822951 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.822903 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 14:24:07.822951 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.822910 2569 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 14:24:07.822951 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:07.822947 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 14:24:07.825483 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.825457 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:24:07.826250 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.826236 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-231.ec2.internal"
Apr 24 14:24:07.826308 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:07.826257 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-231.ec2.internal\": node \"ip-10-0-129-231.ec2.internal\" not found"
Apr 24 14:24:07.860504 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:07.860476 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-231.ec2.internal\" not found"
Apr 24 14:24:07.923647 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.923599 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-231.ec2.internal"]
Apr 24 14:24:07.923760 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.923695 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 14:24:07.925548 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.925531 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 14:24:07.925636 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.925566 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 14:24:07.925636 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.925578 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasSufficientPID"
Apr 24 14:24:07.927908 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.927879 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 14:24:07.928057 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.928042 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal"
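The sequence above ("Attempting to register node" at 14:24:07.817428, "Successfully registered node" at 14:24:07.826236, while reads of the same node object still fail with "not found") can be reproduced from outside the kubelet with a polling check. A minimal client-go sketch, not from the log; the kubeconfig path is an assumed placeholder, and the node name is taken from the entries above:

```go
// Sketch: poll the API server until the node the kubelet just registered is visible.
package main

import (
	"context"
	"fmt"
	"time"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed path for illustration; any kubeconfig with read access works.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubeconfig")
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	const nodeName = "ip-10-0-129-231.ec2.internal"
	for {
		_, err := client.CoreV1().Nodes().Get(context.TODO(), nodeName, metav1.GetOptions{})
		if err == nil {
			fmt.Println("node registered:", nodeName)
			return
		}
		if !apierrors.IsNotFound(err) {
			panic(err) // unexpected error, not the "not found" retry case
		}
		time.Sleep(time.Second) // same spirit as the kubelet's retry loop
	}
}
```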
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal" Apr 24 14:24:07.928103 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.928072 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:07.928802 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.928784 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:07.928881 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.928815 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:07.928881 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.928826 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:07.928881 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.928784 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:07.929040 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.928923 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:07.929040 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.928950 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:07.931042 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.931028 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-231.ec2.internal" Apr 24 14:24:07.931121 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.931051 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:24:07.931840 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.931827 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:24:07.931910 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.931853 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:24:07.931910 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.931863 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:24:07.948511 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:07.948486 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-231.ec2.internal\" not found" node="ip-10-0-129-231.ec2.internal" Apr 24 14:24:07.949204 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.949184 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b8b57e2ee8dbdeeec77b109df4b31692-config\") pod \"kube-apiserver-proxy-ip-10-0-129-231.ec2.internal\" (UID: \"b8b57e2ee8dbdeeec77b109df4b31692\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-231.ec2.internal" Apr 24 14:24:07.949257 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.949215 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/43b3b938de58a8a753e7808e14b92798-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal\" (UID: \"43b3b938de58a8a753e7808e14b92798\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal" Apr 24 14:24:07.949257 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:07.949234 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43b3b938de58a8a753e7808e14b92798-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal\" (UID: \"43b3b938de58a8a753e7808e14b92798\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal" Apr 24 14:24:07.953059 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:07.953042 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-231.ec2.internal\" not found" node="ip-10-0-129-231.ec2.internal" Apr 24 14:24:07.960699 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:07.960680 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-231.ec2.internal\" not found" Apr 24 14:24:08.049817 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.049781 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/43b3b938de58a8a753e7808e14b92798-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal\" (UID: \"43b3b938de58a8a753e7808e14b92798\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal" Apr 24 14:24:08.049817 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.049811 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43b3b938de58a8a753e7808e14b92798-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal\" (UID: \"43b3b938de58a8a753e7808e14b92798\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal" Apr 24 14:24:08.049817 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.049828 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b8b57e2ee8dbdeeec77b109df4b31692-config\") pod \"kube-apiserver-proxy-ip-10-0-129-231.ec2.internal\" (UID: \"b8b57e2ee8dbdeeec77b109df4b31692\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-231.ec2.internal" Apr 24 14:24:08.050078 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.049860 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43b3b938de58a8a753e7808e14b92798-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal\" (UID: \"43b3b938de58a8a753e7808e14b92798\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal" Apr 24 14:24:08.050078 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.049869 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/43b3b938de58a8a753e7808e14b92798-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal\" (UID: \"43b3b938de58a8a753e7808e14b92798\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal" Apr 24 14:24:08.050078 
Apr 24 14:24:08.060904 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:08.060868 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-231.ec2.internal\" not found"
Apr 24 14:24:08.161803 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:08.161736 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-231.ec2.internal\" not found"
Apr 24 14:24:08.251131 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.251097 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal"
Apr 24 14:24:08.255624 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.255604 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-231.ec2.internal"
Apr 24 14:24:08.262211 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:08.262187 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-231.ec2.internal\" not found"
Apr 24 14:24:08.362804 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:08.362762 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-231.ec2.internal\" not found"
Apr 24 14:24:08.462245 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.462173 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:24:08.547396 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.547358 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-231.ec2.internal"
Apr 24 14:24:08.555750 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.555728 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 14:24:08.555923 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.555882 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 14:24:08.555982 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.555915 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 14:24:08.555982 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.555937 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 14:24:08.555982 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:08.555930 2569 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a7f794fcb8b2c4ec3b77ccc5e7557ff1-2ccbaf4af5715868.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/kube-system/pods\": read tcp 10.0.129.231:53250->44.220.99.46:6443: use of closed network connection" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-231.ec2.internal"
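The run of "Error getting the current node from lister" entries ends right after the *v1.Node "Caches populated" record at 14:24:08.462173. That is standard client-go informer behavior: a lister answers only from its informer's local cache, so Get returns NotFound until the initial List/Watch sync completes. A sketch of that pattern, under the same assumed client setup as the previous example (not from the log):

```go
// Sketch: lister reads fail until the informer cache has synced, which is
// exactly the "not found" -> "Caches populated" transition in the log.
package main

import (
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubeconfig") // assumed path
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	stopCh := make(chan struct{})
	defer close(stopCh)

	factory := informers.NewSharedInformerFactory(client, 30*time.Second)
	nodeInformer := factory.Core().V1().Nodes()
	nodeLister := nodeInformer.Lister()

	factory.Start(stopCh)
	// Until this returns, nodeLister.Get(...) fails with NotFound -- the
	// "Error getting the current node from lister" phase above.
	if !cache.WaitForCacheSync(stopCh, nodeInformer.Informer().HasSynced) {
		panic("cache sync failed")
	}

	node, err := nodeLister.Get("ip-10-0-129-231.ec2.internal")
	if err != nil {
		panic(err)
	}
	fmt.Println("lister now serves from cache:", node.Name)
}
```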
err="Post \"https://a7f794fcb8b2c4ec3b77ccc5e7557ff1-2ccbaf4af5715868.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/kube-system/pods\": read tcp 10.0.129.231:53250->44.220.99.46:6443: use of closed network connection" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-231.ec2.internal" Apr 24 14:24:08.555982 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.555967 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal" Apr 24 14:24:08.572155 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.572132 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 14:24:08.628435 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.628406 2569 apiserver.go:52] "Watching apiserver" Apr 24 14:24:08.639341 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.639308 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 14:24:08.640380 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.640355 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal","openshift-multus/multus-additional-cni-plugins-2bsbc","openshift-multus/network-metrics-daemon-d47r2","openshift-network-operator/iptables-alerter-8grsv","openshift-ovn-kubernetes/ovnkube-node-mjpgb","kube-system/konnectivity-agent-k8phw","kube-system/kube-apiserver-proxy-ip-10-0-129-231.ec2.internal","openshift-cluster-node-tuning-operator/tuned-7fkmg","openshift-multus/multus-4cxgt","openshift-network-diagnostics/network-check-target-tzvkp","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz","openshift-dns/node-resolver-2ngpp","openshift-image-registry/node-ca-5f7nk"] Apr 24 14:24:08.642695 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.642666 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 14:19:07 +0000 UTC" deadline="2027-11-08 14:18:43.841340644 +0000 UTC" Apr 24 14:24:08.642695 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.642692 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13511h54m35.198650877s" Apr 24 14:24:08.643467 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.643440 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.645450 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.645421 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 14:24:08.645559 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.645473 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 14:24:08.645559 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.645492 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-59jzc\"" Apr 24 14:24:08.645700 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.645556 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 14:24:08.646490 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.646472 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 14:24:08.647843 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.647823 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.649583 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.649565 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 14:24:08.649683 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.649599 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 14:24:08.649771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.649700 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 14:24:08.649771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.649711 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kmxbf\"" Apr 24 14:24:08.649771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.649720 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 14:24:08.649906 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.649823 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 14:24:08.650117 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.650101 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:08.650181 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:08.650164 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:08.650215 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.650197 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-8grsv" Apr 24 14:24:08.651826 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.651804 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:24:08.651826 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.651827 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-862gl\"" Apr 24 14:24:08.651990 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.651840 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 14:24:08.651990 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.651873 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 14:24:08.652450 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.652432 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.652790 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.652673 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/663fea9a-c74c-4a2d-8d62-31e14a29d43a-os-release\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.652790 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.652717 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/663fea9a-c74c-4a2d-8d62-31e14a29d43a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.652790 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.652747 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87jwg\" (UniqueName: \"kubernetes.io/projected/663fea9a-c74c-4a2d-8d62-31e14a29d43a-kube-api-access-87jwg\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.652790 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.652776 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r8n2\" (UniqueName: \"kubernetes.io/projected/496c729f-9eee-4311-8fe8-4502d4af37f8-kube-api-access-5r8n2\") pod \"network-metrics-daemon-d47r2\" (UID: \"496c729f-9eee-4311-8fe8-4502d4af37f8\") " pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:08.653013 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.652803 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zwr5\" (UniqueName: \"kubernetes.io/projected/853085d1-3eec-4e3a-a1e6-999af329c8d0-kube-api-access-2zwr5\") pod \"iptables-alerter-8grsv\" (UID: \"853085d1-3eec-4e3a-a1e6-999af329c8d0\") " pod="openshift-network-operator/iptables-alerter-8grsv" Apr 24 14:24:08.653013 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.652827 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-sys-fs\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.653013 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.652849 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/663fea9a-c74c-4a2d-8d62-31e14a29d43a-cnibin\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.653013 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.652901 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/663fea9a-c74c-4a2d-8d62-31e14a29d43a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.653013 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.652933 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs\") pod \"network-metrics-daemon-d47r2\" (UID: \"496c729f-9eee-4311-8fe8-4502d4af37f8\") " pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:08.653013 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.652955 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/663fea9a-c74c-4a2d-8d62-31e14a29d43a-system-cni-dir\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.653013 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.652978 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/663fea9a-c74c-4a2d-8d62-31e14a29d43a-cni-binary-copy\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.653013 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.652995 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/853085d1-3eec-4e3a-a1e6-999af329c8d0-host-slash\") pod \"iptables-alerter-8grsv\" (UID: \"853085d1-3eec-4e3a-a1e6-999af329c8d0\") " pod="openshift-network-operator/iptables-alerter-8grsv" Apr 24 14:24:08.653013 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.653013 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.653362 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.653038 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/663fea9a-c74c-4a2d-8d62-31e14a29d43a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.653362 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.653083 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/853085d1-3eec-4e3a-a1e6-999af329c8d0-iptables-alerter-script\") pod \"iptables-alerter-8grsv\" (UID: \"853085d1-3eec-4e3a-a1e6-999af329c8d0\") " pod="openshift-network-operator/iptables-alerter-8grsv" Apr 24 14:24:08.653362 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.653114 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-socket-dir\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.653362 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.653135 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-registration-dir\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.653362 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.653150 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-device-dir\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.653362 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.653168 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-etc-selinux\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.653362 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.653192 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn6v4\" (UniqueName: \"kubernetes.io/projected/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-kube-api-access-zn6v4\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.653964 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.653944 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 14:24:08.654259 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.654241 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 14:24:08.654463 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.654448 2569 
Apr 24 14:24:08.654600 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.654583 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 14:24:08.654800 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.654781 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 14:24:08.654885 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.654782 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hb9sx\""
Apr 24 14:24:08.654995 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.654978 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 14:24:08.656545 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.656529 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 14:24:08.657107 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.657085 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-k8phw"
Apr 24 14:24:08.657210 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.657192 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5f7nk"
Apr 24 14:24:08.659082 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.659057 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 14:24:08.659195 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.659117 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 14:24:08.659195 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.659166 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fbpnt\""
Apr 24 14:24:08.659425 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.659316 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-96gr7\""
Apr 24 14:24:08.659425 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.659380 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 14:24:08.659595 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.659579 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 14:24:08.659679 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.659660 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 14:24:08.659756 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.659664 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7fkmg"
Apr 24 14:24:08.661788 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.661772 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4cxgt"
Need to start a new one" pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.662998 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.662791 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:24:08.663122 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.663015 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-hzp8q\"" Apr 24 14:24:08.663375 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.663356 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 14:24:08.664398 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.664196 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mg2j4\"" Apr 24 14:24:08.664398 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.664342 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 14:24:08.666214 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.666196 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2ngpp" Apr 24 14:24:08.667765 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.667750 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-pmznq\"" Apr 24 14:24:08.667982 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.667957 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 14:24:08.667982 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.667979 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 14:24:08.668412 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.668398 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:08.668466 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:08.668447 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:08.686231 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.686201 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-rl27s" Apr 24 14:24:08.692294 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.692274 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rl27s" Apr 24 14:24:08.749315 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.749288 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 14:24:08.753719 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.753697 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-slash\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.753803 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.753724 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-node-log\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.753803 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.753742 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-tuned\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.753803 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.753764 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47ccc73c-0a43-4642-8362-1fa6a8574f23-cni-binary-copy\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.753924 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.753806 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-sys\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.753924 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.753856 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-socket-dir\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.753924 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.753888 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-device-dir\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.754072 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.753944 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-device-dir\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.754072 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.753955 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zn6v4\" (UniqueName: \"kubernetes.io/projected/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-kube-api-access-zn6v4\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.754072 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.753999 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-run-netns\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.754072 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754022 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be4da945-f6d1-4406-adea-f3ccedab88f6-env-overrides\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.754072 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754025 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-socket-dir\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.754072 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754038 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-sysconfig\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.754338 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754073 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-sys-fs\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.754338 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754100 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-log-socket\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.754338 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754126 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-run-ovn-kubernetes\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.754338 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754153 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-cni-bin\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.754338 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754172 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-sys-fs\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.754338 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754176 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-cni-netd\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.754338 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754218 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5a0260d3-f8e0-4a3d-b95a-547feee30046-serviceca\") pod \"node-ca-5f7nk\" (UID: \"5a0260d3-f8e0-4a3d-b95a-547feee30046\") " pod="openshift-image-registry/node-ca-5f7nk" Apr 24 14:24:08.754338 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754246 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk96f\" (UniqueName: \"kubernetes.io/projected/5a0260d3-f8e0-4a3d-b95a-547feee30046-kube-api-access-nk96f\") pod \"node-ca-5f7nk\" (UID: \"5a0260d3-f8e0-4a3d-b95a-547feee30046\") " pod="openshift-image-registry/node-ca-5f7nk" Apr 24 14:24:08.754338 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754283 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-kubernetes\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.754338 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754318 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/663fea9a-c74c-4a2d-8d62-31e14a29d43a-cnibin\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.754338 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754338 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-run-systemd\") pod \"ovnkube-node-mjpgb\" 
(UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.754789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754366 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/663fea9a-c74c-4a2d-8d62-31e14a29d43a-cnibin\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.754789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754385 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c5b32595-8f39-477c-8d42-8cc919341875-agent-certs\") pod \"konnectivity-agent-k8phw\" (UID: \"c5b32595-8f39-477c-8d42-8cc919341875\") " pod="kube-system/konnectivity-agent-k8phw" Apr 24 14:24:08.754789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754411 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-run-multus-certs\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.754789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754433 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-455n6\" (UniqueName: \"kubernetes.io/projected/47ccc73c-0a43-4642-8362-1fa6a8574f23-kube-api-access-455n6\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.754789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754452 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/853085d1-3eec-4e3a-a1e6-999af329c8d0-host-slash\") pod \"iptables-alerter-8grsv\" (UID: \"853085d1-3eec-4e3a-a1e6-999af329c8d0\") " pod="openshift-network-operator/iptables-alerter-8grsv" Apr 24 14:24:08.754789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754474 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-var-lib-openvswitch\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.754789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754505 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/853085d1-3eec-4e3a-a1e6-999af329c8d0-host-slash\") pod \"iptables-alerter-8grsv\" (UID: \"853085d1-3eec-4e3a-a1e6-999af329c8d0\") " pod="openshift-network-operator/iptables-alerter-8grsv" Apr 24 14:24:08.754789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754558 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-run-ovn\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.754789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754584 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
Apr 24 14:24:08.754789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754607 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-system-cni-dir\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt"
Apr 24 14:24:08.754789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754630 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-var-lib-cni-bin\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt"
Apr 24 14:24:08.754789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754669 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/47ccc73c-0a43-4642-8362-1fa6a8574f23-multus-daemon-config\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt"
Apr 24 14:24:08.754789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754693 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-etc-kubernetes\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt"
Apr 24 14:24:08.754789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754715 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be4da945-f6d1-4406-adea-f3ccedab88f6-ovnkube-script-lib\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb"
Apr 24 14:24:08.754789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754739 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e741aaa3-8cb4-407a-835a-a552c5b77737-tmp-dir\") pod \"node-resolver-2ngpp\" (UID: \"e741aaa3-8cb4-407a-835a-a552c5b77737\") " pod="openshift-dns/node-resolver-2ngpp"
Apr 24 14:24:08.754789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754763 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxkqx\" (UniqueName: \"kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx\") pod \"network-check-target-tzvkp\" (UID: \"66892658-4db6-4064-b52f-60baa00dcc6d\") " pod="openshift-network-diagnostics/network-check-target-tzvkp"
Apr 24 14:24:08.754789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754792 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/663fea9a-c74c-4a2d-8d62-31e14a29d43a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc"
pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.755419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754824 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r8n2\" (UniqueName: \"kubernetes.io/projected/496c729f-9eee-4311-8fe8-4502d4af37f8-kube-api-access-5r8n2\") pod \"network-metrics-daemon-d47r2\" (UID: \"496c729f-9eee-4311-8fe8-4502d4af37f8\") " pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:08.755419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754851 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zwr5\" (UniqueName: \"kubernetes.io/projected/853085d1-3eec-4e3a-a1e6-999af329c8d0-kube-api-access-2zwr5\") pod \"iptables-alerter-8grsv\" (UID: \"853085d1-3eec-4e3a-a1e6-999af329c8d0\") " pod="openshift-network-operator/iptables-alerter-8grsv" Apr 24 14:24:08.755419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754877 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-modprobe-d\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.755419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754918 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-systemd\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.755419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754957 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-os-release\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.755419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.754990 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-lib-modules\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.755419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755028 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/663fea9a-c74c-4a2d-8d62-31e14a29d43a-system-cni-dir\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.755419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755055 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-cnibin\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.755419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755081 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-run-k8s-cni-cncf-io\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.755419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755082 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/663fea9a-c74c-4a2d-8d62-31e14a29d43a-system-cni-dir\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.755419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755124 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-run-netns\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.755419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755170 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg5lh\" (UniqueName: \"kubernetes.io/projected/e741aaa3-8cb4-407a-835a-a552c5b77737-kube-api-access-lg5lh\") pod \"node-resolver-2ngpp\" (UID: \"e741aaa3-8cb4-407a-835a-a552c5b77737\") " pod="openshift-dns/node-resolver-2ngpp" Apr 24 14:24:08.755419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755200 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/663fea9a-c74c-4a2d-8d62-31e14a29d43a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.755419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755217 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-registration-dir\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.755419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755239 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-kubelet\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.755419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755264 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a0260d3-f8e0-4a3d-b95a-547feee30046-host\") pod \"node-ca-5f7nk\" (UID: \"5a0260d3-f8e0-4a3d-b95a-547feee30046\") " pod="openshift-image-registry/node-ca-5f7nk" Apr 24 14:24:08.755419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755274 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-registration-dir\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.755928 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755297 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/663fea9a-c74c-4a2d-8d62-31e14a29d43a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.755928 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755289 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-sysctl-d\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.755928 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755320 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-sysctl-conf\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.755928 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755300 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/663fea9a-c74c-4a2d-8d62-31e14a29d43a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.755928 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755336 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/663fea9a-c74c-4a2d-8d62-31e14a29d43a-os-release\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.755928 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755366 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87jwg\" (UniqueName: \"kubernetes.io/projected/663fea9a-c74c-4a2d-8d62-31e14a29d43a-kube-api-access-87jwg\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.755928 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755393 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-var-lib-kubelet\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.755928 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755410 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/663fea9a-c74c-4a2d-8d62-31e14a29d43a-os-release\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.755928 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755419 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs\") pod \"network-metrics-daemon-d47r2\" (UID: \"496c729f-9eee-4311-8fe8-4502d4af37f8\") " pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:08.755928 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755446 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be4da945-f6d1-4406-adea-f3ccedab88f6-ovnkube-config\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.755928 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755542 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be4da945-f6d1-4406-adea-f3ccedab88f6-ovn-node-metrics-cert\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.755928 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755578 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-multus-cni-dir\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.755928 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:08.755583 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:08.755928 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755602 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.755928 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:08.755695 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs podName:496c729f-9eee-4311-8fe8-4502d4af37f8 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:09.255655039 +0000 UTC m=+2.049711926 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs") pod "network-metrics-daemon-d47r2" (UID: "496c729f-9eee-4311-8fe8-4502d4af37f8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:08.755928 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755712 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.756419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755746 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kl5d\" (UniqueName: \"kubernetes.io/projected/be4da945-f6d1-4406-adea-f3ccedab88f6-kube-api-access-9kl5d\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.756419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755783 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-var-lib-cni-multus\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.756419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755808 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/853085d1-3eec-4e3a-a1e6-999af329c8d0-iptables-alerter-script\") pod \"iptables-alerter-8grsv\" (UID: \"853085d1-3eec-4e3a-a1e6-999af329c8d0\") " pod="openshift-network-operator/iptables-alerter-8grsv" Apr 24 14:24:08.756419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755841 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-etc-selinux\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.756419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755923 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-etc-openvswitch\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.756419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755933 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-etc-selinux\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.756419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755950 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-run-openvswitch\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.756419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755972 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-hostroot\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.756419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.755993 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-systemd-units\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.756419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.756008 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.756419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.756041 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c5b32595-8f39-477c-8d42-8cc919341875-konnectivity-ca\") pod \"konnectivity-agent-k8phw\" (UID: \"c5b32595-8f39-477c-8d42-8cc919341875\") " pod="kube-system/konnectivity-agent-k8phw" Apr 24 14:24:08.756419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.756056 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-run\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.756419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.756069 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-host\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.756419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.756083 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwxnc\" (UniqueName: \"kubernetes.io/projected/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-kube-api-access-jwxnc\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.756419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.756121 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-multus-conf-dir\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " 
pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.756419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.756149 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/663fea9a-c74c-4a2d-8d62-31e14a29d43a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.756853 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.756158 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/853085d1-3eec-4e3a-a1e6-999af329c8d0-iptables-alerter-script\") pod \"iptables-alerter-8grsv\" (UID: \"853085d1-3eec-4e3a-a1e6-999af329c8d0\") " pod="openshift-network-operator/iptables-alerter-8grsv" Apr 24 14:24:08.756853 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.756176 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-multus-socket-dir-parent\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.756853 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.756201 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-var-lib-kubelet\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.756853 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.756225 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e741aaa3-8cb4-407a-835a-a552c5b77737-hosts-file\") pod \"node-resolver-2ngpp\" (UID: \"e741aaa3-8cb4-407a-835a-a552c5b77737\") " pod="openshift-dns/node-resolver-2ngpp" Apr 24 14:24:08.756853 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.756255 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/663fea9a-c74c-4a2d-8d62-31e14a29d43a-cni-binary-copy\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.756853 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.756637 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/663fea9a-c74c-4a2d-8d62-31e14a29d43a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.756853 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.756687 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/663fea9a-c74c-4a2d-8d62-31e14a29d43a-cni-binary-copy\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.761844 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.761825 2569 
swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 14:24:08.765550 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.765523 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn6v4\" (UniqueName: \"kubernetes.io/projected/dc88fdf6-9eaa-4f26-9344-140dd6d94ef5-kube-api-access-zn6v4\") pod \"aws-ebs-csi-driver-node-7ptdz\" (UID: \"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.765741 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.765525 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87jwg\" (UniqueName: \"kubernetes.io/projected/663fea9a-c74c-4a2d-8d62-31e14a29d43a-kube-api-access-87jwg\") pod \"multus-additional-cni-plugins-2bsbc\" (UID: \"663fea9a-c74c-4a2d-8d62-31e14a29d43a\") " pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.766667 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.766642 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r8n2\" (UniqueName: \"kubernetes.io/projected/496c729f-9eee-4311-8fe8-4502d4af37f8-kube-api-access-5r8n2\") pod \"network-metrics-daemon-d47r2\" (UID: \"496c729f-9eee-4311-8fe8-4502d4af37f8\") " pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:08.768834 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.768804 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zwr5\" (UniqueName: \"kubernetes.io/projected/853085d1-3eec-4e3a-a1e6-999af329c8d0-kube-api-access-2zwr5\") pod \"iptables-alerter-8grsv\" (UID: \"853085d1-3eec-4e3a-a1e6-999af329c8d0\") " pod="openshift-network-operator/iptables-alerter-8grsv" Apr 24 14:24:08.770317 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.770288 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:24:08.825880 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.825834 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal" event={"ID":"43b3b938de58a8a753e7808e14b92798","Type":"ContainerStarted","Data":"8cffe975d840676f503d88bff2802d2dfe945c7e13817b298b4bdc2166b4b77d"} Apr 24 14:24:08.826947 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.826918 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-231.ec2.internal" event={"ID":"b8b57e2ee8dbdeeec77b109df4b31692","Type":"ContainerStarted","Data":"eaf7828c8e06d5d3bd6c9547886e6f8a2d7602a9dabaeffe6986525859341550"} Apr 24 14:24:08.856804 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.856771 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e741aaa3-8cb4-407a-835a-a552c5b77737-hosts-file\") pod \"node-resolver-2ngpp\" (UID: \"e741aaa3-8cb4-407a-835a-a552c5b77737\") " pod="openshift-dns/node-resolver-2ngpp" Apr 24 14:24:08.856932 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.856815 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-slash\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.856932 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.856831 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-node-log\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.856932 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.856854 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-tuned\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.856932 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.856876 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47ccc73c-0a43-4642-8362-1fa6a8574f23-cni-binary-copy\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.856932 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.856911 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e741aaa3-8cb4-407a-835a-a552c5b77737-hosts-file\") pod \"node-resolver-2ngpp\" (UID: \"e741aaa3-8cb4-407a-835a-a552c5b77737\") " pod="openshift-dns/node-resolver-2ngpp" Apr 24 14:24:08.856932 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.856921 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-slash\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.857130 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.856919 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-sys\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.857130 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.856939 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-node-log\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.857130 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.856969 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-sys\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.857130 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.856974 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-run-netns\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.857130 
ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.856999 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-run-netns\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.857130 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857002 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be4da945-f6d1-4406-adea-f3ccedab88f6-env-overrides\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.857130 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857042 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-sysconfig\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.857130 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857065 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-log-socket\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.857130 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857107 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-log-socket\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.857480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857141 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-sysconfig\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.857480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857174 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-run-ovn-kubernetes\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.857480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857200 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-cni-bin\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.857480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857225 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-cni-netd\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.857480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857255 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5a0260d3-f8e0-4a3d-b95a-547feee30046-serviceca\") pod \"node-ca-5f7nk\" (UID: \"5a0260d3-f8e0-4a3d-b95a-547feee30046\") " pod="openshift-image-registry/node-ca-5f7nk" Apr 24 14:24:08.857480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857256 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-run-ovn-kubernetes\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.857480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857263 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-cni-bin\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.857480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857281 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk96f\" (UniqueName: \"kubernetes.io/projected/5a0260d3-f8e0-4a3d-b95a-547feee30046-kube-api-access-nk96f\") pod \"node-ca-5f7nk\" (UID: \"5a0260d3-f8e0-4a3d-b95a-547feee30046\") " pod="openshift-image-registry/node-ca-5f7nk" Apr 24 14:24:08.857480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857305 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-kubernetes\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.857480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857309 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-cni-netd\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.857480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857367 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-run-systemd\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.857480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857388 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c5b32595-8f39-477c-8d42-8cc919341875-agent-certs\") pod \"konnectivity-agent-k8phw\" (UID: \"c5b32595-8f39-477c-8d42-8cc919341875\") " pod="kube-system/konnectivity-agent-k8phw" Apr 24 14:24:08.857480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857411 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-run-multus-certs\") pod \"multus-4cxgt\" (UID: 
\"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.857480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857438 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-run-systemd\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.857480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857453 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-kubernetes\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.857480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857459 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-455n6\" (UniqueName: \"kubernetes.io/projected/47ccc73c-0a43-4642-8362-1fa6a8574f23-kube-api-access-455n6\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857505 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-var-lib-openvswitch\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857522 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-run-multus-certs\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857534 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-run-ovn\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857549 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be4da945-f6d1-4406-adea-f3ccedab88f6-env-overrides\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857559 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-tmp\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857593 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-system-cni-dir\") pod \"multus-4cxgt\" (UID: 
\"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857618 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-var-lib-cni-bin\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857627 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-var-lib-openvswitch\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857642 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/47ccc73c-0a43-4642-8362-1fa6a8574f23-multus-daemon-config\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857669 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-etc-kubernetes\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857680 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-run-ovn\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857695 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be4da945-f6d1-4406-adea-f3ccedab88f6-ovnkube-script-lib\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857708 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-var-lib-cni-bin\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857719 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e741aaa3-8cb4-407a-835a-a552c5b77737-tmp-dir\") pod \"node-resolver-2ngpp\" (UID: \"e741aaa3-8cb4-407a-835a-a552c5b77737\") " pod="openshift-dns/node-resolver-2ngpp" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857746 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxkqx\" (UniqueName: \"kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx\") pod \"network-check-target-tzvkp\" (UID: 
\"66892658-4db6-4064-b52f-60baa00dcc6d\") " pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857754 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-etc-kubernetes\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857777 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-modprobe-d\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.858218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857803 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-systemd\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857750 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-system-cni-dir\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857828 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-os-release\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857858 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-lib-modules\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857889 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-cnibin\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857935 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-run-k8s-cni-cncf-io\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857957 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-run-netns\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " 
pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.857981 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lg5lh\" (UniqueName: \"kubernetes.io/projected/e741aaa3-8cb4-407a-835a-a552c5b77737-kube-api-access-lg5lh\") pod \"node-resolver-2ngpp\" (UID: \"e741aaa3-8cb4-407a-835a-a552c5b77737\") " pod="openshift-dns/node-resolver-2ngpp" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858007 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-kubelet\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858031 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a0260d3-f8e0-4a3d-b95a-547feee30046-host\") pod \"node-ca-5f7nk\" (UID: \"5a0260d3-f8e0-4a3d-b95a-547feee30046\") " pod="openshift-image-registry/node-ca-5f7nk" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858040 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e741aaa3-8cb4-407a-835a-a552c5b77737-tmp-dir\") pod \"node-resolver-2ngpp\" (UID: \"e741aaa3-8cb4-407a-835a-a552c5b77737\") " pod="openshift-dns/node-resolver-2ngpp" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858054 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-sysctl-d\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858081 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-sysctl-conf\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858110 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-var-lib-kubelet\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858140 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-run-k8s-cni-cncf-io\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858147 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be4da945-f6d1-4406-adea-f3ccedab88f6-ovnkube-config\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858209 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be4da945-f6d1-4406-adea-f3ccedab88f6-ovn-node-metrics-cert\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858240 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-multus-cni-dir\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.859063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858267 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kl5d\" (UniqueName: \"kubernetes.io/projected/be4da945-f6d1-4406-adea-f3ccedab88f6-kube-api-access-9kl5d\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.859736 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858290 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5a0260d3-f8e0-4a3d-b95a-547feee30046-serviceca\") pod \"node-ca-5f7nk\" (UID: \"5a0260d3-f8e0-4a3d-b95a-547feee30046\") " pod="openshift-image-registry/node-ca-5f7nk" Apr 24 14:24:08.859736 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858292 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-var-lib-cni-multus\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.859736 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858322 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-var-lib-cni-multus\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.859736 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858322 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/47ccc73c-0a43-4642-8362-1fa6a8574f23-multus-daemon-config\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.859736 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858352 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-etc-openvswitch\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.859736 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858379 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-run-openvswitch\") pod \"ovnkube-node-mjpgb\" (UID: 
\"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.859736 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858387 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-cnibin\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.859736 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858394 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-lib-modules\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.859736 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858404 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-hostroot\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.859736 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858438 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-hostroot\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.859736 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858444 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-multus-cni-dir\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.859736 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858444 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-systemd-units\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.859736 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858476 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-etc-openvswitch\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.859736 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858511 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-run-openvswitch\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.859736 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858518 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47ccc73c-0a43-4642-8362-1fa6a8574f23-cni-binary-copy\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.859736 
ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858539 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-modprobe-d\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.859736 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858539 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.859736 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858570 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858582 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c5b32595-8f39-477c-8d42-8cc919341875-konnectivity-ca\") pod \"konnectivity-agent-k8phw\" (UID: \"c5b32595-8f39-477c-8d42-8cc919341875\") " pod="kube-system/konnectivity-agent-k8phw" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858584 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a0260d3-f8e0-4a3d-b95a-547feee30046-host\") pod \"node-ca-5f7nk\" (UID: \"5a0260d3-f8e0-4a3d-b95a-547feee30046\") " pod="openshift-image-registry/node-ca-5f7nk" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858637 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-systemd\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858647 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-run-netns\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858651 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-systemd-units\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858692 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-var-lib-kubelet\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " 
pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858697 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-os-release\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858704 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-sysctl-conf\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858725 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be4da945-f6d1-4406-adea-f3ccedab88f6-ovnkube-config\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858737 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be4da945-f6d1-4406-adea-f3ccedab88f6-host-kubelet\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858765 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-run\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858793 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-host\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858820 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwxnc\" (UniqueName: \"kubernetes.io/projected/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-kube-api-access-jwxnc\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858853 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-sysctl-d\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858854 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-multus-conf-dir\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " 
pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858922 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-run\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858929 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-multus-socket-dir-parent\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.860390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858907 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-host\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.860877 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858982 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-multus-conf-dir\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.860877 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.858987 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-multus-socket-dir-parent\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.860877 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.859110 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-var-lib-kubelet\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.860877 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.859173 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/47ccc73c-0a43-4642-8362-1fa6a8574f23-host-var-lib-kubelet\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.860877 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.859218 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c5b32595-8f39-477c-8d42-8cc919341875-konnectivity-ca\") pod \"konnectivity-agent-k8phw\" (UID: \"c5b32595-8f39-477c-8d42-8cc919341875\") " pod="kube-system/konnectivity-agent-k8phw" Apr 24 14:24:08.860877 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.859324 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be4da945-f6d1-4406-adea-f3ccedab88f6-ovnkube-script-lib\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 
14:24:08.860877 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.859595 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-etc-tuned\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.860877 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.860056 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c5b32595-8f39-477c-8d42-8cc919341875-agent-certs\") pod \"konnectivity-agent-k8phw\" (UID: \"c5b32595-8f39-477c-8d42-8cc919341875\") " pod="kube-system/konnectivity-agent-k8phw" Apr 24 14:24:08.861112 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.860957 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-tmp\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.861112 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.861021 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be4da945-f6d1-4406-adea-f3ccedab88f6-ovn-node-metrics-cert\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.863811 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:08.863794 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:08.863860 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:08.863812 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:08.863860 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:08.863822 2569 projected.go:194] Error preparing data for projected volume kube-api-access-jxkqx for pod openshift-network-diagnostics/network-check-target-tzvkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:08.863969 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:08.863877 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx podName:66892658-4db6-4064-b52f-60baa00dcc6d nodeName:}" failed. No retries permitted until 2026-04-24 14:24:09.363864003 +0000 UTC m=+2.157920898 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jxkqx" (UniqueName: "kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx") pod "network-check-target-tzvkp" (UID: "66892658-4db6-4064-b52f-60baa00dcc6d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:08.865503 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.865486 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk96f\" (UniqueName: \"kubernetes.io/projected/5a0260d3-f8e0-4a3d-b95a-547feee30046-kube-api-access-nk96f\") pod \"node-ca-5f7nk\" (UID: \"5a0260d3-f8e0-4a3d-b95a-547feee30046\") " pod="openshift-image-registry/node-ca-5f7nk" Apr 24 14:24:08.867770 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.867749 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-455n6\" (UniqueName: \"kubernetes.io/projected/47ccc73c-0a43-4642-8362-1fa6a8574f23-kube-api-access-455n6\") pod \"multus-4cxgt\" (UID: \"47ccc73c-0a43-4642-8362-1fa6a8574f23\") " pod="openshift-multus/multus-4cxgt" Apr 24 14:24:08.867855 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.867773 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwxnc\" (UniqueName: \"kubernetes.io/projected/45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0-kube-api-access-jwxnc\") pod \"tuned-7fkmg\" (UID: \"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0\") " pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:08.868468 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.868452 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg5lh\" (UniqueName: \"kubernetes.io/projected/e741aaa3-8cb4-407a-835a-a552c5b77737-kube-api-access-lg5lh\") pod \"node-resolver-2ngpp\" (UID: \"e741aaa3-8cb4-407a-835a-a552c5b77737\") " pod="openshift-dns/node-resolver-2ngpp" Apr 24 14:24:08.868698 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.868683 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kl5d\" (UniqueName: \"kubernetes.io/projected/be4da945-f6d1-4406-adea-f3ccedab88f6-kube-api-access-9kl5d\") pod \"ovnkube-node-mjpgb\" (UID: \"be4da945-f6d1-4406-adea-f3ccedab88f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:08.972960 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.972865 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" Apr 24 14:24:08.979395 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:08.979345 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc88fdf6_9eaa_4f26_9344_140dd6d94ef5.slice/crio-79b9b122177f46000659688b83cd08fb070b104ce71a65c09670cbd460bdd6e1 WatchSource:0}: Error finding container 79b9b122177f46000659688b83cd08fb070b104ce71a65c09670cbd460bdd6e1: Status 404 returned error can't find the container with id 79b9b122177f46000659688b83cd08fb070b104ce71a65c09670cbd460bdd6e1 Apr 24 14:24:08.987704 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:08.987683 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2bsbc" Apr 24 14:24:08.994720 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:08.994688 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod663fea9a_c74c_4a2d_8d62_31e14a29d43a.slice/crio-deecd2aa15bec4675bc45df72359801ed813def75b66fada4c111882f07fc566 WatchSource:0}: Error finding container deecd2aa15bec4675bc45df72359801ed813def75b66fada4c111882f07fc566: Status 404 returned error can't find the container with id deecd2aa15bec4675bc45df72359801ed813def75b66fada4c111882f07fc566 Apr 24 14:24:09.001976 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.001959 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8grsv" Apr 24 14:24:09.007716 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:09.007687 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod853085d1_3eec_4e3a_a1e6_999af329c8d0.slice/crio-9a7c731620f54b480c6b867b46fac857ce8eccf6d9abee57857fe1dbf77a2fc1 WatchSource:0}: Error finding container 9a7c731620f54b480c6b867b46fac857ce8eccf6d9abee57857fe1dbf77a2fc1: Status 404 returned error can't find the container with id 9a7c731620f54b480c6b867b46fac857ce8eccf6d9abee57857fe1dbf77a2fc1 Apr 24 14:24:09.014967 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.014751 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:09.023031 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:09.023007 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe4da945_f6d1_4406_adea_f3ccedab88f6.slice/crio-440cc8acdd7dd4555c671977b01077fb278f77ba73abe54f770698cccb1d79c9 WatchSource:0}: Error finding container 440cc8acdd7dd4555c671977b01077fb278f77ba73abe54f770698cccb1d79c9: Status 404 returned error can't find the container with id 440cc8acdd7dd4555c671977b01077fb278f77ba73abe54f770698cccb1d79c9 Apr 24 14:24:09.031543 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.031521 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-k8phw" Apr 24 14:24:09.037208 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:09.037186 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5b32595_8f39_477c_8d42_8cc919341875.slice/crio-c4701e26d123a53a6d73d82956894ffff6a00d7ceb4efbe2e0abb0b43e5078c1 WatchSource:0}: Error finding container c4701e26d123a53a6d73d82956894ffff6a00d7ceb4efbe2e0abb0b43e5078c1: Status 404 returned error can't find the container with id c4701e26d123a53a6d73d82956894ffff6a00d7ceb4efbe2e0abb0b43e5078c1 Apr 24 14:24:09.047472 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.047454 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5f7nk" Apr 24 14:24:09.053564 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:09.053535 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a0260d3_f8e0_4a3d_b95a_547feee30046.slice/crio-701d0afd5dfec446cf2021c7c705997d46bad65d70dbf544b08efefb65e110bf WatchSource:0}: Error finding container 701d0afd5dfec446cf2021c7c705997d46bad65d70dbf544b08efefb65e110bf: Status 404 returned error can't find the container with id 701d0afd5dfec446cf2021c7c705997d46bad65d70dbf544b08efefb65e110bf Apr 24 14:24:09.066664 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.064720 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:24:09.078846 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.078820 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" Apr 24 14:24:09.083466 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.083445 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4cxgt" Apr 24 14:24:09.085129 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:09.085095 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45ac4dc2_9823_4bb9_9fb7_9837cf58d4a0.slice/crio-26d5bba1bc1ebad5d1f1c49036408702d3385e6241453b664f210fef61341fbe WatchSource:0}: Error finding container 26d5bba1bc1ebad5d1f1c49036408702d3385e6241453b664f210fef61341fbe: Status 404 returned error can't find the container with id 26d5bba1bc1ebad5d1f1c49036408702d3385e6241453b664f210fef61341fbe Apr 24 14:24:09.089043 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.089026 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2ngpp" Apr 24 14:24:09.089497 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:09.089477 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47ccc73c_0a43_4642_8362_1fa6a8574f23.slice/crio-f105a5aae7db51d7fa1fb5bbfb2845f7905fc3d6dc83f6541ca4298aad954340 WatchSource:0}: Error finding container f105a5aae7db51d7fa1fb5bbfb2845f7905fc3d6dc83f6541ca4298aad954340: Status 404 returned error can't find the container with id f105a5aae7db51d7fa1fb5bbfb2845f7905fc3d6dc83f6541ca4298aad954340 Apr 24 14:24:09.095187 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:09.095165 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode741aaa3_8cb4_407a_835a_a552c5b77737.slice/crio-d78756152413a770b7aa474627918ced63da72c14172e9ea1d33ba582226dba5 WatchSource:0}: Error finding container d78756152413a770b7aa474627918ced63da72c14172e9ea1d33ba582226dba5: Status 404 returned error can't find the container with id d78756152413a770b7aa474627918ced63da72c14172e9ea1d33ba582226dba5 Apr 24 14:24:09.261696 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.261606 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs\") pod \"network-metrics-daemon-d47r2\" (UID: \"496c729f-9eee-4311-8fe8-4502d4af37f8\") " pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:09.261864 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:09.261800 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:09.261950 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:09.261868 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs podName:496c729f-9eee-4311-8fe8-4502d4af37f8 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:10.261848122 +0000 UTC m=+3.055905008 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs") pod "network-metrics-daemon-d47r2" (UID: "496c729f-9eee-4311-8fe8-4502d4af37f8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:09.463958 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.463261 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxkqx\" (UniqueName: \"kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx\") pod \"network-check-target-tzvkp\" (UID: \"66892658-4db6-4064-b52f-60baa00dcc6d\") " pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:09.463958 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:09.463478 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:09.463958 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:09.463503 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:09.463958 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:09.463516 2569 projected.go:194] Error preparing data for projected volume kube-api-access-jxkqx for pod openshift-network-diagnostics/network-check-target-tzvkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:09.463958 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:09.463578 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx podName:66892658-4db6-4064-b52f-60baa00dcc6d nodeName:}" failed. No retries permitted until 2026-04-24 14:24:10.46355794 +0000 UTC m=+3.257614821 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-jxkqx" (UniqueName: "kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx") pod "network-check-target-tzvkp" (UID: "66892658-4db6-4064-b52f-60baa00dcc6d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:09.538468 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.538388 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:24:09.580645 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.580614 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:24:09.693717 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.693573 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:19:08 +0000 UTC" deadline="2027-11-30 05:46:06.0476819 +0000 UTC" Apr 24 14:24:09.693717 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.693608 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14031h21m56.354076861s" Apr 24 14:24:09.824506 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.823921 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:09.824506 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:09.824058 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:09.835769 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.835722 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4cxgt" event={"ID":"47ccc73c-0a43-4642-8362-1fa6a8574f23","Type":"ContainerStarted","Data":"f105a5aae7db51d7fa1fb5bbfb2845f7905fc3d6dc83f6541ca4298aad954340"} Apr 24 14:24:09.857831 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.857787 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" event={"ID":"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0","Type":"ContainerStarted","Data":"26d5bba1bc1ebad5d1f1c49036408702d3385e6241453b664f210fef61341fbe"} Apr 24 14:24:09.863522 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.863412 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5f7nk" event={"ID":"5a0260d3-f8e0-4a3d-b95a-547feee30046","Type":"ContainerStarted","Data":"701d0afd5dfec446cf2021c7c705997d46bad65d70dbf544b08efefb65e110bf"} Apr 24 14:24:09.872267 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.872224 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8grsv" event={"ID":"853085d1-3eec-4e3a-a1e6-999af329c8d0","Type":"ContainerStarted","Data":"9a7c731620f54b480c6b867b46fac857ce8eccf6d9abee57857fe1dbf77a2fc1"} Apr 24 14:24:09.876078 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.876044 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2bsbc" event={"ID":"663fea9a-c74c-4a2d-8d62-31e14a29d43a","Type":"ContainerStarted","Data":"deecd2aa15bec4675bc45df72359801ed813def75b66fada4c111882f07fc566"} Apr 24 14:24:09.887360 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.887323 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2ngpp" event={"ID":"e741aaa3-8cb4-407a-835a-a552c5b77737","Type":"ContainerStarted","Data":"d78756152413a770b7aa474627918ced63da72c14172e9ea1d33ba582226dba5"} Apr 24 14:24:09.891282 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.891244 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-k8phw" event={"ID":"c5b32595-8f39-477c-8d42-8cc919341875","Type":"ContainerStarted","Data":"c4701e26d123a53a6d73d82956894ffff6a00d7ceb4efbe2e0abb0b43e5078c1"} Apr 24 14:24:09.899315 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.899277 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" event={"ID":"be4da945-f6d1-4406-adea-f3ccedab88f6","Type":"ContainerStarted","Data":"440cc8acdd7dd4555c671977b01077fb278f77ba73abe54f770698cccb1d79c9"} Apr 24 14:24:09.910331 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:09.910276 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" 
event={"ID":"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5","Type":"ContainerStarted","Data":"79b9b122177f46000659688b83cd08fb070b104ce71a65c09670cbd460bdd6e1"} Apr 24 14:24:10.269353 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:10.269243 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs\") pod \"network-metrics-daemon-d47r2\" (UID: \"496c729f-9eee-4311-8fe8-4502d4af37f8\") " pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:10.269522 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:10.269396 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:10.269522 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:10.269465 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs podName:496c729f-9eee-4311-8fe8-4502d4af37f8 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:12.269443529 +0000 UTC m=+5.063500408 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs") pod "network-metrics-daemon-d47r2" (UID: "496c729f-9eee-4311-8fe8-4502d4af37f8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:10.470951 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:10.470871 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxkqx\" (UniqueName: \"kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx\") pod \"network-check-target-tzvkp\" (UID: \"66892658-4db6-4064-b52f-60baa00dcc6d\") " pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:10.471158 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:10.471061 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:10.471158 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:10.471099 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:10.471158 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:10.471113 2569 projected.go:194] Error preparing data for projected volume kube-api-access-jxkqx for pod openshift-network-diagnostics/network-check-target-tzvkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:10.471324 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:10.471173 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx podName:66892658-4db6-4064-b52f-60baa00dcc6d nodeName:}" failed. No retries permitted until 2026-04-24 14:24:12.471152801 +0000 UTC m=+5.265209689 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jxkqx" (UniqueName: "kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx") pod "network-check-target-tzvkp" (UID: "66892658-4db6-4064-b52f-60baa00dcc6d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:10.694480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:10.694344 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:19:08 +0000 UTC" deadline="2027-10-22 17:29:21.408806984 +0000 UTC" Apr 24 14:24:10.694480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:10.694394 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13107h5m10.714416908s" Apr 24 14:24:10.824158 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:10.823638 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:10.824158 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:10.823769 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:11.651392 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:11.651358 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:24:11.824577 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:11.824544 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:11.825035 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:11.824681 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:12.286561 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:12.286450 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs\") pod \"network-metrics-daemon-d47r2\" (UID: \"496c729f-9eee-4311-8fe8-4502d4af37f8\") " pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:12.286765 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:12.286625 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:12.286765 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:12.286687 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs podName:496c729f-9eee-4311-8fe8-4502d4af37f8 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:16.286668698 +0000 UTC m=+9.080725592 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs") pod "network-metrics-daemon-d47r2" (UID: "496c729f-9eee-4311-8fe8-4502d4af37f8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:12.487952 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:12.487911 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxkqx\" (UniqueName: \"kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx\") pod \"network-check-target-tzvkp\" (UID: \"66892658-4db6-4064-b52f-60baa00dcc6d\") " pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:12.488143 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:12.488105 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:12.488143 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:12.488126 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:12.488143 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:12.488139 2569 projected.go:194] Error preparing data for projected volume kube-api-access-jxkqx for pod openshift-network-diagnostics/network-check-target-tzvkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:12.488303 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:12.488202 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx podName:66892658-4db6-4064-b52f-60baa00dcc6d nodeName:}" failed. No retries permitted until 2026-04-24 14:24:16.488182585 +0000 UTC m=+9.282239483 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-jxkqx" (UniqueName: "kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx") pod "network-check-target-tzvkp" (UID: "66892658-4db6-4064-b52f-60baa00dcc6d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:12.824413 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:12.823623 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:12.824413 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:12.824009 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:13.823757 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:13.823725 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:13.824227 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:13.823866 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:14.823382 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:14.823342 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:14.823538 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:14.823483 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:15.823988 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:15.823494 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:15.823988 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:15.823648 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:16.324176 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:16.324149 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs\") pod \"network-metrics-daemon-d47r2\" (UID: \"496c729f-9eee-4311-8fe8-4502d4af37f8\") " pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:16.324353 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:16.324333 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:16.324436 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:16.324400 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs podName:496c729f-9eee-4311-8fe8-4502d4af37f8 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:24.324379593 +0000 UTC m=+17.118436492 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs") pod "network-metrics-daemon-d47r2" (UID: "496c729f-9eee-4311-8fe8-4502d4af37f8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:16.525805 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:16.525730 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxkqx\" (UniqueName: \"kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx\") pod \"network-check-target-tzvkp\" (UID: \"66892658-4db6-4064-b52f-60baa00dcc6d\") " pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:16.526017 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:16.525861 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:16.526017 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:16.525881 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:16.526017 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:16.525910 2569 projected.go:194] Error preparing data for projected volume kube-api-access-jxkqx for pod openshift-network-diagnostics/network-check-target-tzvkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:16.526017 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:16.525968 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx podName:66892658-4db6-4064-b52f-60baa00dcc6d nodeName:}" failed. No retries permitted until 2026-04-24 14:24:24.525949438 +0000 UTC m=+17.320006321 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-jxkqx" (UniqueName: "kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx") pod "network-check-target-tzvkp" (UID: "66892658-4db6-4064-b52f-60baa00dcc6d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:16.824035 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:16.823841 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:16.824035 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:16.823997 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:17.828603 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:17.828510 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:17.828603 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:17.828531 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:17.829148 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:17.828630 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:17.829148 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:17.828767 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:19.823925 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:19.823831 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:19.824369 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:19.823984 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:19.824452 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:19.824367 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:19.824516 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:19.824493 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:21.823645 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:21.823604 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:21.824137 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:21.823660 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:21.824137 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:21.823745 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:21.824137 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:21.823823 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:23.823582 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:23.823537 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:23.824053 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:23.823595 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:23.824053 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:23.823691 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:23.824053 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:23.823815 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:24.377412 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:24.377374 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs\") pod \"network-metrics-daemon-d47r2\" (UID: \"496c729f-9eee-4311-8fe8-4502d4af37f8\") " pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:24.377609 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:24.377519 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:24.377609 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:24.377590 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs podName:496c729f-9eee-4311-8fe8-4502d4af37f8 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:40.377573558 +0000 UTC m=+33.171630442 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs") pod "network-metrics-daemon-d47r2" (UID: "496c729f-9eee-4311-8fe8-4502d4af37f8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:24.578681 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:24.578638 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxkqx\" (UniqueName: \"kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx\") pod \"network-check-target-tzvkp\" (UID: \"66892658-4db6-4064-b52f-60baa00dcc6d\") " pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:24.578867 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:24.578778 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:24.578867 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:24.578798 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:24.578867 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:24.578809 2569 projected.go:194] Error preparing data for projected volume kube-api-access-jxkqx for pod openshift-network-diagnostics/network-check-target-tzvkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:24.579018 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:24.578873 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx podName:66892658-4db6-4064-b52f-60baa00dcc6d nodeName:}" failed. No retries permitted until 2026-04-24 14:24:40.578854563 +0000 UTC m=+33.372911448 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jxkqx" (UniqueName: "kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx") pod "network-check-target-tzvkp" (UID: "66892658-4db6-4064-b52f-60baa00dcc6d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:25.823300 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:25.823261 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:25.823300 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:25.823304 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:25.823805 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:25.823392 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:25.823805 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:25.823520 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:26.938910 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:26.938862 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-231.ec2.internal" event={"ID":"b8b57e2ee8dbdeeec77b109df4b31692","Type":"ContainerStarted","Data":"9078ce72658c7a7ebc6df756a695888a5f35e3296552b198683ea0075aa5ba0d"} Apr 24 14:24:26.940378 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:26.940354 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" event={"ID":"be4da945-f6d1-4406-adea-f3ccedab88f6","Type":"ContainerStarted","Data":"e658194a9aa75a9126ca003cd916c8e1c764bb344993ae6c715d395c197b36f0"} Apr 24 14:24:26.940503 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:26.940385 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" event={"ID":"be4da945-f6d1-4406-adea-f3ccedab88f6","Type":"ContainerStarted","Data":"f94ea0036002b2d737f2f40e1a91dbd1b9729dc4172a539fcccdbfff236f092e"} Apr 24 14:24:26.941605 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:26.941581 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" event={"ID":"45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0","Type":"ContainerStarted","Data":"fe2c4be30ad84e067170a5a48c13d6daa7b519a2de702cb866326e2c8f46a15b"} Apr 24 14:24:26.973759 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:26.973708 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-231.ec2.internal" podStartSLOduration=18.973694559 podStartE2EDuration="18.973694559s" podCreationTimestamp="2026-04-24 14:24:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:24:26.973550799 +0000 UTC m=+19.767607712" watchObservedRunningTime="2026-04-24 14:24:26.973694559 +0000 UTC m=+19.767751460" Apr 24 14:24:27.824269 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.823935 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:27.824439 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.823974 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:27.824439 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:27.824345 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:27.824439 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:27.824404 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:27.945152 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.945114 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4cxgt" event={"ID":"47ccc73c-0a43-4642-8362-1fa6a8574f23","Type":"ContainerStarted","Data":"9d8ac414110d52e2af5080e5563f9118ae01aa0bbaaab20ef892bfb04d956520"} Apr 24 14:24:27.946393 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.946369 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5f7nk" event={"ID":"5a0260d3-f8e0-4a3d-b95a-547feee30046","Type":"ContainerStarted","Data":"1c776f113e241752d57d15d952a8b19ae0971952fc28b72de113f40db5fc5783"} Apr 24 14:24:27.947644 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.947621 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8grsv" event={"ID":"853085d1-3eec-4e3a-a1e6-999af329c8d0","Type":"ContainerStarted","Data":"14a35d49d7fbf09cb3d4e9b69495e01d29e5ae5158ab3edfa8332cf0ede11eef"} Apr 24 14:24:27.949023 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.949002 2569 generic.go:358] "Generic (PLEG): container finished" podID="663fea9a-c74c-4a2d-8d62-31e14a29d43a" containerID="f5e22e3b5b1fe556ad9af455d60d5ff533cf04c447b6aaa61105a816ca08461b" exitCode=0 Apr 24 14:24:27.949096 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.949059 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2bsbc" event={"ID":"663fea9a-c74c-4a2d-8d62-31e14a29d43a","Type":"ContainerDied","Data":"f5e22e3b5b1fe556ad9af455d60d5ff533cf04c447b6aaa61105a816ca08461b"} Apr 24 14:24:27.952186 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.952165 2569 generic.go:358] "Generic (PLEG): container finished" podID="43b3b938de58a8a753e7808e14b92798" containerID="b46038213ecb5e8c498ede7865820adeb9c3b768c2eda1f0c4b732fcd2125694" exitCode=0 Apr 24 14:24:27.952295 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.952238 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal" event={"ID":"43b3b938de58a8a753e7808e14b92798","Type":"ContainerDied","Data":"b46038213ecb5e8c498ede7865820adeb9c3b768c2eda1f0c4b732fcd2125694"} Apr 24 14:24:27.953744 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.953722 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2ngpp" event={"ID":"e741aaa3-8cb4-407a-835a-a552c5b77737","Type":"ContainerStarted","Data":"0aef0377ca5aea528a36a071e892184b8461c8cd1414dc9bfe5eb3c140216adf"} Apr 24 14:24:27.954942 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.954917 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-k8phw" event={"ID":"c5b32595-8f39-477c-8d42-8cc919341875","Type":"ContainerStarted","Data":"004cc184081af06fefca06aa6908152bb11bd20038adf4f95f336aba1275e838"} Apr 24 14:24:27.957688 ip-10-0-129-231 
kubenswrapper[2569]: I0424 14:24:27.957671 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 14:24:27.957979 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.957958 2569 generic.go:358] "Generic (PLEG): container finished" podID="be4da945-f6d1-4406-adea-f3ccedab88f6" containerID="e658194a9aa75a9126ca003cd916c8e1c764bb344993ae6c715d395c197b36f0" exitCode=1 Apr 24 14:24:27.958059 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.958030 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" event={"ID":"be4da945-f6d1-4406-adea-f3ccedab88f6","Type":"ContainerDied","Data":"e658194a9aa75a9126ca003cd916c8e1c764bb344993ae6c715d395c197b36f0"} Apr 24 14:24:27.958110 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.958062 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" event={"ID":"be4da945-f6d1-4406-adea-f3ccedab88f6","Type":"ContainerStarted","Data":"9bfa61573d011f5524d443f2e0d007803ed19c502ca6364cb0976ff045a2b1b3"} Apr 24 14:24:27.958110 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.958076 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" event={"ID":"be4da945-f6d1-4406-adea-f3ccedab88f6","Type":"ContainerStarted","Data":"84cc0ea0f077cd5d4e682bf6e33a50cb230138421757c36ff05bde9c9dd6b2be"} Apr 24 14:24:27.958110 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.958088 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" event={"ID":"be4da945-f6d1-4406-adea-f3ccedab88f6","Type":"ContainerStarted","Data":"35fcae551fac2ef837c83b7a5d8bbb9f5c01820ef2b64877a405c2d5b250f043"} Apr 24 14:24:27.958110 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.958102 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" event={"ID":"be4da945-f6d1-4406-adea-f3ccedab88f6","Type":"ContainerStarted","Data":"160de0c6de69228f6d2f871d63d06fe596bd2456bb8504abd7d6aa744b397d3d"} Apr 24 14:24:27.959170 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.959153 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" event={"ID":"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5","Type":"ContainerStarted","Data":"fec792d6947dacd2f9f964a01af3fce97aabc5a324a9f0293357f41d9e09607f"} Apr 24 14:24:27.960547 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.960514 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7fkmg" podStartSLOduration=3.451638968 podStartE2EDuration="20.960500882s" podCreationTimestamp="2026-04-24 14:24:07 +0000 UTC" firstStartedPulling="2026-04-24 14:24:09.08670091 +0000 UTC m=+1.880757790" lastFinishedPulling="2026-04-24 14:24:26.595562816 +0000 UTC m=+19.389619704" observedRunningTime="2026-04-24 14:24:26.993535111 +0000 UTC m=+19.787592013" watchObservedRunningTime="2026-04-24 14:24:27.960500882 +0000 UTC m=+20.754557785" Apr 24 14:24:27.960706 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.960678 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4cxgt" podStartSLOduration=2.965922936 podStartE2EDuration="20.960668255s" podCreationTimestamp="2026-04-24 14:24:07 +0000 UTC" firstStartedPulling="2026-04-24 14:24:09.091753632 +0000 UTC 
m=+1.885810512" lastFinishedPulling="2026-04-24 14:24:27.08649895 +0000 UTC m=+19.880555831" observedRunningTime="2026-04-24 14:24:27.960313117 +0000 UTC m=+20.754370021" watchObservedRunningTime="2026-04-24 14:24:27.960668255 +0000 UTC m=+20.754725158" Apr 24 14:24:27.975344 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:27.975287 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2ngpp" podStartSLOduration=3.4767821420000002 podStartE2EDuration="20.97526944s" podCreationTimestamp="2026-04-24 14:24:07 +0000 UTC" firstStartedPulling="2026-04-24 14:24:09.096654561 +0000 UTC m=+1.890711440" lastFinishedPulling="2026-04-24 14:24:26.595141856 +0000 UTC m=+19.389198738" observedRunningTime="2026-04-24 14:24:27.974945775 +0000 UTC m=+20.769002677" watchObservedRunningTime="2026-04-24 14:24:27.97526944 +0000 UTC m=+20.769326342" Apr 24 14:24:28.022000 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:28.021941 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-8grsv" podStartSLOduration=3.437709147 podStartE2EDuration="21.021922347s" podCreationTimestamp="2026-04-24 14:24:07 +0000 UTC" firstStartedPulling="2026-04-24 14:24:09.009288012 +0000 UTC m=+1.803344892" lastFinishedPulling="2026-04-24 14:24:26.593501204 +0000 UTC m=+19.387558092" observedRunningTime="2026-04-24 14:24:28.021721127 +0000 UTC m=+20.815778028" watchObservedRunningTime="2026-04-24 14:24:28.021922347 +0000 UTC m=+20.815979249" Apr 24 14:24:28.034373 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:28.034328 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-k8phw" podStartSLOduration=3.48487999 podStartE2EDuration="21.034313691s" podCreationTimestamp="2026-04-24 14:24:07 +0000 UTC" firstStartedPulling="2026-04-24 14:24:09.038514159 +0000 UTC m=+1.832571038" lastFinishedPulling="2026-04-24 14:24:26.587947856 +0000 UTC m=+19.382004739" observedRunningTime="2026-04-24 14:24:28.033848388 +0000 UTC m=+20.827905289" watchObservedRunningTime="2026-04-24 14:24:28.034313691 +0000 UTC m=+20.828370636" Apr 24 14:24:28.046910 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:28.046856 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5f7nk" podStartSLOduration=3.514175495 podStartE2EDuration="21.046840687s" podCreationTimestamp="2026-04-24 14:24:07 +0000 UTC" firstStartedPulling="2026-04-24 14:24:09.055361783 +0000 UTC m=+1.849418679" lastFinishedPulling="2026-04-24 14:24:26.588026979 +0000 UTC m=+19.382083871" observedRunningTime="2026-04-24 14:24:28.046334491 +0000 UTC m=+20.840391391" watchObservedRunningTime="2026-04-24 14:24:28.046840687 +0000 UTC m=+20.840897588" Apr 24 14:24:28.754347 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:28.754328 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 14:24:28.962833 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:28.962793 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal" event={"ID":"43b3b938de58a8a753e7808e14b92798","Type":"ContainerStarted","Data":"e29e59f5bee351853bd6ff458b036d677168c434a49c16c863aa291dcc4331e1"} Apr 24 14:24:28.964624 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:28.964589 2569 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" event={"ID":"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5","Type":"ContainerStarted","Data":"57d43591baef44be73c84952bff3644fb17f227bda4a5a1510bcba25804f5d42"} Apr 24 14:24:28.977778 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:28.977731 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-231.ec2.internal" podStartSLOduration=20.977716958 podStartE2EDuration="20.977716958s" podCreationTimestamp="2026-04-24 14:24:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:24:28.977466546 +0000 UTC m=+21.771523692" watchObservedRunningTime="2026-04-24 14:24:28.977716958 +0000 UTC m=+21.771773860" Apr 24 14:24:29.726101 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:29.725981 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T14:24:28.754343596Z","UUID":"319cf609-eeb5-41f5-9a80-ce8d4a8c70a1","Handler":null,"Name":"","Endpoint":""} Apr 24 14:24:29.728334 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:29.728287 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 14:24:29.728334 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:29.728338 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 14:24:29.823409 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:29.823313 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:29.823409 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:29.823359 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:29.823638 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:29.823449 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:29.823699 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:29.823637 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:29.970249 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:29.970220 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 14:24:29.970663 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:29.970596 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" event={"ID":"be4da945-f6d1-4406-adea-f3ccedab88f6","Type":"ContainerStarted","Data":"3eaf860ef5e72a46cdf348f38e3beca5df8e4f1d898a0410b5caa5969177866f"} Apr 24 14:24:29.973134 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:29.973105 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" event={"ID":"dc88fdf6-9eaa-4f26-9344-140dd6d94ef5","Type":"ContainerStarted","Data":"f0ed7d9c45db7ab13e0789192ee7d00cbe068bcaa93d683cc8e9e1c24f2daff3"} Apr 24 14:24:29.991755 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:29.991713 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ptdz" podStartSLOduration=2.440201265 podStartE2EDuration="22.991694693s" podCreationTimestamp="2026-04-24 14:24:07 +0000 UTC" firstStartedPulling="2026-04-24 14:24:08.98111418 +0000 UTC m=+1.775171061" lastFinishedPulling="2026-04-24 14:24:29.532607595 +0000 UTC m=+22.326664489" observedRunningTime="2026-04-24 14:24:29.991275254 +0000 UTC m=+22.785332155" watchObservedRunningTime="2026-04-24 14:24:29.991694693 +0000 UTC m=+22.785751613" Apr 24 14:24:30.060584 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:30.060552 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-k8phw" Apr 24 14:24:30.061229 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:30.061204 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-k8phw" Apr 24 14:24:30.974820 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:30.974560 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-k8phw" Apr 24 14:24:30.975248 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:30.974982 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-k8phw" Apr 24 14:24:31.823333 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:31.823296 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:31.823544 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:31.823420 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:31.823544 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:31.823456 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:31.823655 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:31.823548 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:32.979258 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:32.978992 2569 generic.go:358] "Generic (PLEG): container finished" podID="663fea9a-c74c-4a2d-8d62-31e14a29d43a" containerID="2bf29afc75c2862e643f6cfc32d69d35b62edd5ca302838244ea0ebd7232f5e8" exitCode=0 Apr 24 14:24:32.979258 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:32.979077 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2bsbc" event={"ID":"663fea9a-c74c-4a2d-8d62-31e14a29d43a","Type":"ContainerDied","Data":"2bf29afc75c2862e643f6cfc32d69d35b62edd5ca302838244ea0ebd7232f5e8"} Apr 24 14:24:32.982508 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:32.982486 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 14:24:32.982858 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:32.982829 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" event={"ID":"be4da945-f6d1-4406-adea-f3ccedab88f6","Type":"ContainerStarted","Data":"63843f6baa71bed6649d7be032ef3438f74584af6fb6213455bbcbf2f53b040d"} Apr 24 14:24:32.983301 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:32.983281 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:32.983384 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:32.983308 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:32.983384 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:32.983321 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:32.983481 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:32.983384 2569 scope.go:117] "RemoveContainer" containerID="e658194a9aa75a9126ca003cd916c8e1c764bb344993ae6c715d395c197b36f0" Apr 24 14:24:32.997851 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:32.997827 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:32.997988 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:32.997971 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:24:33.824136 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:33.824112 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:33.824269 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:33.824116 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:33.824269 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:33.824216 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:33.824350 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:33.824301 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:33.985974 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:33.985941 2569 generic.go:358] "Generic (PLEG): container finished" podID="663fea9a-c74c-4a2d-8d62-31e14a29d43a" containerID="33ed362f585e924d51d43aef1452dbcab1598caa8d740826f4c1d79728922db8" exitCode=0 Apr 24 14:24:33.986411 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:33.986027 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2bsbc" event={"ID":"663fea9a-c74c-4a2d-8d62-31e14a29d43a","Type":"ContainerDied","Data":"33ed362f585e924d51d43aef1452dbcab1598caa8d740826f4c1d79728922db8"} Apr 24 14:24:33.989216 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:33.989198 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 14:24:33.989506 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:33.989487 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" event={"ID":"be4da945-f6d1-4406-adea-f3ccedab88f6","Type":"ContainerStarted","Data":"c2da306cc754db91ac8fb478e87aa05fa2963071eb28c3ee60f99f5e7546fd74"} Apr 24 14:24:34.037000 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:34.036951 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" podStartSLOduration=9.412148895 podStartE2EDuration="27.036937464s" podCreationTimestamp="2026-04-24 14:24:07 +0000 UTC" firstStartedPulling="2026-04-24 14:24:09.024478542 +0000 UTC m=+1.818535421" lastFinishedPulling="2026-04-24 14:24:26.649267097 +0000 UTC m=+19.443323990" observedRunningTime="2026-04-24 14:24:34.036504658 +0000 UTC m=+26.830561560" watchObservedRunningTime="2026-04-24 14:24:34.036937464 +0000 UTC m=+26.830994365" Apr 24 14:24:34.993404 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:34.993368 2569 generic.go:358] "Generic (PLEG): container finished" podID="663fea9a-c74c-4a2d-8d62-31e14a29d43a" containerID="44d4e85d8f58234543fa81f87479967c9cca3a654fe6735dcca5944c665ed505" exitCode=0 Apr 24 14:24:34.993769 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:34.993456 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2bsbc" event={"ID":"663fea9a-c74c-4a2d-8d62-31e14a29d43a","Type":"ContainerDied","Data":"44d4e85d8f58234543fa81f87479967c9cca3a654fe6735dcca5944c665ed505"} Apr 24 14:24:35.823289 
ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:35.823253 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:35.823471 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:35.823253 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:35.823471 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:35.823406 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:35.823471 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:35.823453 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:37.824790 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:37.824749 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:37.825269 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:37.824870 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:37.825269 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:37.824928 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:37.825269 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:37.825054 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:39.824088 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:39.824055 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:39.824565 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:39.824055 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:39.824565 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:39.824190 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
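
[Note] Besides the missing CNI config, the earlier volume errors ('object "openshift-multus"/"metrics-daemon-secret" not registered') have a second ingredient: the kubelet serves secret/configmap volumes from per-object watch caches, and a mount cannot proceed until the object's reflector has synced. "Caches populated" lines for exactly these objects appear at 14:24:46 below, after which the next scheduled retry (14:25:12, per the backoff) can find them. The sketch below is a stdlib-only stand-in for that wait-until-synced pattern; the kubelet's real machinery is client-go's reflector/cache, and waitFor is a hypothetical helper.

    package main

    import (
        "context"
        "errors"
        "fmt"
        "time"
    )

    // waitFor polls cond until it returns true or ctx expires — a stand-in
    // for "retry until the object's watch cache has synced".
    func waitFor(ctx context.Context, interval time.Duration, cond func() bool) error {
        tick := time.NewTicker(interval)
        defer tick.Stop()
        for {
            if cond() {
                return nil
            }
            select {
            case <-ctx.Done():
                return errors.New("timed out waiting for cache sync")
            case <-tick.C:
            }
        }
    }

    func main() {
        ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
        defer cancel()
        start := time.Now()
        cacheSynced := func() bool { return time.Since(start) > time.Second } // stand-in condition
        fmt.Println(waitFor(ctx, 100*time.Millisecond, cacheSynced)) // <nil> once "synced"
    }
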
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:39.824565 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:39.824230 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:40.401575 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:40.401537 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs\") pod \"network-metrics-daemon-d47r2\" (UID: \"496c729f-9eee-4311-8fe8-4502d4af37f8\") " pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:40.401774 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:40.401715 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:40.401835 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:40.401784 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs podName:496c729f-9eee-4311-8fe8-4502d4af37f8 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:12.401769257 +0000 UTC m=+65.195826139 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs") pod "network-metrics-daemon-d47r2" (UID: "496c729f-9eee-4311-8fe8-4502d4af37f8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:40.602697 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:40.602665 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxkqx\" (UniqueName: \"kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx\") pod \"network-check-target-tzvkp\" (UID: \"66892658-4db6-4064-b52f-60baa00dcc6d\") " pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:40.602863 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:40.602789 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:40.602863 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:40.602803 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:40.602863 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:40.602812 2569 projected.go:194] Error preparing data for projected volume kube-api-access-jxkqx for pod openshift-network-diagnostics/network-check-target-tzvkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:40.602863 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:40.602864 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx 
podName:66892658-4db6-4064-b52f-60baa00dcc6d nodeName:}" failed. No retries permitted until 2026-04-24 14:25:12.602849736 +0000 UTC m=+65.396906614 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-jxkqx" (UniqueName: "kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx") pod "network-check-target-tzvkp" (UID: "66892658-4db6-4064-b52f-60baa00dcc6d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:40.939693 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:40.939658 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d47r2"] Apr 24 14:24:40.940422 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:40.939864 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:40.940422 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:40.940018 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:40.942438 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:40.942411 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tzvkp"] Apr 24 14:24:40.942553 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:40.942532 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:40.942649 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:40.942628 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:41.008549 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:41.008469 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2bsbc" event={"ID":"663fea9a-c74c-4a2d-8d62-31e14a29d43a","Type":"ContainerStarted","Data":"03d3c86db503e1f1ab77d7912799e12f288302df2d72b0196a0003f6873c4d8d"} Apr 24 14:24:42.012252 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:42.012218 2569 generic.go:358] "Generic (PLEG): container finished" podID="663fea9a-c74c-4a2d-8d62-31e14a29d43a" containerID="03d3c86db503e1f1ab77d7912799e12f288302df2d72b0196a0003f6873c4d8d" exitCode=0 Apr 24 14:24:42.012252 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:42.012256 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2bsbc" event={"ID":"663fea9a-c74c-4a2d-8d62-31e14a29d43a","Type":"ContainerDied","Data":"03d3c86db503e1f1ab77d7912799e12f288302df2d72b0196a0003f6873c4d8d"} Apr 24 14:24:42.823374 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:42.823343 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:42.823538 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:42.823343 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:42.823538 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:42.823448 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:42.823613 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:42.823530 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:43.016504 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:43.016472 2569 generic.go:358] "Generic (PLEG): container finished" podID="663fea9a-c74c-4a2d-8d62-31e14a29d43a" containerID="839c61c43493366d16cfa4d7f88005b614bd8e6e0c0527a57a31ebd9038abccb" exitCode=0 Apr 24 14:24:43.016855 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:43.016517 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2bsbc" event={"ID":"663fea9a-c74c-4a2d-8d62-31e14a29d43a","Type":"ContainerDied","Data":"839c61c43493366d16cfa4d7f88005b614bd8e6e0c0527a57a31ebd9038abccb"} Apr 24 14:24:44.021036 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:44.020819 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2bsbc" event={"ID":"663fea9a-c74c-4a2d-8d62-31e14a29d43a","Type":"ContainerStarted","Data":"af367eeecb192f973cc72027abc55a24e2a462c2e301eb753dda931448a1cbe1"} Apr 24 14:24:44.042595 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:44.042542 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2bsbc" podStartSLOduration=5.230681761 podStartE2EDuration="37.042529062s" podCreationTimestamp="2026-04-24 14:24:07 +0000 UTC" firstStartedPulling="2026-04-24 14:24:08.996354168 +0000 UTC m=+1.790411050" lastFinishedPulling="2026-04-24 14:24:40.808201468 +0000 UTC m=+33.602258351" observedRunningTime="2026-04-24 14:24:44.041630109 +0000 UTC m=+36.835687010" watchObservedRunningTime="2026-04-24 14:24:44.042529062 +0000 UTC m=+36.836585963" Apr 24 14:24:44.100063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:44.100039 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2ngpp_e741aaa3-8cb4-407a-835a-a552c5b77737/dns-node-resolver/0.log" Apr 24 14:24:44.823329 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:44.823292 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:44.823499 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:44.823292 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:44.823499 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:44.823426 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d47r2" podUID="496c729f-9eee-4311-8fe8-4502d4af37f8" Apr 24 14:24:44.823499 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:44.823464 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tzvkp" podUID="66892658-4db6-4064-b52f-60baa00dcc6d" Apr 24 14:24:45.084379 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.084305 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5f7nk_5a0260d3-f8e0-4a3d-b95a-547feee30046/node-ca/0.log" Apr 24 14:24:45.494656 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.494634 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-231.ec2.internal" event="NodeReady" Apr 24 14:24:45.494771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.494761 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 14:24:45.539554 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.539511 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-q6p5b"] Apr 24 14:24:45.569545 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.569510 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kkfvn"] Apr 24 14:24:45.569728 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.569687 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-q6p5b" Apr 24 14:24:45.572020 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.571997 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vrfmr\"" Apr 24 14:24:45.572150 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.571997 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 14:24:45.572482 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.572463 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 14:24:45.596396 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.596366 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-q6p5b"] Apr 24 14:24:45.596396 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.596396 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kkfvn"] Apr 24 14:24:45.596571 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.596514 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kkfvn" Apr 24 14:24:45.598511 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.598489 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9tl5g\"" Apr 24 14:24:45.598663 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.598647 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 14:24:45.598718 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.598659 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 14:24:45.598833 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.598819 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 14:24:45.743660 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.743566 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc7xs\" (UniqueName: \"kubernetes.io/projected/4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a-kube-api-access-pc7xs\") pod \"dns-default-q6p5b\" (UID: \"4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a\") " pod="openshift-dns/dns-default-q6p5b" Apr 24 14:24:45.743660 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.743618 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a-metrics-tls\") pod \"dns-default-q6p5b\" (UID: \"4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a\") " pod="openshift-dns/dns-default-q6p5b" Apr 24 14:24:45.743660 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.743637 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a-tmp-dir\") pod \"dns-default-q6p5b\" (UID: \"4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a\") " pod="openshift-dns/dns-default-q6p5b" Apr 24 14:24:45.743660 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.743654 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a-config-volume\") pod \"dns-default-q6p5b\" (UID: \"4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a\") " pod="openshift-dns/dns-default-q6p5b" Apr 24 14:24:45.743950 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.743675 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74ac045c-eb3d-44e5-b6cb-0da4951109bb-cert\") pod \"ingress-canary-kkfvn\" (UID: \"74ac045c-eb3d-44e5-b6cb-0da4951109bb\") " pod="openshift-ingress-canary/ingress-canary-kkfvn" Apr 24 14:24:45.743950 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.743691 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnpgp\" (UniqueName: \"kubernetes.io/projected/74ac045c-eb3d-44e5-b6cb-0da4951109bb-kube-api-access-xnpgp\") pod \"ingress-canary-kkfvn\" (UID: \"74ac045c-eb3d-44e5-b6cb-0da4951109bb\") " pod="openshift-ingress-canary/ingress-canary-kkfvn" Apr 24 14:24:45.844063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.844019 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74ac045c-eb3d-44e5-b6cb-0da4951109bb-cert\") pod \"ingress-canary-kkfvn\" (UID: \"74ac045c-eb3d-44e5-b6cb-0da4951109bb\") " pod="openshift-ingress-canary/ingress-canary-kkfvn" Apr 24 14:24:45.844063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.844064 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnpgp\" (UniqueName: \"kubernetes.io/projected/74ac045c-eb3d-44e5-b6cb-0da4951109bb-kube-api-access-xnpgp\") pod \"ingress-canary-kkfvn\" (UID: \"74ac045c-eb3d-44e5-b6cb-0da4951109bb\") " pod="openshift-ingress-canary/ingress-canary-kkfvn" Apr 24 14:24:45.844317 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.844125 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pc7xs\" (UniqueName: \"kubernetes.io/projected/4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a-kube-api-access-pc7xs\") pod \"dns-default-q6p5b\" (UID: \"4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a\") " pod="openshift-dns/dns-default-q6p5b" Apr 24 14:24:45.844317 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.844168 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a-metrics-tls\") pod \"dns-default-q6p5b\" (UID: \"4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a\") " pod="openshift-dns/dns-default-q6p5b" Apr 24 14:24:45.844424 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.844347 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a-tmp-dir\") pod \"dns-default-q6p5b\" (UID: \"4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a\") " pod="openshift-dns/dns-default-q6p5b" Apr 24 14:24:45.844424 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.844383 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a-config-volume\") pod \"dns-default-q6p5b\" (UID: \"4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a\") " pod="openshift-dns/dns-default-q6p5b" Apr 24 14:24:45.844757 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.844732 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a-tmp-dir\") pod \"dns-default-q6p5b\" (UID: \"4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a\") " pod="openshift-dns/dns-default-q6p5b" Apr 24 14:24:45.848086 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.848061 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a-metrics-tls\") pod \"dns-default-q6p5b\" (UID: \"4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a\") " pod="openshift-dns/dns-default-q6p5b" Apr 24 14:24:45.848182 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.848119 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74ac045c-eb3d-44e5-b6cb-0da4951109bb-cert\") pod \"ingress-canary-kkfvn\" (UID: \"74ac045c-eb3d-44e5-b6cb-0da4951109bb\") " pod="openshift-ingress-canary/ingress-canary-kkfvn" Apr 24 14:24:45.848182 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.848133 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a-config-volume\") pod \"dns-default-q6p5b\" (UID: \"4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a\") " pod="openshift-dns/dns-default-q6p5b" Apr 24 14:24:45.854514 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.854487 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc7xs\" (UniqueName: \"kubernetes.io/projected/4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a-kube-api-access-pc7xs\") pod \"dns-default-q6p5b\" (UID: \"4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a\") " pod="openshift-dns/dns-default-q6p5b" Apr 24 14:24:45.863936 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.863906 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnpgp\" (UniqueName: \"kubernetes.io/projected/74ac045c-eb3d-44e5-b6cb-0da4951109bb-kube-api-access-xnpgp\") pod \"ingress-canary-kkfvn\" (UID: \"74ac045c-eb3d-44e5-b6cb-0da4951109bb\") " pod="openshift-ingress-canary/ingress-canary-kkfvn" Apr 24 14:24:45.879871 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.879844 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-q6p5b" Apr 24 14:24:45.907602 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:45.907570 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kkfvn" Apr 24 14:24:46.028471 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:46.028438 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-q6p5b"] Apr 24 14:24:46.031639 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:46.031609 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ea1b220_dbce_4a4c_bfe3_3c3c34321b6a.slice/crio-fcbedb209fdd6966008c57f947a956497cb4c8af4757a67301870f6e47840697 WatchSource:0}: Error finding container fcbedb209fdd6966008c57f947a956497cb4c8af4757a67301870f6e47840697: Status 404 returned error can't find the container with id fcbedb209fdd6966008c57f947a956497cb4c8af4757a67301870f6e47840697 Apr 24 14:24:46.045986 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:46.045957 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kkfvn"] Apr 24 14:24:46.049375 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:46.049352 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74ac045c_eb3d_44e5_b6cb_0da4951109bb.slice/crio-ac6441b9ebbc461d9780bd713316ca80a0cf442e24c3043ba415cb7023ccc9d5 WatchSource:0}: Error finding container ac6441b9ebbc461d9780bd713316ca80a0cf442e24c3043ba415cb7023ccc9d5: Status 404 returned error can't find the container with id ac6441b9ebbc461d9780bd713316ca80a0cf442e24c3043ba415cb7023ccc9d5 Apr 24 14:24:46.823943 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:46.823477 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:24:46.823943 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:46.823701 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:24:46.826847 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:46.826637 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 14:24:46.826847 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:46.826660 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p2dnd\"" Apr 24 14:24:46.826847 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:46.826783 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 14:24:46.827117 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:46.826922 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 14:24:46.827117 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:46.827024 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qmh6n\"" Apr 24 14:24:47.028484 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.028436 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kkfvn" event={"ID":"74ac045c-eb3d-44e5-b6cb-0da4951109bb","Type":"ContainerStarted","Data":"ac6441b9ebbc461d9780bd713316ca80a0cf442e24c3043ba415cb7023ccc9d5"} Apr 24 14:24:47.029558 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.029528 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-q6p5b" event={"ID":"4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a","Type":"ContainerStarted","Data":"fcbedb209fdd6966008c57f947a956497cb4c8af4757a67301870f6e47840697"} Apr 24 14:24:47.195024 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.194987 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-65cl5"] Apr 24 14:24:47.220450 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.220422 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-65cl5"] Apr 24 14:24:47.220627 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.220573 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-65cl5" Apr 24 14:24:47.225544 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.225517 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 14:24:47.225705 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.225551 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 14:24:47.225782 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.225707 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 14:24:47.228653 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.228553 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 14:24:47.228791 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.228770 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-gwkgg\"" Apr 24 14:24:47.355258 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.355222 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4bbd5597-4646-4720-a0e3-cd74795416fa-crio-socket\") pod \"insights-runtime-extractor-65cl5\" (UID: \"4bbd5597-4646-4720-a0e3-cd74795416fa\") " pod="openshift-insights/insights-runtime-extractor-65cl5" Apr 24 14:24:47.355258 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.355278 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4bbd5597-4646-4720-a0e3-cd74795416fa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-65cl5\" (UID: \"4bbd5597-4646-4720-a0e3-cd74795416fa\") " pod="openshift-insights/insights-runtime-extractor-65cl5" Apr 24 14:24:47.355510 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.355322 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xczpz\" (UniqueName: \"kubernetes.io/projected/4bbd5597-4646-4720-a0e3-cd74795416fa-kube-api-access-xczpz\") pod \"insights-runtime-extractor-65cl5\" (UID: \"4bbd5597-4646-4720-a0e3-cd74795416fa\") " pod="openshift-insights/insights-runtime-extractor-65cl5" Apr 24 14:24:47.355510 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.355343 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4bbd5597-4646-4720-a0e3-cd74795416fa-data-volume\") pod \"insights-runtime-extractor-65cl5\" (UID: \"4bbd5597-4646-4720-a0e3-cd74795416fa\") " pod="openshift-insights/insights-runtime-extractor-65cl5" Apr 24 14:24:47.355510 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.355366 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4bbd5597-4646-4720-a0e3-cd74795416fa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-65cl5\" (UID: \"4bbd5597-4646-4720-a0e3-cd74795416fa\") " pod="openshift-insights/insights-runtime-extractor-65cl5" Apr 24 14:24:47.455840 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.455744 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4bbd5597-4646-4720-a0e3-cd74795416fa-crio-socket\") pod \"insights-runtime-extractor-65cl5\" (UID: \"4bbd5597-4646-4720-a0e3-cd74795416fa\") " pod="openshift-insights/insights-runtime-extractor-65cl5" Apr 24 14:24:47.455840 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.455803 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4bbd5597-4646-4720-a0e3-cd74795416fa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-65cl5\" (UID: \"4bbd5597-4646-4720-a0e3-cd74795416fa\") " pod="openshift-insights/insights-runtime-extractor-65cl5" Apr 24 14:24:47.456088 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.455844 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xczpz\" (UniqueName: \"kubernetes.io/projected/4bbd5597-4646-4720-a0e3-cd74795416fa-kube-api-access-xczpz\") pod \"insights-runtime-extractor-65cl5\" (UID: \"4bbd5597-4646-4720-a0e3-cd74795416fa\") " pod="openshift-insights/insights-runtime-extractor-65cl5" Apr 24 14:24:47.456088 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.455873 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4bbd5597-4646-4720-a0e3-cd74795416fa-data-volume\") pod \"insights-runtime-extractor-65cl5\" (UID: \"4bbd5597-4646-4720-a0e3-cd74795416fa\") " pod="openshift-insights/insights-runtime-extractor-65cl5" Apr 24 14:24:47.456088 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.455937 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4bbd5597-4646-4720-a0e3-cd74795416fa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-65cl5\" (UID: \"4bbd5597-4646-4720-a0e3-cd74795416fa\") " pod="openshift-insights/insights-runtime-extractor-65cl5" Apr 24 14:24:47.456088 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.456016 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4bbd5597-4646-4720-a0e3-cd74795416fa-crio-socket\") pod \"insights-runtime-extractor-65cl5\" (UID: \"4bbd5597-4646-4720-a0e3-cd74795416fa\") " pod="openshift-insights/insights-runtime-extractor-65cl5" Apr 24 14:24:47.456271 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.456200 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4bbd5597-4646-4720-a0e3-cd74795416fa-data-volume\") pod \"insights-runtime-extractor-65cl5\" (UID: \"4bbd5597-4646-4720-a0e3-cd74795416fa\") " pod="openshift-insights/insights-runtime-extractor-65cl5" Apr 24 14:24:47.473049 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.473014 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4bbd5597-4646-4720-a0e3-cd74795416fa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-65cl5\" (UID: \"4bbd5597-4646-4720-a0e3-cd74795416fa\") " pod="openshift-insights/insights-runtime-extractor-65cl5" Apr 24 14:24:47.474796 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.474774 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/4bbd5597-4646-4720-a0e3-cd74795416fa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-65cl5\" (UID: \"4bbd5597-4646-4720-a0e3-cd74795416fa\") " pod="openshift-insights/insights-runtime-extractor-65cl5" Apr 24 14:24:47.475553 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.474804 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xczpz\" (UniqueName: \"kubernetes.io/projected/4bbd5597-4646-4720-a0e3-cd74795416fa-kube-api-access-xczpz\") pod \"insights-runtime-extractor-65cl5\" (UID: \"4bbd5597-4646-4720-a0e3-cd74795416fa\") " pod="openshift-insights/insights-runtime-extractor-65cl5" Apr 24 14:24:47.530977 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:47.530940 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-65cl5" Apr 24 14:24:48.555223 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:48.555196 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-65cl5"] Apr 24 14:24:48.558640 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:48.558609 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bbd5597_4646_4720_a0e3_cd74795416fa.slice/crio-871bbc7c7fd2055c6dbb056e6996e0b5faf39d8f697c6ae66b4c342303a1c353 WatchSource:0}: Error finding container 871bbc7c7fd2055c6dbb056e6996e0b5faf39d8f697c6ae66b4c342303a1c353: Status 404 returned error can't find the container with id 871bbc7c7fd2055c6dbb056e6996e0b5faf39d8f697c6ae66b4c342303a1c353 Apr 24 14:24:49.034155 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.034120 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kkfvn" event={"ID":"74ac045c-eb3d-44e5-b6cb-0da4951109bb","Type":"ContainerStarted","Data":"e333b7cc6ba0ab61be792ef941ff74317b8be6c426b7514c9111b14b025f67a7"} Apr 24 14:24:49.035695 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.035664 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-q6p5b" event={"ID":"4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a","Type":"ContainerStarted","Data":"0ab44560cf02f0a427e0856ff0fd020c235683053454de258559fc760b885db1"} Apr 24 14:24:49.035819 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.035698 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-q6p5b" event={"ID":"4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a","Type":"ContainerStarted","Data":"b26fb6f6f6b4ea259c1932e86c7484d5708e1a67ad2e3d17e0187d3df366ae99"} Apr 24 14:24:49.035819 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.035739 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-q6p5b" Apr 24 14:24:49.037108 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.037088 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-65cl5" event={"ID":"4bbd5597-4646-4720-a0e3-cd74795416fa","Type":"ContainerStarted","Data":"f21b842376ff39c2453e4c3abacb8f114a70273e8acc142a065c1ef36ece1ab3"} Apr 24 14:24:49.037108 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.037109 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-65cl5" event={"ID":"4bbd5597-4646-4720-a0e3-cd74795416fa","Type":"ContainerStarted","Data":"871bbc7c7fd2055c6dbb056e6996e0b5faf39d8f697c6ae66b4c342303a1c353"} Apr 24 14:24:49.048051 
ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.048005 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kkfvn" podStartSLOduration=1.68551128 podStartE2EDuration="4.047972774s" podCreationTimestamp="2026-04-24 14:24:45 +0000 UTC" firstStartedPulling="2026-04-24 14:24:46.051091576 +0000 UTC m=+38.845148455" lastFinishedPulling="2026-04-24 14:24:48.413553062 +0000 UTC m=+41.207609949" observedRunningTime="2026-04-24 14:24:49.047257155 +0000 UTC m=+41.841314058" watchObservedRunningTime="2026-04-24 14:24:49.047972774 +0000 UTC m=+41.842029675" Apr 24 14:24:49.063776 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.063721 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-q6p5b" podStartSLOduration=1.688123466 podStartE2EDuration="4.063702705s" podCreationTimestamp="2026-04-24 14:24:45 +0000 UTC" firstStartedPulling="2026-04-24 14:24:46.034127605 +0000 UTC m=+38.828184484" lastFinishedPulling="2026-04-24 14:24:48.40970683 +0000 UTC m=+41.203763723" observedRunningTime="2026-04-24 14:24:49.063061586 +0000 UTC m=+41.857118488" watchObservedRunningTime="2026-04-24 14:24:49.063702705 +0000 UTC m=+41.857759607" Apr 24 14:24:49.840013 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.839986 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98"] Apr 24 14:24:49.871613 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.871587 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98"] Apr 24 14:24:49.871613 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.871615 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-6zmm6"] Apr 24 14:24:49.871802 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.871743 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" Apr 24 14:24:49.873745 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.873722 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 14:24:49.873877 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.873727 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 14:24:49.873877 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.873755 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-xfxlk\"" Apr 24 14:24:49.874066 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.874049 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 14:24:49.874136 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.874053 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 14:24:49.874386 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.874369 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 14:24:49.896716 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.896682 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-27q6z"] Apr 24 14:24:49.896849 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.896833 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:49.899033 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.898999 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 14:24:49.900773 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.900658 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 14:24:49.901531 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.901404 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-72grf\"" Apr 24 14:24:49.902888 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.902863 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 14:24:49.914979 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.914959 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-27q6z"] Apr 24 14:24:49.915139 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.915126 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:49.917067 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.917051 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 24 14:24:49.917198 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.917179 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 24 14:24:49.917287 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.917249 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 24 14:24:49.917374 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.917361 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-mbk2l\"" Apr 24 14:24:49.975742 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.975704 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j79tv\" (UniqueName: \"kubernetes.io/projected/a58cee7f-bde9-4e5c-9153-311121f9b806-kube-api-access-j79tv\") pod \"openshift-state-metrics-9d44df66c-r7s98\" (UID: \"a58cee7f-bde9-4e5c-9153-311121f9b806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" Apr 24 14:24:49.975934 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.975760 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a58cee7f-bde9-4e5c-9153-311121f9b806-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-r7s98\" (UID: \"a58cee7f-bde9-4e5c-9153-311121f9b806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" Apr 24 14:24:49.975934 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.975844 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a58cee7f-bde9-4e5c-9153-311121f9b806-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-r7s98\" (UID: \"a58cee7f-bde9-4e5c-9153-311121f9b806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" Apr 24 14:24:49.975934 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:49.975924 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a58cee7f-bde9-4e5c-9153-311121f9b806-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-r7s98\" (UID: \"a58cee7f-bde9-4e5c-9153-311121f9b806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" Apr 24 14:24:50.042317 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.042218 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-65cl5" event={"ID":"4bbd5597-4646-4720-a0e3-cd74795416fa","Type":"ContainerStarted","Data":"00e7d944c7bff6dd7761020820f5c931fe2afee81e1823659c547b8086a0b14f"} Apr 24 14:24:50.076917 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.076866 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a575d177-9774-44af-890a-2397d9ddba99-metrics-client-ca\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.077081 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.076928 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l976h\" (UniqueName: \"kubernetes.io/projected/a575d177-9774-44af-890a-2397d9ddba99-kube-api-access-l976h\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.077081 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.076973 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j79tv\" (UniqueName: \"kubernetes.io/projected/a58cee7f-bde9-4e5c-9153-311121f9b806-kube-api-access-j79tv\") pod \"openshift-state-metrics-9d44df66c-r7s98\" (UID: \"a58cee7f-bde9-4e5c-9153-311121f9b806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" Apr 24 14:24:50.077081 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.076992 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3026d04a-ab27-47dd-8ae1-01c9844e4de6-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.077081 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.077020 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3026d04a-ab27-47dd-8ae1-01c9844e4de6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.077081 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.077055 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3026d04a-ab27-47dd-8ae1-01c9844e4de6-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.077283 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.077141 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a575d177-9774-44af-890a-2397d9ddba99-node-exporter-wtmp\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.077283 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.077220 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a575d177-9774-44af-890a-2397d9ddba99-sys\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.077283 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.077250 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a58cee7f-bde9-4e5c-9153-311121f9b806-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-r7s98\" (UID: \"a58cee7f-bde9-4e5c-9153-311121f9b806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" Apr 24 14:24:50.077283 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.077271 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3026d04a-ab27-47dd-8ae1-01c9844e4de6-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.077454 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.077293 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz6j5\" (UniqueName: \"kubernetes.io/projected/3026d04a-ab27-47dd-8ae1-01c9844e4de6-kube-api-access-cz6j5\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.077454 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.077311 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a575d177-9774-44af-890a-2397d9ddba99-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.077454 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.077330 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a575d177-9774-44af-890a-2397d9ddba99-node-exporter-textfile\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.077454 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.077351 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a575d177-9774-44af-890a-2397d9ddba99-node-exporter-accelerators-collector-config\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.077454 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.077379 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a58cee7f-bde9-4e5c-9153-311121f9b806-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-r7s98\" (UID: \"a58cee7f-bde9-4e5c-9153-311121f9b806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" Apr 24 14:24:50.077454 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.077446 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a58cee7f-bde9-4e5c-9153-311121f9b806-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-r7s98\" (UID: \"a58cee7f-bde9-4e5c-9153-311121f9b806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" Apr 
24 14:24:50.077744 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.077479 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3026d04a-ab27-47dd-8ae1-01c9844e4de6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.077744 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.077502 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a575d177-9774-44af-890a-2397d9ddba99-root\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.077744 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.077530 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a575d177-9774-44af-890a-2397d9ddba99-node-exporter-tls\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.077744 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:50.077651 2569 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 24 14:24:50.077744 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:24:50.077727 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a58cee7f-bde9-4e5c-9153-311121f9b806-openshift-state-metrics-tls podName:a58cee7f-bde9-4e5c-9153-311121f9b806 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:50.57770753 +0000 UTC m=+43.371764424 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/a58cee7f-bde9-4e5c-9153-311121f9b806-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-r7s98" (UID: "a58cee7f-bde9-4e5c-9153-311121f9b806") : secret "openshift-state-metrics-tls" not found Apr 24 14:24:50.078305 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.078146 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a58cee7f-bde9-4e5c-9153-311121f9b806-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-r7s98\" (UID: \"a58cee7f-bde9-4e5c-9153-311121f9b806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" Apr 24 14:24:50.079909 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.079867 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a58cee7f-bde9-4e5c-9153-311121f9b806-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-r7s98\" (UID: \"a58cee7f-bde9-4e5c-9153-311121f9b806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" Apr 24 14:24:50.085809 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.085786 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j79tv\" (UniqueName: \"kubernetes.io/projected/a58cee7f-bde9-4e5c-9153-311121f9b806-kube-api-access-j79tv\") pod \"openshift-state-metrics-9d44df66c-r7s98\" (UID: \"a58cee7f-bde9-4e5c-9153-311121f9b806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" Apr 24 14:24:50.178119 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178076 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3026d04a-ab27-47dd-8ae1-01c9844e4de6-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.178297 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178141 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a575d177-9774-44af-890a-2397d9ddba99-node-exporter-wtmp\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.178297 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178198 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a575d177-9774-44af-890a-2397d9ddba99-sys\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.178297 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178222 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3026d04a-ab27-47dd-8ae1-01c9844e4de6-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.178297 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178251 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cz6j5\" (UniqueName: \"kubernetes.io/projected/3026d04a-ab27-47dd-8ae1-01c9844e4de6-kube-api-access-cz6j5\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.178297 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178276 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a575d177-9774-44af-890a-2397d9ddba99-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.178562 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178290 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a575d177-9774-44af-890a-2397d9ddba99-sys\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.178562 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178305 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a575d177-9774-44af-890a-2397d9ddba99-node-exporter-textfile\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.178562 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178332 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a575d177-9774-44af-890a-2397d9ddba99-node-exporter-accelerators-collector-config\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.178562 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178354 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a575d177-9774-44af-890a-2397d9ddba99-node-exporter-wtmp\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.178562 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178451 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3026d04a-ab27-47dd-8ae1-01c9844e4de6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.178562 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178483 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a575d177-9774-44af-890a-2397d9ddba99-root\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.178562 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178510 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a575d177-9774-44af-890a-2397d9ddba99-node-exporter-tls\") pod 
\"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.178562 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178542 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a575d177-9774-44af-890a-2397d9ddba99-metrics-client-ca\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.178976 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178572 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l976h\" (UniqueName: \"kubernetes.io/projected/a575d177-9774-44af-890a-2397d9ddba99-kube-api-access-l976h\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.178976 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178613 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a575d177-9774-44af-890a-2397d9ddba99-root\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.178976 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178627 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3026d04a-ab27-47dd-8ae1-01c9844e4de6-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.178976 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178659 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3026d04a-ab27-47dd-8ae1-01c9844e4de6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.178976 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.178753 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a575d177-9774-44af-890a-2397d9ddba99-node-exporter-textfile\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.179284 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.179178 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3026d04a-ab27-47dd-8ae1-01c9844e4de6-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.179284 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.179246 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3026d04a-ab27-47dd-8ae1-01c9844e4de6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.179284 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.179256 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3026d04a-ab27-47dd-8ae1-01c9844e4de6-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.179447 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.179428 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a575d177-9774-44af-890a-2397d9ddba99-metrics-client-ca\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.179732 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.179701 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a575d177-9774-44af-890a-2397d9ddba99-node-exporter-accelerators-collector-config\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.181207 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.181188 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3026d04a-ab27-47dd-8ae1-01c9844e4de6-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.181330 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.181312 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a575d177-9774-44af-890a-2397d9ddba99-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.181448 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.181427 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3026d04a-ab27-47dd-8ae1-01c9844e4de6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.181659 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.181643 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a575d177-9774-44af-890a-2397d9ddba99-node-exporter-tls\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.185719 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.185692 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l976h\" (UniqueName: \"kubernetes.io/projected/a575d177-9774-44af-890a-2397d9ddba99-kube-api-access-l976h\") pod \"node-exporter-6zmm6\" (UID: \"a575d177-9774-44af-890a-2397d9ddba99\") " pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.185808 ip-10-0-129-231 kubenswrapper[2569]: 
I0424 14:24:50.185787 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz6j5\" (UniqueName: \"kubernetes.io/projected/3026d04a-ab27-47dd-8ae1-01c9844e4de6-kube-api-access-cz6j5\") pod \"kube-state-metrics-69db897b98-27q6z\" (UID: \"3026d04a-ab27-47dd-8ae1-01c9844e4de6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.209828 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.209794 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6zmm6" Apr 24 14:24:50.219190 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:50.219162 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda575d177_9774_44af_890a_2397d9ddba99.slice/crio-5018466bb76ceee45b8df57f3c80d6d491a45f22b208f053247979f79d29590a WatchSource:0}: Error finding container 5018466bb76ceee45b8df57f3c80d6d491a45f22b208f053247979f79d29590a: Status 404 returned error can't find the container with id 5018466bb76ceee45b8df57f3c80d6d491a45f22b208f053247979f79d29590a Apr 24 14:24:50.223153 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.223136 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" Apr 24 14:24:50.342974 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.342931 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-27q6z"] Apr 24 14:24:50.355143 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:50.355113 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3026d04a_ab27_47dd_8ae1_01c9844e4de6.slice/crio-3c2be695094ecb859332ce35023b907d4b54cabee62e84c613fb0de6f2c5eee5 WatchSource:0}: Error finding container 3c2be695094ecb859332ce35023b907d4b54cabee62e84c613fb0de6f2c5eee5: Status 404 returned error can't find the container with id 3c2be695094ecb859332ce35023b907d4b54cabee62e84c613fb0de6f2c5eee5 Apr 24 14:24:50.582729 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.582649 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a58cee7f-bde9-4e5c-9153-311121f9b806-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-r7s98\" (UID: \"a58cee7f-bde9-4e5c-9153-311121f9b806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" Apr 24 14:24:50.585799 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.585767 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a58cee7f-bde9-4e5c-9153-311121f9b806-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-r7s98\" (UID: \"a58cee7f-bde9-4e5c-9153-311121f9b806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" Apr 24 14:24:50.781370 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.781328 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" Apr 24 14:24:50.921566 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.921531 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98"] Apr 24 14:24:50.955517 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.955486 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 14:24:50.987380 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.987354 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 14:24:50.987536 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.987513 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:50.989546 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.989521 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 14:24:50.989753 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.989737 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 14:24:50.990022 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.989970 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 14:24:50.990022 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.990008 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 14:24:50.990884 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.990862 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 14:24:50.991013 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.990864 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 14:24:50.991077 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.991030 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-ct8ch\"" Apr 24 14:24:50.991268 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.991252 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 14:24:50.991361 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.991253 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 14:24:50.993504 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:50.993489 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 14:24:51.046320 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.046281 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" event={"ID":"3026d04a-ab27-47dd-8ae1-01c9844e4de6","Type":"ContainerStarted","Data":"3c2be695094ecb859332ce35023b907d4b54cabee62e84c613fb0de6f2c5eee5"} Apr 24 14:24:51.047656 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.047626 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/node-exporter-6zmm6" event={"ID":"a575d177-9774-44af-890a-2397d9ddba99","Type":"ContainerStarted","Data":"5018466bb76ceee45b8df57f3c80d6d491a45f22b208f053247979f79d29590a"} Apr 24 14:24:51.087324 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.087288 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.087324 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.087326 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mp57\" (UniqueName: \"kubernetes.io/projected/491cc533-cb26-42c1-aaaf-2a211675e0de-kube-api-access-8mp57\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.087565 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.087412 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/491cc533-cb26-42c1-aaaf-2a211675e0de-tls-assets\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.087565 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.087486 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-web-config\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.087636 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.087570 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/491cc533-cb26-42c1-aaaf-2a211675e0de-config-out\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.087636 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.087588 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/491cc533-cb26-42c1-aaaf-2a211675e0de-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.087636 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.087609 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.087737 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.087706 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy\") pod 
\"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.087776 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.087750 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.087821 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.087807 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/491cc533-cb26-42c1-aaaf-2a211675e0de-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.087872 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.087851 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.088017 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.087909 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-config-volume\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.088017 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.087933 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491cc533-cb26-42c1-aaaf-2a211675e0de-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.135163 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:51.135076 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda58cee7f_bde9_4e5c_9153_311121f9b806.slice/crio-80098c1b9645bb126155bb786925f5f7048157249f725ce12b575798f0006daa WatchSource:0}: Error finding container 80098c1b9645bb126155bb786925f5f7048157249f725ce12b575798f0006daa: Status 404 returned error can't find the container with id 80098c1b9645bb126155bb786925f5f7048157249f725ce12b575798f0006daa Apr 24 14:24:51.188566 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.188529 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/491cc533-cb26-42c1-aaaf-2a211675e0de-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.188729 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.188575 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.188729 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.188594 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-config-volume\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.188729 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.188612 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491cc533-cb26-42c1-aaaf-2a211675e0de-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.188729 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.188642 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.188729 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.188664 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mp57\" (UniqueName: \"kubernetes.io/projected/491cc533-cb26-42c1-aaaf-2a211675e0de-kube-api-access-8mp57\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.188729 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.188698 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/491cc533-cb26-42c1-aaaf-2a211675e0de-tls-assets\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.188729 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.188722 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-web-config\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.189082 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.188756 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/491cc533-cb26-42c1-aaaf-2a211675e0de-config-out\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.189082 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.188779 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/491cc533-cb26-42c1-aaaf-2a211675e0de-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.189082 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.188807 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.189082 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.188882 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.189082 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.188926 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.189413 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.189385 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/491cc533-cb26-42c1-aaaf-2a211675e0de-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.189985 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.189955 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491cc533-cb26-42c1-aaaf-2a211675e0de-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.190874 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.190801 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/491cc533-cb26-42c1-aaaf-2a211675e0de-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.192665 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.192622 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-config-volume\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.194284 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.193239 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.194284 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.193432 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/491cc533-cb26-42c1-aaaf-2a211675e0de-config-out\") pod 
\"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.194284 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.193811 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.194284 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.194250 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.194520 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.194327 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/491cc533-cb26-42c1-aaaf-2a211675e0de-tls-assets\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.194520 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.194447 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-web-config\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.195300 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.195274 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.195657 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.195633 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.198529 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.198508 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mp57\" (UniqueName: \"kubernetes.io/projected/491cc533-cb26-42c1-aaaf-2a211675e0de-kube-api-access-8mp57\") pod \"alertmanager-main-0\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:24:51.298095 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:51.298054 2569 util.go:30] "No sandbox for pod can be found. 
Apr 24 14:24:52.051869 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:52.051783 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" event={"ID":"a58cee7f-bde9-4e5c-9153-311121f9b806","Type":"ContainerStarted","Data":"80098c1b9645bb126155bb786925f5f7048157249f725ce12b575798f0006daa"} Apr 24 14:24:52.700547 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:52.700522 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 14:24:52.704962 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:52.704932 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491cc533_cb26_42c1_aaaf_2a211675e0de.slice/crio-073380d038718483246203fed0b7ec691b6ed1e122a25a6bf5e7b33de4343c22 WatchSource:0}: Error finding container 073380d038718483246203fed0b7ec691b6ed1e122a25a6bf5e7b33de4343c22: Status 404 returned error can't find the container with id 073380d038718483246203fed0b7ec691b6ed1e122a25a6bf5e7b33de4343c22 Apr 24 14:24:53.059424 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:53.059380 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" event={"ID":"a58cee7f-bde9-4e5c-9153-311121f9b806","Type":"ContainerStarted","Data":"b5fdad2b082a7579094075371cdad227addf1932b91487abd30a4f52e2b87840"} Apr 24 14:24:53.059424 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:53.059425 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" event={"ID":"a58cee7f-bde9-4e5c-9153-311121f9b806","Type":"ContainerStarted","Data":"20dc1252e3f71fa03842a368b0a8f5cc992792925494a43e70dac881274589d4"} Apr 24 14:24:53.061212 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:53.061179 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-65cl5" event={"ID":"4bbd5597-4646-4720-a0e3-cd74795416fa","Type":"ContainerStarted","Data":"e91f666a434888ab63dfac298272359724ddcd0e507be02382555a81e2855cb5"} Apr 24 14:24:53.062633 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:53.062607 2569 generic.go:358] "Generic (PLEG): container finished" podID="a575d177-9774-44af-890a-2397d9ddba99" containerID="acf1b97e6c7574a470814ffe8880eb8e648e6b6212ed5ab97505c91b9d0fc2b8" exitCode=0 Apr 24 14:24:53.062750 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:53.062668 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6zmm6" event={"ID":"a575d177-9774-44af-890a-2397d9ddba99","Type":"ContainerDied","Data":"acf1b97e6c7574a470814ffe8880eb8e648e6b6212ed5ab97505c91b9d0fc2b8"} Apr 24 14:24:53.063820 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:53.063796 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"491cc533-cb26-42c1-aaaf-2a211675e0de","Type":"ContainerStarted","Data":"073380d038718483246203fed0b7ec691b6ed1e122a25a6bf5e7b33de4343c22"} Apr 24 14:24:53.065642 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:53.065617 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" event={"ID":"3026d04a-ab27-47dd-8ae1-01c9844e4de6","Type":"ContainerStarted","Data":"53b6a93ff3535a03f36bccceba9a15221e27f1b7afec0b905776f1a5154a41a1"} Apr 24 14:24:53.065748
ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:53.065644 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" event={"ID":"3026d04a-ab27-47dd-8ae1-01c9844e4de6","Type":"ContainerStarted","Data":"369912f83009db465fec67e5737f9d974468889cdac426e4642a5084d8979083"} Apr 24 14:24:53.065748 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:53.065659 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" event={"ID":"3026d04a-ab27-47dd-8ae1-01c9844e4de6","Type":"ContainerStarted","Data":"26601ad14ec54fee881b11393fbbbc59d1523d7a1cab7abe9fd39b5f15e80b13"} Apr 24 14:24:53.079031 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:53.078978 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-65cl5" podStartSLOduration=2.209857797 podStartE2EDuration="6.078966538s" podCreationTimestamp="2026-04-24 14:24:47 +0000 UTC" firstStartedPulling="2026-04-24 14:24:48.697415076 +0000 UTC m=+41.491471961" lastFinishedPulling="2026-04-24 14:24:52.566523809 +0000 UTC m=+45.360580702" observedRunningTime="2026-04-24 14:24:53.077865503 +0000 UTC m=+45.871922418" watchObservedRunningTime="2026-04-24 14:24:53.078966538 +0000 UTC m=+45.873023438" Apr 24 14:24:53.117534 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:53.117482 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-27q6z" podStartSLOduration=1.907234598 podStartE2EDuration="4.117467455s" podCreationTimestamp="2026-04-24 14:24:49 +0000 UTC" firstStartedPulling="2026-04-24 14:24:50.356915312 +0000 UTC m=+43.150972194" lastFinishedPulling="2026-04-24 14:24:52.567148155 +0000 UTC m=+45.361205051" observedRunningTime="2026-04-24 14:24:53.093811396 +0000 UTC m=+45.887868306" watchObservedRunningTime="2026-04-24 14:24:53.117467455 +0000 UTC m=+45.911524350" Apr 24 14:24:54.070254 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:54.070214 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6zmm6" event={"ID":"a575d177-9774-44af-890a-2397d9ddba99","Type":"ContainerStarted","Data":"0899b8ff8470e4809171476d7dc6659caae1885bef7cef9e27d8e39227f246f7"} Apr 24 14:24:54.070254 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:54.070260 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6zmm6" event={"ID":"a575d177-9774-44af-890a-2397d9ddba99","Type":"ContainerStarted","Data":"88bd230fc35e95b2d9d1136d07abdcc0895d60721f20485cb15f0921982a7229"} Apr 24 14:24:54.099505 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:54.099454 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-6zmm6" podStartSLOduration=2.753147789 podStartE2EDuration="5.099440201s" podCreationTimestamp="2026-04-24 14:24:49 +0000 UTC" firstStartedPulling="2026-04-24 14:24:50.220744515 +0000 UTC m=+43.014801395" lastFinishedPulling="2026-04-24 14:24:52.567036916 +0000 UTC m=+45.361093807" observedRunningTime="2026-04-24 14:24:54.099146011 +0000 UTC m=+46.893202915" watchObservedRunningTime="2026-04-24 14:24:54.099440201 +0000 UTC m=+46.893497099" Apr 24 14:24:55.074425 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:55.074387 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" 
event={"ID":"a58cee7f-bde9-4e5c-9153-311121f9b806","Type":"ContainerStarted","Data":"2e7c60b9c02ac71aeec23e3231586b1ca01a83453d0fca71d16225889a32d85d"} Apr 24 14:24:55.075793 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:55.075766 2569 generic.go:358] "Generic (PLEG): container finished" podID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerID="49f9471e6ef925d3fb657dc06e62d2d6e136d99e805473912cc84a23a309c8b0" exitCode=0 Apr 24 14:24:55.075883 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:55.075848 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"491cc533-cb26-42c1-aaaf-2a211675e0de","Type":"ContainerDied","Data":"49f9471e6ef925d3fb657dc06e62d2d6e136d99e805473912cc84a23a309c8b0"} Apr 24 14:24:55.091507 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:55.091463 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r7s98" podStartSLOduration=4.350699963 podStartE2EDuration="6.091451203s" podCreationTimestamp="2026-04-24 14:24:49 +0000 UTC" firstStartedPulling="2026-04-24 14:24:52.776613224 +0000 UTC m=+45.570670103" lastFinishedPulling="2026-04-24 14:24:54.517364451 +0000 UTC m=+47.311421343" observedRunningTime="2026-04-24 14:24:55.090187881 +0000 UTC m=+47.884244819" watchObservedRunningTime="2026-04-24 14:24:55.091451203 +0000 UTC m=+47.885508104" Apr 24 14:24:56.077133 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.077105 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:24:56.096809 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.096777 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:24:56.097247 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.097223 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.099649 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.099628 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 14:24:56.099767 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.099740 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 14:24:56.099825 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.099811 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 14:24:56.099883 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.099740 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 14:24:56.100252 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.100090 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 14:24:56.100252 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.100152 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 14:24:56.100529 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.100511 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 14:24:56.100794 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.100756 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 14:24:56.100887 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.100851 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-83smvvk6e5q42\"" Apr 24 14:24:56.100974 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.100759 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 14:24:56.101362 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.101341 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-cb6hn\"" Apr 24 14:24:56.101362 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.101360 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 14:24:56.101626 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.101592 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 14:24:56.105027 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.104364 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 14:24:56.107376 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.107351 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 14:24:56.232196 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232164 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" 
(UniqueName: \"kubernetes.io/empty-dir/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.232373 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232221 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.232373 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232249 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.232373 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232290 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.232509 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232401 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-web-config\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.232509 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232457 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/458b1119-d234-4d63-972b-ba4c4df4cf92-config-out\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.232509 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232493 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.232648 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232542 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.232648 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232575 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-config\") pod \"prometheus-k8s-0\" (UID: 
\"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.232648 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232600 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpdpf\" (UniqueName: \"kubernetes.io/projected/458b1119-d234-4d63-972b-ba4c4df4cf92-kube-api-access-bpdpf\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.232648 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232628 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/458b1119-d234-4d63-972b-ba4c4df4cf92-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.232781 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232658 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.232781 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232698 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.232781 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232735 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.232781 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232761 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.232968 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232803 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.232968 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232880 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.232968 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.232914 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.333641 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.333552 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-config\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.333641 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.333603 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpdpf\" (UniqueName: \"kubernetes.io/projected/458b1119-d234-4d63-972b-ba4c4df4cf92-kube-api-access-bpdpf\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.333641 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.333632 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/458b1119-d234-4d63-972b-ba4c4df4cf92-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.333941 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.333662 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.333941 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.333694 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.333941 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.333724 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.334231 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.334186 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.334358 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.334242 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.334358 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.334307 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.334358 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.334338 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.334523 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.334429 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.334523 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.334467 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.334523 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.334498 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.334668 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.334544 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.334668 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.334586 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-web-config\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.334668 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.334613 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/458b1119-d234-4d63-972b-ba4c4df4cf92-config-out\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
14:24:56.334668 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.334639 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.334862 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.334675 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.334964 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.334937 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.336680 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.336650 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/458b1119-d234-4d63-972b-ba4c4df4cf92-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.336847 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.336824 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.337276 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.337156 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.337390 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.337286 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-config\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.337488 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.337464 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.337965 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.337610 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-metrics-client-certs\") pod 
\"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.338350 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.338246 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.338842 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.338642 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.339522 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.339415 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.339804 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.339782 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.339996 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.339973 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.340485 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.340461 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.340571 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.340515 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/458b1119-d234-4d63-972b-ba4c4df4cf92-config-out\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.340983 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.340964 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-web-config\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.342299 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.342277 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpdpf\" 
(UniqueName: \"kubernetes.io/projected/458b1119-d234-4d63-972b-ba4c4df4cf92-kube-api-access-bpdpf\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.356749 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.356724 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.365075 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.365052 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.409165 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.409125 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:24:56.706747 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:56.706715 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:24:56.716154 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:24:56.716112 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458b1119_d234_4d63_972b_ba4c4df4cf92.slice/crio-fdf700e930fcb436939d7f566b1893a3d847b81cfc0ffd9ba1e22a34669d661d WatchSource:0}: Error finding container fdf700e930fcb436939d7f566b1893a3d847b81cfc0ffd9ba1e22a34669d661d: Status 404 returned error can't find the container with id fdf700e930fcb436939d7f566b1893a3d847b81cfc0ffd9ba1e22a34669d661d Apr 24 14:24:57.084832 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:57.084799 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"491cc533-cb26-42c1-aaaf-2a211675e0de","Type":"ContainerStarted","Data":"c7a964fde45a4895e5a34345a546137dd8086f9194dd4ea418414bf0b1467cc0"} Apr 24 14:24:57.084832 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:57.084835 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"491cc533-cb26-42c1-aaaf-2a211675e0de","Type":"ContainerStarted","Data":"c668e2c9ac7c932d990d3860ff99dad8b851d33eac39f488e4e42233c783ef92"} Apr 24 14:24:57.086187 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:57.086167 2569 generic.go:358] "Generic (PLEG): container finished" podID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerID="6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d" exitCode=0 Apr 24 14:24:57.086250 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:57.086237 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"458b1119-d234-4d63-972b-ba4c4df4cf92","Type":"ContainerDied","Data":"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d"} Apr 24 14:24:57.086326 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:57.086256 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"458b1119-d234-4d63-972b-ba4c4df4cf92","Type":"ContainerStarted","Data":"fdf700e930fcb436939d7f566b1893a3d847b81cfc0ffd9ba1e22a34669d661d"} Apr 24 14:24:58.094587 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:58.094541 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"491cc533-cb26-42c1-aaaf-2a211675e0de","Type":"ContainerStarted","Data":"bbe9a56f4cdce8305ef23e22e8d9ba0d08e4356652074d5b7726438407909c33"} Apr 24 14:24:58.095145 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:58.094598 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"491cc533-cb26-42c1-aaaf-2a211675e0de","Type":"ContainerStarted","Data":"23c23fe6692cb90e4210e73f641dc1e7631d0cfc28cfbd1c8b8909e09017bffb"} Apr 24 14:24:58.095145 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:58.094615 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"491cc533-cb26-42c1-aaaf-2a211675e0de","Type":"ContainerStarted","Data":"6cc20606a4cb7189d0ca7233ac05c795badbe675a1dcc9e3a45796eb94b8952d"} Apr 24 14:24:59.045327 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:59.045295 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-q6p5b" Apr 24 14:24:59.102427 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:59.102395 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"491cc533-cb26-42c1-aaaf-2a211675e0de","Type":"ContainerStarted","Data":"a69310c8821faf4db6c7807d9e2dfcaa71aa29b69a4cd6cc137858d17890b7cf"} Apr 24 14:24:59.126487 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:24:59.126428 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.664796795 podStartE2EDuration="9.126410754s" podCreationTimestamp="2026-04-24 14:24:50 +0000 UTC" firstStartedPulling="2026-04-24 14:24:52.707448861 +0000 UTC m=+45.501505743" lastFinishedPulling="2026-04-24 14:24:58.16906281 +0000 UTC m=+50.963119702" observedRunningTime="2026-04-24 14:24:59.125264053 +0000 UTC m=+51.919320957" watchObservedRunningTime="2026-04-24 14:24:59.126410754 +0000 UTC m=+51.920467695" Apr 24 14:25:01.110274 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:01.110236 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"458b1119-d234-4d63-972b-ba4c4df4cf92","Type":"ContainerStarted","Data":"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647"} Apr 24 14:25:01.110274 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:01.110274 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"458b1119-d234-4d63-972b-ba4c4df4cf92","Type":"ContainerStarted","Data":"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64"} Apr 24 14:25:03.121009 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:03.120916 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"458b1119-d234-4d63-972b-ba4c4df4cf92","Type":"ContainerStarted","Data":"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f"} Apr 24 14:25:03.121009 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:03.120955 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"458b1119-d234-4d63-972b-ba4c4df4cf92","Type":"ContainerStarted","Data":"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08"} Apr 24 14:25:03.121009 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:03.120966 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"458b1119-d234-4d63-972b-ba4c4df4cf92","Type":"ContainerStarted","Data":"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06"} Apr 24 14:25:03.121009 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:03.120981 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"458b1119-d234-4d63-972b-ba4c4df4cf92","Type":"ContainerStarted","Data":"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c"} Apr 24 14:25:03.147474 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:03.147410 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.531403244 podStartE2EDuration="7.147392647s" podCreationTimestamp="2026-04-24 14:24:56 +0000 UTC" firstStartedPulling="2026-04-24 14:24:57.087369137 +0000 UTC m=+49.881426016" lastFinishedPulling="2026-04-24 14:25:02.703358537 +0000 UTC m=+55.497415419" observedRunningTime="2026-04-24 14:25:03.146445357 +0000 UTC m=+55.940502258" watchObservedRunningTime="2026-04-24 14:25:03.147392647 +0000 UTC m=+55.941449547" Apr 24 14:25:05.012254 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:05.012225 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mjpgb" Apr 24 14:25:06.409705 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:06.409667 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:25:12.482785 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:12.482727 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs\") pod \"network-metrics-daemon-d47r2\" (UID: \"496c729f-9eee-4311-8fe8-4502d4af37f8\") " pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:25:12.484833 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:12.484817 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 14:25:12.495615 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:12.495580 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/496c729f-9eee-4311-8fe8-4502d4af37f8-metrics-certs\") pod \"network-metrics-daemon-d47r2\" (UID: \"496c729f-9eee-4311-8fe8-4502d4af37f8\") " pod="openshift-multus/network-metrics-daemon-d47r2" Apr 24 14:25:12.646648 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:12.646616 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p2dnd\"" Apr 24 14:25:12.655501 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:12.655472 2569 util.go:30] "No sandbox for pod can be found. 
Apr 24 14:25:12.683530 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:12.683496 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxkqx\" (UniqueName: \"kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx\") pod \"network-check-target-tzvkp\" (UID: \"66892658-4db6-4064-b52f-60baa00dcc6d\") " pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:25:12.685872 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:12.685852 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 14:25:12.696414 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:12.696390 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 14:25:12.708462 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:12.708431 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxkqx\" (UniqueName: \"kubernetes.io/projected/66892658-4db6-4064-b52f-60baa00dcc6d-kube-api-access-jxkqx\") pod \"network-check-target-tzvkp\" (UID: \"66892658-4db6-4064-b52f-60baa00dcc6d\") " pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:25:12.781226 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:12.781193 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d47r2"] Apr 24 14:25:12.785639 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:25:12.785611 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod496c729f_9eee_4311_8fe8_4502d4af37f8.slice/crio-37635ae1819a90ce032748e18f5f938581ce342a89e461f2524a031b5506af50 WatchSource:0}: Error finding container 37635ae1819a90ce032748e18f5f938581ce342a89e461f2524a031b5506af50: Status 404 returned error can't find the container with id 37635ae1819a90ce032748e18f5f938581ce342a89e461f2524a031b5506af50 Apr 24 14:25:12.939724 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:12.939694 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qmh6n\"" Apr 24 14:25:12.948376 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:12.948349 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:25:13.068994 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:13.068962 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tzvkp"] Apr 24 14:25:13.072795 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:25:13.072765 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66892658_4db6_4064_b52f_60baa00dcc6d.slice/crio-26cb959d420ffc34481e08eb6e61818ffec07a99dc37514992e37db256ed7708 WatchSource:0}: Error finding container 26cb959d420ffc34481e08eb6e61818ffec07a99dc37514992e37db256ed7708: Status 404 returned error can't find the container with id 26cb959d420ffc34481e08eb6e61818ffec07a99dc37514992e37db256ed7708 Apr 24 14:25:13.148999 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:13.148960 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tzvkp" event={"ID":"66892658-4db6-4064-b52f-60baa00dcc6d","Type":"ContainerStarted","Data":"26cb959d420ffc34481e08eb6e61818ffec07a99dc37514992e37db256ed7708"} Apr 24 14:25:13.149959 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:13.149930 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d47r2" event={"ID":"496c729f-9eee-4311-8fe8-4502d4af37f8","Type":"ContainerStarted","Data":"37635ae1819a90ce032748e18f5f938581ce342a89e461f2524a031b5506af50"} Apr 24 14:25:15.158361 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:15.158322 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d47r2" event={"ID":"496c729f-9eee-4311-8fe8-4502d4af37f8","Type":"ContainerStarted","Data":"1f6d11b4c9a56cf07d5e2d8af866e5101037789ce1442d289a356a1a43f13949"} Apr 24 14:25:15.158361 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:15.158361 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d47r2" event={"ID":"496c729f-9eee-4311-8fe8-4502d4af37f8","Type":"ContainerStarted","Data":"c7f301d9f9899594270155ce3bee6a47bdd2eeb63082175d06defe13968ab5e5"} Apr 24 14:25:15.173081 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:15.173033 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-d47r2" podStartSLOduration=66.802409279 podStartE2EDuration="1m8.173018544s" podCreationTimestamp="2026-04-24 14:24:07 +0000 UTC" firstStartedPulling="2026-04-24 14:25:12.787539353 +0000 UTC m=+65.581596236" lastFinishedPulling="2026-04-24 14:25:14.158148614 +0000 UTC m=+66.952205501" observedRunningTime="2026-04-24 14:25:15.171594295 +0000 UTC m=+67.965651231" watchObservedRunningTime="2026-04-24 14:25:15.173018544 +0000 UTC m=+67.967075444" Apr 24 14:25:17.166162 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:17.166124 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tzvkp" event={"ID":"66892658-4db6-4064-b52f-60baa00dcc6d","Type":"ContainerStarted","Data":"78de9dc1b9d6cff417d9a0f7a90665ef4d2ca0216c7f66467beb7b116d8c4f55"} Apr 24 14:25:17.166560 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:17.166250 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:25:17.180338 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:17.180276 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-tzvkp" podStartSLOduration=65.398411233 podStartE2EDuration="1m9.18025779s" podCreationTimestamp="2026-04-24 14:24:08 +0000 UTC" firstStartedPulling="2026-04-24 14:25:13.074678002 +0000 UTC m=+65.868734882" lastFinishedPulling="2026-04-24 14:25:16.856524546 +0000 UTC m=+69.650581439" observedRunningTime="2026-04-24 14:25:17.179592829 +0000 UTC m=+69.973649740" watchObservedRunningTime="2026-04-24 14:25:17.18025779 +0000 UTC m=+69.974314692" Apr 24 14:25:48.172444 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:25:48.172337 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-tzvkp" Apr 24 14:26:02.868300 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:02.868257 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:02.888212 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:02.888180 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:03.315449 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:03.315420 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:10.148702 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:10.148660 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 14:26:10.149553 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:10.149506 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="kube-rbac-proxy-web" containerID="cri-o://6cc20606a4cb7189d0ca7233ac05c795badbe675a1dcc9e3a45796eb94b8952d" gracePeriod=120 Apr 24 14:26:10.149706 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:10.149629 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="kube-rbac-proxy" containerID="cri-o://23c23fe6692cb90e4210e73f641dc1e7631d0cfc28cfbd1c8b8909e09017bffb" gracePeriod=120 Apr 24 14:26:10.149767 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:10.149712 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="config-reloader" containerID="cri-o://c7a964fde45a4895e5a34345a546137dd8086f9194dd4ea418414bf0b1467cc0" gracePeriod=120 Apr 24 14:26:10.149818 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:10.149759 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="alertmanager" containerID="cri-o://c668e2c9ac7c932d990d3860ff99dad8b851d33eac39f488e4e42233c783ef92" gracePeriod=120 Apr 24 14:26:10.149873 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:10.149821 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="kube-rbac-proxy-metric" containerID="cri-o://bbe9a56f4cdce8305ef23e22e8d9ba0d08e4356652074d5b7726438407909c33" gracePeriod=120 Apr 24 14:26:10.149958 ip-10-0-129-231 
kubenswrapper[2569]: I0424 14:26:10.149177 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="prom-label-proxy" containerID="cri-o://a69310c8821faf4db6c7807d9e2dfcaa71aa29b69a4cd6cc137858d17890b7cf" gracePeriod=120 Apr 24 14:26:10.322816 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:10.322779 2569 generic.go:358] "Generic (PLEG): container finished" podID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerID="a69310c8821faf4db6c7807d9e2dfcaa71aa29b69a4cd6cc137858d17890b7cf" exitCode=0 Apr 24 14:26:10.322816 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:10.322804 2569 generic.go:358] "Generic (PLEG): container finished" podID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerID="23c23fe6692cb90e4210e73f641dc1e7631d0cfc28cfbd1c8b8909e09017bffb" exitCode=0 Apr 24 14:26:10.322816 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:10.322811 2569 generic.go:358] "Generic (PLEG): container finished" podID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerID="c7a964fde45a4895e5a34345a546137dd8086f9194dd4ea418414bf0b1467cc0" exitCode=0 Apr 24 14:26:10.322816 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:10.322817 2569 generic.go:358] "Generic (PLEG): container finished" podID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerID="c668e2c9ac7c932d990d3860ff99dad8b851d33eac39f488e4e42233c783ef92" exitCode=0 Apr 24 14:26:10.323106 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:10.322851 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"491cc533-cb26-42c1-aaaf-2a211675e0de","Type":"ContainerDied","Data":"a69310c8821faf4db6c7807d9e2dfcaa71aa29b69a4cd6cc137858d17890b7cf"} Apr 24 14:26:10.323106 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:10.322906 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"491cc533-cb26-42c1-aaaf-2a211675e0de","Type":"ContainerDied","Data":"23c23fe6692cb90e4210e73f641dc1e7631d0cfc28cfbd1c8b8909e09017bffb"} Apr 24 14:26:10.323106 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:10.322922 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"491cc533-cb26-42c1-aaaf-2a211675e0de","Type":"ContainerDied","Data":"c7a964fde45a4895e5a34345a546137dd8086f9194dd4ea418414bf0b1467cc0"} Apr 24 14:26:10.323106 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:10.322933 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"491cc533-cb26-42c1-aaaf-2a211675e0de","Type":"ContainerDied","Data":"c668e2c9ac7c932d990d3860ff99dad8b851d33eac39f488e4e42233c783ef92"} Apr 24 14:26:11.328094 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.328064 2569 generic.go:358] "Generic (PLEG): container finished" podID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerID="6cc20606a4cb7189d0ca7233ac05c795badbe675a1dcc9e3a45796eb94b8952d" exitCode=0 Apr 24 14:26:11.328446 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.328132 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"491cc533-cb26-42c1-aaaf-2a211675e0de","Type":"ContainerDied","Data":"6cc20606a4cb7189d0ca7233ac05c795badbe675a1dcc9e3a45796eb94b8952d"} Apr 24 14:26:11.498880 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.498857 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
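
This is the kubelet's normal teardown path rather than a failure: the "SyncLoop DELETE" above fans out into one kuberuntime_container.go "Killing container with a grace period" record per container (gracePeriod=120 is the grace period in seconds, typically taken from the pod's terminationGracePeriodSeconds, after which a SIGKILL would follow), and each container then surfaces as a PLEG "container finished" / "ContainerDied" pair. Every alertmanager-main-0 container that reports back here exits with code 0 within about 1.2s of the kill, far inside the 120s budget. A sketch measuring that kill-to-exit latency from a dump in this format (regexes fitted to these records, illustrative only; the script name is made up):

    # kill_latency.py -- hypothetical helper: time between "Killing container"
    # and the matching PLEG "container finished" record.
    import re
    import sys
    from datetime import datetime

    TS = r'I(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)'
    KILL = re.compile(TS + r' \d+ kuberuntime_container\.go:\d+\].*?containerID="cri-o://([0-9a-f]+)"')
    DIED = re.compile(TS + r' \d+ generic\.go:\d+\].*?containerID="([0-9a-f]+)" exitCode=(\d+)')

    def clock(mmdd: str, hms: str) -> datetime:
        # klog timestamps carry month/day plus wall clock; good enough
        # for deltas within one day.
        return datetime.strptime(f"{mmdd} {hms[:15]}", "%m%d %H:%M:%S.%f")

    killed = {}
    for line in sys.stdin:
        for m in KILL.finditer(line):
            killed[m.group(3)] = clock(m.group(1), m.group(2))
        for m in DIED.finditer(line):
            if m.group(3) in killed:
                delta = clock(m.group(1), m.group(2)) - killed[m.group(3)]
                print(f"{m.group(3)[:12]} exitCode={m.group(4)} after {delta.total_seconds():.3f}s")

For the kube-rbac-proxy-web container above (6cc20606...), killed at 14:26:10.149 and reported finished at 14:26:11.328, it would print a latency of about 1.179s.
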
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:11.536019 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.535988 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/491cc533-cb26-42c1-aaaf-2a211675e0de-alertmanager-main-db\") pod \"491cc533-cb26-42c1-aaaf-2a211675e0de\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " Apr 24 14:26:11.536201 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.536033 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/491cc533-cb26-42c1-aaaf-2a211675e0de-metrics-client-ca\") pod \"491cc533-cb26-42c1-aaaf-2a211675e0de\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " Apr 24 14:26:11.536201 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.536059 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mp57\" (UniqueName: \"kubernetes.io/projected/491cc533-cb26-42c1-aaaf-2a211675e0de-kube-api-access-8mp57\") pod \"491cc533-cb26-42c1-aaaf-2a211675e0de\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " Apr 24 14:26:11.536201 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.536085 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy-metric\") pod \"491cc533-cb26-42c1-aaaf-2a211675e0de\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " Apr 24 14:26:11.536201 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.536128 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-web-config\") pod \"491cc533-cb26-42c1-aaaf-2a211675e0de\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " Apr 24 14:26:11.536201 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.536153 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-main-tls\") pod \"491cc533-cb26-42c1-aaaf-2a211675e0de\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " Apr 24 14:26:11.536201 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.536199 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-cluster-tls-config\") pod \"491cc533-cb26-42c1-aaaf-2a211675e0de\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " Apr 24 14:26:11.536512 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.536248 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/491cc533-cb26-42c1-aaaf-2a211675e0de-config-out\") pod \"491cc533-cb26-42c1-aaaf-2a211675e0de\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " Apr 24 14:26:11.536512 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.536277 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/491cc533-cb26-42c1-aaaf-2a211675e0de-tls-assets\") pod \"491cc533-cb26-42c1-aaaf-2a211675e0de\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") " Apr 24 14:26:11.536512 
Apr 24 14:26:11.536512 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.536308 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-config-volume\") pod \"491cc533-cb26-42c1-aaaf-2a211675e0de\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") "
Apr 24 14:26:11.536512 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.536335 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy\") pod \"491cc533-cb26-42c1-aaaf-2a211675e0de\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") "
Apr 24 14:26:11.536512 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.536365 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491cc533-cb26-42c1-aaaf-2a211675e0de-alertmanager-trusted-ca-bundle\") pod \"491cc533-cb26-42c1-aaaf-2a211675e0de\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") "
Apr 24 14:26:11.536512 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.536365 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/491cc533-cb26-42c1-aaaf-2a211675e0de-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "491cc533-cb26-42c1-aaaf-2a211675e0de" (UID: "491cc533-cb26-42c1-aaaf-2a211675e0de"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:26:11.536512 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.536419 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy-web\") pod \"491cc533-cb26-42c1-aaaf-2a211675e0de\" (UID: \"491cc533-cb26-42c1-aaaf-2a211675e0de\") "
Apr 24 14:26:11.536962 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.536613 2569 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/491cc533-cb26-42c1-aaaf-2a211675e0de-alertmanager-main-db\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 14:26:11.537019 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.536978 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/491cc533-cb26-42c1-aaaf-2a211675e0de-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "491cc533-cb26-42c1-aaaf-2a211675e0de" (UID: "491cc533-cb26-42c1-aaaf-2a211675e0de"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:26:11.539742 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.539716 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "491cc533-cb26-42c1-aaaf-2a211675e0de" (UID: "491cc533-cb26-42c1-aaaf-2a211675e0de"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:26:11.540968 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.540663 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "491cc533-cb26-42c1-aaaf-2a211675e0de" (UID: "491cc533-cb26-42c1-aaaf-2a211675e0de"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:26:11.540968 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.540788 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/491cc533-cb26-42c1-aaaf-2a211675e0de-config-out" (OuterVolumeSpecName: "config-out") pod "491cc533-cb26-42c1-aaaf-2a211675e0de" (UID: "491cc533-cb26-42c1-aaaf-2a211675e0de"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:26:11.540968 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.540807 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "491cc533-cb26-42c1-aaaf-2a211675e0de" (UID: "491cc533-cb26-42c1-aaaf-2a211675e0de"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:26:11.540968 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.540826 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/491cc533-cb26-42c1-aaaf-2a211675e0de-kube-api-access-8mp57" (OuterVolumeSpecName: "kube-api-access-8mp57") pod "491cc533-cb26-42c1-aaaf-2a211675e0de" (UID: "491cc533-cb26-42c1-aaaf-2a211675e0de"). InnerVolumeSpecName "kube-api-access-8mp57". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:26:11.540968 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.540922 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-config-volume" (OuterVolumeSpecName: "config-volume") pod "491cc533-cb26-42c1-aaaf-2a211675e0de" (UID: "491cc533-cb26-42c1-aaaf-2a211675e0de"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:26:11.541363 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.541009 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "491cc533-cb26-42c1-aaaf-2a211675e0de" (UID: "491cc533-cb26-42c1-aaaf-2a211675e0de"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:26:11.542328 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.542304 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/491cc533-cb26-42c1-aaaf-2a211675e0de-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "491cc533-cb26-42c1-aaaf-2a211675e0de" (UID: "491cc533-cb26-42c1-aaaf-2a211675e0de"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:26:11.543745 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.543721 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "491cc533-cb26-42c1-aaaf-2a211675e0de" (UID: "491cc533-cb26-42c1-aaaf-2a211675e0de"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:26:11.550615 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.550592 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-web-config" (OuterVolumeSpecName: "web-config") pod "491cc533-cb26-42c1-aaaf-2a211675e0de" (UID: "491cc533-cb26-42c1-aaaf-2a211675e0de"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:26:11.637678 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.637637 2569 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/491cc533-cb26-42c1-aaaf-2a211675e0de-config-out\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:11.637678 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.637669 2569 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/491cc533-cb26-42c1-aaaf-2a211675e0de-tls-assets\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:11.637678 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.637681 2569 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-config-volume\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:11.637975 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.637695 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:11.637975 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.637711 2569 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491cc533-cb26-42c1-aaaf-2a211675e0de-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:11.637975 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.637725 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:11.637975 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.637738 2569 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/491cc533-cb26-42c1-aaaf-2a211675e0de-metrics-client-ca\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:11.637975 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.637751 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8mp57\" (UniqueName: \"kubernetes.io/projected/491cc533-cb26-42c1-aaaf-2a211675e0de-kube-api-access-8mp57\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:11.637975 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.637763 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:11.637975 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.637777 2569 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-web-config\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:11.637975 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.637789 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-secret-alertmanager-main-tls\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:11.637975 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:11.637802 2569 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/491cc533-cb26-42c1-aaaf-2a211675e0de-cluster-tls-config\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:12.333950 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.333916 2569 generic.go:358] "Generic (PLEG): container finished" podID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerID="bbe9a56f4cdce8305ef23e22e8d9ba0d08e4356652074d5b7726438407909c33" exitCode=0 Apr 24 14:26:12.334339 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.333974 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"491cc533-cb26-42c1-aaaf-2a211675e0de","Type":"ContainerDied","Data":"bbe9a56f4cdce8305ef23e22e8d9ba0d08e4356652074d5b7726438407909c33"} Apr 24 14:26:12.334339 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.334009 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"491cc533-cb26-42c1-aaaf-2a211675e0de","Type":"ContainerDied","Data":"073380d038718483246203fed0b7ec691b6ed1e122a25a6bf5e7b33de4343c22"} Apr 24 14:26:12.334339 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.334028 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.334339 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.334030 2569 scope.go:117] "RemoveContainer" containerID="a69310c8821faf4db6c7807d9e2dfcaa71aa29b69a4cd6cc137858d17890b7cf" Apr 24 14:26:12.343968 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.343949 2569 scope.go:117] "RemoveContainer" containerID="bbe9a56f4cdce8305ef23e22e8d9ba0d08e4356652074d5b7726438407909c33" Apr 24 14:26:12.350401 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.350384 2569 scope.go:117] "RemoveContainer" containerID="23c23fe6692cb90e4210e73f641dc1e7631d0cfc28cfbd1c8b8909e09017bffb" Apr 24 14:26:12.354525 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.353986 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 14:26:12.357777 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.357755 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 14:26:12.358758 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.358743 2569 scope.go:117] "RemoveContainer" containerID="6cc20606a4cb7189d0ca7233ac05c795badbe675a1dcc9e3a45796eb94b8952d" Apr 24 14:26:12.365332 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.365314 2569 scope.go:117] "RemoveContainer" containerID="c7a964fde45a4895e5a34345a546137dd8086f9194dd4ea418414bf0b1467cc0" Apr 24 14:26:12.371675 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.371655 2569 scope.go:117] "RemoveContainer" containerID="c668e2c9ac7c932d990d3860ff99dad8b851d33eac39f488e4e42233c783ef92" Apr 24 14:26:12.378608 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.378590 2569 scope.go:117] "RemoveContainer" containerID="49f9471e6ef925d3fb657dc06e62d2d6e136d99e805473912cc84a23a309c8b0" Apr 24 14:26:12.384676 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.384655 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 14:26:12.385084 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385067 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="config-reloader" Apr 24 14:26:12.385084 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385085 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="config-reloader" Apr 24 14:26:12.385203 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385098 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="prom-label-proxy" Apr 24 14:26:12.385203 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385104 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="prom-label-proxy" Apr 24 14:26:12.385203 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385113 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="alertmanager" Apr 24 14:26:12.385203 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385118 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="alertmanager" Apr 24 14:26:12.385203 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385125 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="init-config-reloader" Apr 24 14:26:12.385203 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385130 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="init-config-reloader" Apr 24 14:26:12.385203 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385135 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="kube-rbac-proxy-web" Apr 24 14:26:12.385203 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385140 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="kube-rbac-proxy-web" Apr 24 14:26:12.385203 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385147 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="kube-rbac-proxy-metric" Apr 24 14:26:12.385203 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385152 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="kube-rbac-proxy-metric" Apr 24 14:26:12.385203 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385160 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="kube-rbac-proxy" Apr 24 14:26:12.385203 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385164 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="kube-rbac-proxy" Apr 24 14:26:12.385203 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385208 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="kube-rbac-proxy-web" Apr 24 14:26:12.385774 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385216 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="kube-rbac-proxy-metric" Apr 24 14:26:12.385774 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385222 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="config-reloader" Apr 24 14:26:12.385774 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385229 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="prom-label-proxy" Apr 24 14:26:12.385774 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385235 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="alertmanager" Apr 24 14:26:12.385774 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385241 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" containerName="kube-rbac-proxy" Apr 24 14:26:12.386318 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.385823 2569 scope.go:117] "RemoveContainer" containerID="a69310c8821faf4db6c7807d9e2dfcaa71aa29b69a4cd6cc137858d17890b7cf" Apr 24 14:26:12.386318 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:26:12.386189 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a69310c8821faf4db6c7807d9e2dfcaa71aa29b69a4cd6cc137858d17890b7cf\": container with ID starting with 
Apr 24 14:26:12.386318 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:26:12.386189 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a69310c8821faf4db6c7807d9e2dfcaa71aa29b69a4cd6cc137858d17890b7cf\": container with ID starting with a69310c8821faf4db6c7807d9e2dfcaa71aa29b69a4cd6cc137858d17890b7cf not found: ID does not exist" containerID="a69310c8821faf4db6c7807d9e2dfcaa71aa29b69a4cd6cc137858d17890b7cf"
Apr 24 14:26:12.386318 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.386223 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69310c8821faf4db6c7807d9e2dfcaa71aa29b69a4cd6cc137858d17890b7cf"} err="failed to get container status \"a69310c8821faf4db6c7807d9e2dfcaa71aa29b69a4cd6cc137858d17890b7cf\": rpc error: code = NotFound desc = could not find container \"a69310c8821faf4db6c7807d9e2dfcaa71aa29b69a4cd6cc137858d17890b7cf\": container with ID starting with a69310c8821faf4db6c7807d9e2dfcaa71aa29b69a4cd6cc137858d17890b7cf not found: ID does not exist"
Apr 24 14:26:12.386318 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.386270 2569 scope.go:117] "RemoveContainer" containerID="bbe9a56f4cdce8305ef23e22e8d9ba0d08e4356652074d5b7726438407909c33"
Apr 24 14:26:12.386557 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:26:12.386535 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbe9a56f4cdce8305ef23e22e8d9ba0d08e4356652074d5b7726438407909c33\": container with ID starting with bbe9a56f4cdce8305ef23e22e8d9ba0d08e4356652074d5b7726438407909c33 not found: ID does not exist" containerID="bbe9a56f4cdce8305ef23e22e8d9ba0d08e4356652074d5b7726438407909c33"
Apr 24 14:26:12.386590 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.386563 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe9a56f4cdce8305ef23e22e8d9ba0d08e4356652074d5b7726438407909c33"} err="failed to get container status \"bbe9a56f4cdce8305ef23e22e8d9ba0d08e4356652074d5b7726438407909c33\": rpc error: code = NotFound desc = could not find container \"bbe9a56f4cdce8305ef23e22e8d9ba0d08e4356652074d5b7726438407909c33\": container with ID starting with bbe9a56f4cdce8305ef23e22e8d9ba0d08e4356652074d5b7726438407909c33 not found: ID does not exist"
Apr 24 14:26:12.386590 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.386580 2569 scope.go:117] "RemoveContainer" containerID="23c23fe6692cb90e4210e73f641dc1e7631d0cfc28cfbd1c8b8909e09017bffb"
Apr 24 14:26:12.386829 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:26:12.386812 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23c23fe6692cb90e4210e73f641dc1e7631d0cfc28cfbd1c8b8909e09017bffb\": container with ID starting with 23c23fe6692cb90e4210e73f641dc1e7631d0cfc28cfbd1c8b8909e09017bffb not found: ID does not exist" containerID="23c23fe6692cb90e4210e73f641dc1e7631d0cfc28cfbd1c8b8909e09017bffb"
Apr 24 14:26:12.386866 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.386835 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23c23fe6692cb90e4210e73f641dc1e7631d0cfc28cfbd1c8b8909e09017bffb"} err="failed to get container status \"23c23fe6692cb90e4210e73f641dc1e7631d0cfc28cfbd1c8b8909e09017bffb\": rpc error: code = NotFound desc = could not find container \"23c23fe6692cb90e4210e73f641dc1e7631d0cfc28cfbd1c8b8909e09017bffb\": container with ID starting with 23c23fe6692cb90e4210e73f641dc1e7631d0cfc28cfbd1c8b8909e09017bffb not found: ID does not exist"
containerID="6cc20606a4cb7189d0ca7233ac05c795badbe675a1dcc9e3a45796eb94b8952d" Apr 24 14:26:12.387091 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:26:12.387075 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cc20606a4cb7189d0ca7233ac05c795badbe675a1dcc9e3a45796eb94b8952d\": container with ID starting with 6cc20606a4cb7189d0ca7233ac05c795badbe675a1dcc9e3a45796eb94b8952d not found: ID does not exist" containerID="6cc20606a4cb7189d0ca7233ac05c795badbe675a1dcc9e3a45796eb94b8952d" Apr 24 14:26:12.387134 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.387097 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cc20606a4cb7189d0ca7233ac05c795badbe675a1dcc9e3a45796eb94b8952d"} err="failed to get container status \"6cc20606a4cb7189d0ca7233ac05c795badbe675a1dcc9e3a45796eb94b8952d\": rpc error: code = NotFound desc = could not find container \"6cc20606a4cb7189d0ca7233ac05c795badbe675a1dcc9e3a45796eb94b8952d\": container with ID starting with 6cc20606a4cb7189d0ca7233ac05c795badbe675a1dcc9e3a45796eb94b8952d not found: ID does not exist" Apr 24 14:26:12.387134 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.387119 2569 scope.go:117] "RemoveContainer" containerID="c7a964fde45a4895e5a34345a546137dd8086f9194dd4ea418414bf0b1467cc0" Apr 24 14:26:12.387355 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:26:12.387335 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7a964fde45a4895e5a34345a546137dd8086f9194dd4ea418414bf0b1467cc0\": container with ID starting with c7a964fde45a4895e5a34345a546137dd8086f9194dd4ea418414bf0b1467cc0 not found: ID does not exist" containerID="c7a964fde45a4895e5a34345a546137dd8086f9194dd4ea418414bf0b1467cc0" Apr 24 14:26:12.387421 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.387364 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7a964fde45a4895e5a34345a546137dd8086f9194dd4ea418414bf0b1467cc0"} err="failed to get container status \"c7a964fde45a4895e5a34345a546137dd8086f9194dd4ea418414bf0b1467cc0\": rpc error: code = NotFound desc = could not find container \"c7a964fde45a4895e5a34345a546137dd8086f9194dd4ea418414bf0b1467cc0\": container with ID starting with c7a964fde45a4895e5a34345a546137dd8086f9194dd4ea418414bf0b1467cc0 not found: ID does not exist" Apr 24 14:26:12.387421 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.387385 2569 scope.go:117] "RemoveContainer" containerID="c668e2c9ac7c932d990d3860ff99dad8b851d33eac39f488e4e42233c783ef92" Apr 24 14:26:12.387625 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:26:12.387609 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c668e2c9ac7c932d990d3860ff99dad8b851d33eac39f488e4e42233c783ef92\": container with ID starting with c668e2c9ac7c932d990d3860ff99dad8b851d33eac39f488e4e42233c783ef92 not found: ID does not exist" containerID="c668e2c9ac7c932d990d3860ff99dad8b851d33eac39f488e4e42233c783ef92" Apr 24 14:26:12.387670 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.387632 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c668e2c9ac7c932d990d3860ff99dad8b851d33eac39f488e4e42233c783ef92"} err="failed to get container status \"c668e2c9ac7c932d990d3860ff99dad8b851d33eac39f488e4e42233c783ef92\": rpc error: code = NotFound desc = could not 
find container \"c668e2c9ac7c932d990d3860ff99dad8b851d33eac39f488e4e42233c783ef92\": container with ID starting with c668e2c9ac7c932d990d3860ff99dad8b851d33eac39f488e4e42233c783ef92 not found: ID does not exist" Apr 24 14:26:12.387670 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.387646 2569 scope.go:117] "RemoveContainer" containerID="49f9471e6ef925d3fb657dc06e62d2d6e136d99e805473912cc84a23a309c8b0" Apr 24 14:26:12.387880 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:26:12.387863 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49f9471e6ef925d3fb657dc06e62d2d6e136d99e805473912cc84a23a309c8b0\": container with ID starting with 49f9471e6ef925d3fb657dc06e62d2d6e136d99e805473912cc84a23a309c8b0 not found: ID does not exist" containerID="49f9471e6ef925d3fb657dc06e62d2d6e136d99e805473912cc84a23a309c8b0" Apr 24 14:26:12.387947 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.387884 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49f9471e6ef925d3fb657dc06e62d2d6e136d99e805473912cc84a23a309c8b0"} err="failed to get container status \"49f9471e6ef925d3fb657dc06e62d2d6e136d99e805473912cc84a23a309c8b0\": rpc error: code = NotFound desc = could not find container \"49f9471e6ef925d3fb657dc06e62d2d6e136d99e805473912cc84a23a309c8b0\": container with ID starting with 49f9471e6ef925d3fb657dc06e62d2d6e136d99e805473912cc84a23a309c8b0 not found: ID does not exist" Apr 24 14:26:12.390071 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.390057 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.391858 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.391824 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 14:26:12.391858 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.391852 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 14:26:12.392041 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.391840 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-ct8ch\"" Apr 24 14:26:12.392237 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.392188 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 14:26:12.392326 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.392239 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 14:26:12.392326 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.392292 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 14:26:12.392326 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.392238 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 14:26:12.392490 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.392379 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 14:26:12.392718 ip-10-0-129-231 kubenswrapper[2569]: I0424 
Apr 24 14:26:12.392718 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.392702 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 24 14:26:12.399618 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.399014 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 24 14:26:12.400826 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.400793 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 14:26:12.444289 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.444247 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.444289 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.444293 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-web-config\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.444493 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.444316 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt4d7\" (UniqueName: \"kubernetes.io/projected/c8f97875-527a-4f66-8d2b-339e82496bab-kube-api-access-gt4d7\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.444493 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.444396 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c8f97875-527a-4f66-8d2b-339e82496bab-config-out\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.444493 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.444428 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c8f97875-527a-4f66-8d2b-339e82496bab-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.444493 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.444446 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
\"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.444641 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.444512 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.444641 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.444536 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.444641 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.444554 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8f97875-527a-4f66-8d2b-339e82496bab-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.444641 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.444588 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c8f97875-527a-4f66-8d2b-339e82496bab-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.444641 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.444604 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-config-volume\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.444783 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.444645 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c8f97875-527a-4f66-8d2b-339e82496bab-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.545809 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.545715 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c8f97875-527a-4f66-8d2b-339e82496bab-config-out\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.545809 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.545768 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c8f97875-527a-4f66-8d2b-339e82496bab-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.545809 ip-10-0-129-231 kubenswrapper[2569]: 
Apr 24 14:26:12.545809 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.545798 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.546115 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.545825 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.546115 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.545853 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.546115 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.545879 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.546115 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.545933 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8f97875-527a-4f66-8d2b-339e82496bab-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.546115 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.545980 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c8f97875-527a-4f66-8d2b-339e82496bab-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.546115 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.546007 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-config-volume\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.546115 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.546040 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c8f97875-527a-4f66-8d2b-339e82496bab-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
\"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.546115 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.546104 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-web-config\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.546532 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.546136 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gt4d7\" (UniqueName: \"kubernetes.io/projected/c8f97875-527a-4f66-8d2b-339e82496bab-kube-api-access-gt4d7\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.546532 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.546295 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c8f97875-527a-4f66-8d2b-339e82496bab-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.549097 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.549000 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.549097 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.549000 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.549353 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.549157 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c8f97875-527a-4f66-8d2b-339e82496bab-config-out\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.549353 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.549163 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.549353 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.549237 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 
Apr 24 14:26:12.549353 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.549297 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.549568 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.549370 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-config-volume\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.549568 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.549542 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c8f97875-527a-4f66-8d2b-339e82496bab-web-config\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.549762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.549741 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c8f97875-527a-4f66-8d2b-339e82496bab-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.549826 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.549788 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8f97875-527a-4f66-8d2b-339e82496bab-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.551211 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.551190 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c8f97875-527a-4f66-8d2b-339e82496bab-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:12.554336 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.554313 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt4d7\" (UniqueName: \"kubernetes.io/projected/c8f97875-527a-4f66-8d2b-339e82496bab-kube-api-access-gt4d7\") pod \"alertmanager-main-0\" (UID: \"c8f97875-527a-4f66-8d2b-339e82496bab\") " pod="openshift-monitoring/alertmanager-main-0"
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:12.828428 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:12.828319 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 14:26:12.832193 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:26:12.832163 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8f97875_527a_4f66_8d2b_339e82496bab.slice/crio-4336bcbd508a04f8cd2f31d61504dae409ed05a4b86840a5e31e78d213e5e11e WatchSource:0}: Error finding container 4336bcbd508a04f8cd2f31d61504dae409ed05a4b86840a5e31e78d213e5e11e: Status 404 returned error can't find the container with id 4336bcbd508a04f8cd2f31d61504dae409ed05a4b86840a5e31e78d213e5e11e Apr 24 14:26:13.338372 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:13.338340 2569 generic.go:358] "Generic (PLEG): container finished" podID="c8f97875-527a-4f66-8d2b-339e82496bab" containerID="4556a9b960998a57b156bb33ca8d12c875fa99e41aebc2be91f4050fe0c09f04" exitCode=0 Apr 24 14:26:13.338744 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:13.338406 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c8f97875-527a-4f66-8d2b-339e82496bab","Type":"ContainerDied","Data":"4556a9b960998a57b156bb33ca8d12c875fa99e41aebc2be91f4050fe0c09f04"} Apr 24 14:26:13.338744 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:13.338427 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c8f97875-527a-4f66-8d2b-339e82496bab","Type":"ContainerStarted","Data":"4336bcbd508a04f8cd2f31d61504dae409ed05a4b86840a5e31e78d213e5e11e"} Apr 24 14:26:13.828300 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:13.828265 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="491cc533-cb26-42c1-aaaf-2a211675e0de" path="/var/lib/kubelet/pods/491cc533-cb26-42c1-aaaf-2a211675e0de/volumes" Apr 24 14:26:14.164799 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.164763 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-64c55f559-vskpb"] Apr 24 14:26:14.168252 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.168231 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.170231 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.170202 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 24 14:26:14.170349 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.170268 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 24 14:26:14.170349 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.170202 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 24 14:26:14.170471 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.170350 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-mb5cm\"" Apr 24 14:26:14.170704 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.170687 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 24 14:26:14.170704 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.170692 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 24 14:26:14.177829 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.176683 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 24 14:26:14.184458 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.184425 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-64c55f559-vskpb"] Apr 24 14:26:14.260024 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.259978 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.260024 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.260027 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-metrics-client-ca\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.260234 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.260107 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-federate-client-tls\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.260234 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.260140 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: 
\"kubernetes.io/secret/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-secret-telemeter-client\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.260234 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.260157 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-telemeter-client-tls\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.260234 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.260173 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmrml\" (UniqueName: \"kubernetes.io/projected/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-kube-api-access-qmrml\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.260234 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.260210 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-telemeter-trusted-ca-bundle\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.260384 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.260265 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-serving-certs-ca-bundle\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.344232 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.344193 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c8f97875-527a-4f66-8d2b-339e82496bab","Type":"ContainerStarted","Data":"89baf59387d687c9e2870bfb2c43f767d25afb921e4da62c110fdaf990bd92ac"} Apr 24 14:26:14.344232 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.344231 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c8f97875-527a-4f66-8d2b-339e82496bab","Type":"ContainerStarted","Data":"7812e4793314e0d0c93110b76ba603bc2428ba02b5ca9e5e784699f0f6a17d61"} Apr 24 14:26:14.344232 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.344241 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c8f97875-527a-4f66-8d2b-339e82496bab","Type":"ContainerStarted","Data":"08f9e321280fa78a459bf49a058729aacd079500ca1daf6c73cee6ecec7f85ff"} Apr 24 14:26:14.344824 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.344250 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c8f97875-527a-4f66-8d2b-339e82496bab","Type":"ContainerStarted","Data":"b0f6b76de34582d0bf1cf3bfe065b0f86f29a90912d05416bf52c9ee28c20765"} Apr 24 14:26:14.344824 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.344258 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c8f97875-527a-4f66-8d2b-339e82496bab","Type":"ContainerStarted","Data":"7194dbe4493135d5ebfcf0332dd30e1f362764f75ff31dcd56affc6d84613f5c"} Apr 24 14:26:14.344824 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.344266 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c8f97875-527a-4f66-8d2b-339e82496bab","Type":"ContainerStarted","Data":"fae6bd54a087311156b801285cd2d40eedfeca17b14df9a25cf07ff944d05b98"} Apr 24 14:26:14.361594 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.361563 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-federate-client-tls\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.361594 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.361597 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-secret-telemeter-client\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.361779 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.361615 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-telemeter-client-tls\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.361779 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.361632 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmrml\" (UniqueName: \"kubernetes.io/projected/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-kube-api-access-qmrml\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.361779 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.361661 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-telemeter-trusted-ca-bundle\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.361779 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.361701 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-serving-certs-ca-bundle\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.361779 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.361746 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-secret-telemeter-client-kube-rbac-proxy-config\") pod 
\"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.361779 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.361771 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-metrics-client-ca\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.362727 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.362699 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-metrics-client-ca\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.362854 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.362730 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-serving-certs-ca-bundle\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.362854 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.362781 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-telemeter-trusted-ca-bundle\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.364411 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.364384 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.364503 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.364435 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-federate-client-tls\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.364555 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.364535 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-secret-telemeter-client\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.364591 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.364537 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-telemeter-client-tls\") pod \"telemeter-client-64c55f559-vskpb\" (UID: 
\"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.368615 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.368571 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.368557036 podStartE2EDuration="2.368557036s" podCreationTimestamp="2026-04-24 14:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:26:14.367356531 +0000 UTC m=+127.161413459" watchObservedRunningTime="2026-04-24 14:26:14.368557036 +0000 UTC m=+127.162613989" Apr 24 14:26:14.371403 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.371386 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmrml\" (UniqueName: \"kubernetes.io/projected/cc9e438c-2d31-45f3-8009-2f9d7cbf559b-kube-api-access-qmrml\") pod \"telemeter-client-64c55f559-vskpb\" (UID: \"cc9e438c-2d31-45f3-8009-2f9d7cbf559b\") " pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.479328 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.479248 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" Apr 24 14:26:14.493016 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.492979 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:26:14.493534 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.493481 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="prometheus" containerID="cri-o://86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64" gracePeriod=600 Apr 24 14:26:14.493649 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.493557 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="kube-rbac-proxy-web" containerID="cri-o://96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06" gracePeriod=600 Apr 24 14:26:14.493649 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.493547 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="thanos-sidecar" containerID="cri-o://049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c" gracePeriod=600 Apr 24 14:26:14.493649 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.493635 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="config-reloader" containerID="cri-o://e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647" gracePeriod=600 Apr 24 14:26:14.493797 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.493654 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="kube-rbac-proxy-thanos" containerID="cri-o://9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f" gracePeriod=600 Apr 24 14:26:14.494048 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.493943 2569 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="kube-rbac-proxy" containerID="cri-o://dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08" gracePeriod=600 Apr 24 14:26:14.610088 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.610053 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-64c55f559-vskpb"] Apr 24 14:26:14.613380 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:26:14.613352 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc9e438c_2d31_45f3_8009_2f9d7cbf559b.slice/crio-fbbcab95a27107a7346474aff02048d3380fd533263e953fb09ddcdce16c0db0 WatchSource:0}: Error finding container fbbcab95a27107a7346474aff02048d3380fd533263e953fb09ddcdce16c0db0: Status 404 returned error can't find the container with id fbbcab95a27107a7346474aff02048d3380fd533263e953fb09ddcdce16c0db0 Apr 24 14:26:14.747504 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.747480 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:14.866710 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.866670 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/458b1119-d234-4d63-972b-ba4c4df4cf92-tls-assets\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.866710 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.866714 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-kubelet-serving-ca-bundle\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.866991 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.866746 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-serving-certs-ca-bundle\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.866991 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.866768 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-k8s-db\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.866991 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.866916 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-config\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.866991 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.866960 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-tls\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: 
\"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.867192 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.866993 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-trusted-ca-bundle\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.867192 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.867027 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-grpc-tls\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.867192 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.867060 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.867192 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.867088 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.867192 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.867132 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-metrics-client-certs\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.867192 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.867159 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-web-config\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.867192 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.867181 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-k8s-rulefiles-0\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.867523 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.867204 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-metrics-client-ca\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.867523 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.867259 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/458b1119-d234-4d63-972b-ba4c4df4cf92-config-out\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: 
\"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.867523 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.867301 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpdpf\" (UniqueName: \"kubernetes.io/projected/458b1119-d234-4d63-972b-ba4c4df4cf92-kube-api-access-bpdpf\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.869688 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.867199 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:26:14.869688 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.867207 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:26:14.869871 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.868046 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:26:14.869871 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.869694 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-thanos-prometheus-http-client-file\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.869871 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.869744 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-kube-rbac-proxy\") pod \"458b1119-d234-4d63-972b-ba4c4df4cf92\" (UID: \"458b1119-d234-4d63-972b-ba4c4df4cf92\") " Apr 24 14:26:14.870041 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.870031 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:14.870093 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.870051 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:14.870093 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.869432 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:26:14.870093 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.870068 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-k8s-db\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:14.870093 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.869431 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:26:14.870093 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.869605 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:26:14.870093 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.869619 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/458b1119-d234-4d63-972b-ba4c4df4cf92-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:26:14.870388 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.869623 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-config" (OuterVolumeSpecName: "config") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:26:14.870388 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.869967 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:26:14.870388 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.870008 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:26:14.870388 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.870046 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:26:14.870581 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.870473 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/458b1119-d234-4d63-972b-ba4c4df4cf92-kube-api-access-bpdpf" (OuterVolumeSpecName: "kube-api-access-bpdpf") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "kube-api-access-bpdpf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:26:14.870667 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.870644 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:26:14.871731 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.871704 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/458b1119-d234-4d63-972b-ba4c4df4cf92-config-out" (OuterVolumeSpecName: "config-out") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:26:14.871828 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.871753 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:26:14.871977 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.871958 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:26:14.872285 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.872269 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:26:14.880772 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.880751 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-web-config" (OuterVolumeSpecName: "web-config") pod "458b1119-d234-4d63-972b-ba4c4df4cf92" (UID: "458b1119-d234-4d63-972b-ba4c4df4cf92"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:26:14.970375 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.970338 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bpdpf\" (UniqueName: \"kubernetes.io/projected/458b1119-d234-4d63-972b-ba4c4df4cf92-kube-api-access-bpdpf\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:14.970375 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.970368 2569 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:14.970375 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.970378 2569 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-kube-rbac-proxy\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:14.970613 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.970389 2569 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/458b1119-d234-4d63-972b-ba4c4df4cf92-tls-assets\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:14.970613 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.970399 2569 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-config\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:14.970613 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.970408 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:14.970613 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.970417 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:14.970613 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.970426 2569 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-grpc-tls\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:14.970613 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.970435 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:14.970613 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.970444 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:14.970613 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.970454 2569 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-secret-metrics-client-certs\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:14.970613 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.970463 2569 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/458b1119-d234-4d63-972b-ba4c4df4cf92-web-config\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:14.970613 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.970472 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:14.970613 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.970482 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/458b1119-d234-4d63-972b-ba4c4df4cf92-configmap-metrics-client-ca\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:14.970613 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:14.970490 2569 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/458b1119-d234-4d63-972b-ba4c4df4cf92-config-out\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:26:15.351070 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.351030 2569 generic.go:358] "Generic (PLEG): container finished" podID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerID="9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f" exitCode=0 Apr 24 14:26:15.351070 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.351062 2569 generic.go:358] "Generic (PLEG): container finished" podID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerID="dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08" exitCode=0 Apr 24 14:26:15.351070 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.351069 2569 generic.go:358] "Generic (PLEG): container finished" podID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerID="96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06" exitCode=0 Apr 24 14:26:15.351070 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.351074 2569 generic.go:358] "Generic (PLEG): container finished" podID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerID="049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c" exitCode=0 Apr 24 14:26:15.351070 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.351079 2569 generic.go:358] "Generic (PLEG): container finished" podID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerID="e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647" exitCode=0 Apr 24 14:26:15.351070 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.351085 2569 generic.go:358] "Generic (PLEG): container finished" podID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerID="86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64" exitCode=0 Apr 24 14:26:15.351738 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.351109 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"458b1119-d234-4d63-972b-ba4c4df4cf92","Type":"ContainerDied","Data":"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f"} Apr 24 14:26:15.351738 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.351158 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"458b1119-d234-4d63-972b-ba4c4df4cf92","Type":"ContainerDied","Data":"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08"} Apr 24 14:26:15.351738 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.351171 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.351738 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.351190 2569 scope.go:117] "RemoveContainer" containerID="9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f" Apr 24 14:26:15.351738 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.351173 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"458b1119-d234-4d63-972b-ba4c4df4cf92","Type":"ContainerDied","Data":"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06"} Apr 24 14:26:15.351738 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.351286 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"458b1119-d234-4d63-972b-ba4c4df4cf92","Type":"ContainerDied","Data":"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c"} Apr 24 14:26:15.351738 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.351319 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"458b1119-d234-4d63-972b-ba4c4df4cf92","Type":"ContainerDied","Data":"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647"} Apr 24 14:26:15.351738 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.351334 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"458b1119-d234-4d63-972b-ba4c4df4cf92","Type":"ContainerDied","Data":"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64"} Apr 24 14:26:15.351738 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.351349 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"458b1119-d234-4d63-972b-ba4c4df4cf92","Type":"ContainerDied","Data":"fdf700e930fcb436939d7f566b1893a3d847b81cfc0ffd9ba1e22a34669d661d"} Apr 24 14:26:15.352612 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.352584 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" event={"ID":"cc9e438c-2d31-45f3-8009-2f9d7cbf559b","Type":"ContainerStarted","Data":"fbbcab95a27107a7346474aff02048d3380fd533263e953fb09ddcdce16c0db0"} Apr 24 14:26:15.360042 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.360025 2569 scope.go:117] "RemoveContainer" containerID="dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08" Apr 24 14:26:15.366606 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.366590 2569 scope.go:117] "RemoveContainer" containerID="96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06" Apr 24 14:26:15.373554 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.373532 2569 scope.go:117] "RemoveContainer" containerID="049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c" Apr 24 14:26:15.375294 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.375268 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:26:15.379597 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.379569 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:26:15.383648 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.383626 2569 scope.go:117] "RemoveContainer" containerID="e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647" Apr 24 14:26:15.391005 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.390982 2569 scope.go:117] "RemoveContainer" containerID="86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64" Apr 24 14:26:15.399375 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.399350 2569 scope.go:117] "RemoveContainer" containerID="6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d" Apr 24 14:26:15.403293 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403270 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:26:15.403636 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403620 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="kube-rbac-proxy-web" Apr 24 14:26:15.403636 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403639 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="kube-rbac-proxy-web" Apr 24 14:26:15.403771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403655 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="config-reloader" Apr 24 14:26:15.403771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403663 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="config-reloader" Apr 24 14:26:15.403771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403680 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="prometheus" Apr 24 14:26:15.403771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403689 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="prometheus" Apr 24 14:26:15.403771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403702 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="kube-rbac-proxy" Apr 24 14:26:15.403771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403711 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="kube-rbac-proxy" Apr 24 14:26:15.403771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403725 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="kube-rbac-proxy-thanos" Apr 24 14:26:15.403771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403733 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="kube-rbac-proxy-thanos" Apr 24 14:26:15.403771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403748 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="init-config-reloader" Apr 24 14:26:15.403771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403757 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="init-config-reloader" Apr 24 
14:26:15.403771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403767 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="thanos-sidecar" Apr 24 14:26:15.403771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403775 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="thanos-sidecar" Apr 24 14:26:15.404328 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403841 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="thanos-sidecar" Apr 24 14:26:15.404328 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403853 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="kube-rbac-proxy" Apr 24 14:26:15.404328 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403866 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="prometheus" Apr 24 14:26:15.404328 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403875 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="config-reloader" Apr 24 14:26:15.404328 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403885 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="kube-rbac-proxy-web" Apr 24 14:26:15.404328 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.403910 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" containerName="kube-rbac-proxy-thanos" Apr 24 14:26:15.408660 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.408632 2569 scope.go:117] "RemoveContainer" containerID="9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f" Apr 24 14:26:15.409022 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:26:15.409001 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f\": container with ID starting with 9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f not found: ID does not exist" containerID="9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f" Apr 24 14:26:15.409108 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.409030 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f"} err="failed to get container status \"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f\": rpc error: code = NotFound desc = could not find container \"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f\": container with ID starting with 9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f not found: ID does not exist" Apr 24 14:26:15.409108 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.409053 2569 scope.go:117] "RemoveContainer" containerID="dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08" Apr 24 14:26:15.409192 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.409139 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.409500 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:26:15.409451 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08\": container with ID starting with dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08 not found: ID does not exist" containerID="dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08" Apr 24 14:26:15.410979 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.409510 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08"} err="failed to get container status \"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08\": rpc error: code = NotFound desc = could not find container \"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08\": container with ID starting with dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08 not found: ID does not exist" Apr 24 14:26:15.410979 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.410745 2569 scope.go:117] "RemoveContainer" containerID="96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06" Apr 24 14:26:15.411466 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:26:15.411446 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06\": container with ID starting with 96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06 not found: ID does not exist" containerID="96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06" Apr 24 14:26:15.411553 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.411477 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06"} err="failed to get container status \"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06\": rpc error: code = NotFound desc = could not find container \"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06\": container with ID starting with 96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06 not found: ID does not exist" Apr 24 14:26:15.411553 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.411528 2569 scope.go:117] "RemoveContainer" containerID="049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c" Apr 24 14:26:15.412402 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:26:15.412027 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c\": container with ID starting with 049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c not found: ID does not exist" containerID="049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c" Apr 24 14:26:15.412402 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.412059 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c"} err="failed to get container status \"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c\": rpc error: code = NotFound desc = could not find 
container \"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c\": container with ID starting with 049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c not found: ID does not exist" Apr 24 14:26:15.412402 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.412081 2569 scope.go:117] "RemoveContainer" containerID="e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647" Apr 24 14:26:15.412608 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:26:15.412569 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647\": container with ID starting with e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647 not found: ID does not exist" containerID="e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647" Apr 24 14:26:15.412663 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.412602 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647"} err="failed to get container status \"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647\": rpc error: code = NotFound desc = could not find container \"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647\": container with ID starting with e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647 not found: ID does not exist" Apr 24 14:26:15.412663 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.412626 2569 scope.go:117] "RemoveContainer" containerID="86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64" Apr 24 14:26:15.413276 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.412976 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 14:26:15.413276 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.413028 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 14:26:15.414292 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.413488 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 14:26:15.414292 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:26:15.413507 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64\": container with ID starting with 86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64 not found: ID does not exist" containerID="86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64" Apr 24 14:26:15.414292 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.413538 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64"} err="failed to get container status \"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64\": rpc error: code = NotFound desc = could not find container \"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64\": container with ID starting with 86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64 not found: ID does not exist" Apr 24 14:26:15.414292 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.413559 2569 scope.go:117] 
"RemoveContainer" containerID="6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d" Apr 24 14:26:15.414292 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.413768 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 14:26:15.414292 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.413790 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 14:26:15.414292 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.413842 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 14:26:15.414292 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:26:15.414000 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d\": container with ID starting with 6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d not found: ID does not exist" containerID="6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d" Apr 24 14:26:15.414292 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.414026 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d"} err="failed to get container status \"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d\": rpc error: code = NotFound desc = could not find container \"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d\": container with ID starting with 6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d not found: ID does not exist" Apr 24 14:26:15.414292 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.414054 2569 scope.go:117] "RemoveContainer" containerID="9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f" Apr 24 14:26:15.414292 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.414008 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-83smvvk6e5q42\"" Apr 24 14:26:15.414886 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.414396 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-cb6hn\"" Apr 24 14:26:15.414886 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.414686 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 14:26:15.414886 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.414723 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 14:26:15.415048 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.414959 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f"} err="failed to get container status \"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f\": rpc error: code = NotFound desc = could not find container \"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f\": container with ID starting with 9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f not found: ID does not exist" Apr 24 
14:26:15.415048 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.414984 2569 scope.go:117] "RemoveContainer" containerID="dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08" Apr 24 14:26:15.415576 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.415485 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08"} err="failed to get container status \"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08\": rpc error: code = NotFound desc = could not find container \"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08\": container with ID starting with dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08 not found: ID does not exist" Apr 24 14:26:15.415576 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.415514 2569 scope.go:117] "RemoveContainer" containerID="96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06" Apr 24 14:26:15.416559 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.416227 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 14:26:15.417450 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.417031 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06"} err="failed to get container status \"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06\": rpc error: code = NotFound desc = could not find container \"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06\": container with ID starting with 96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06 not found: ID does not exist" Apr 24 14:26:15.417450 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.417063 2569 scope.go:117] "RemoveContainer" containerID="049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c" Apr 24 14:26:15.419494 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.418919 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 14:26:15.419732 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.419711 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 14:26:15.419854 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.419772 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 14:26:15.421322 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.420838 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c"} err="failed to get container status \"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c\": rpc error: code = NotFound desc = could not find container \"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c\": container with ID starting with 049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c not found: ID does not exist" Apr 24 14:26:15.421322 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.420866 2569 scope.go:117] "RemoveContainer" containerID="e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647" Apr 24 14:26:15.421687 ip-10-0-129-231 kubenswrapper[2569]: I0424 
14:26:15.421653 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:26:15.422046 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.421956 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647"} err="failed to get container status \"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647\": rpc error: code = NotFound desc = could not find container \"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647\": container with ID starting with e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647 not found: ID does not exist" Apr 24 14:26:15.422046 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.421984 2569 scope.go:117] "RemoveContainer" containerID="86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64" Apr 24 14:26:15.422277 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.422248 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64"} err="failed to get container status \"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64\": rpc error: code = NotFound desc = could not find container \"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64\": container with ID starting with 86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64 not found: ID does not exist" Apr 24 14:26:15.422277 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.422277 2569 scope.go:117] "RemoveContainer" containerID="6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d" Apr 24 14:26:15.422610 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.422547 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d"} err="failed to get container status \"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d\": rpc error: code = NotFound desc = could not find container \"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d\": container with ID starting with 6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d not found: ID does not exist" Apr 24 14:26:15.422692 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.422614 2569 scope.go:117] "RemoveContainer" containerID="9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f" Apr 24 14:26:15.423435 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.423347 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f"} err="failed to get container status \"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f\": rpc error: code = NotFound desc = could not find container \"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f\": container with ID starting with 9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f not found: ID does not exist" Apr 24 14:26:15.423435 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.423374 2569 scope.go:117] "RemoveContainer" containerID="dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08" Apr 24 14:26:15.423689 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.423666 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 14:26:15.423816 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.423790 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08"} err="failed to get container status \"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08\": rpc error: code = NotFound desc = could not find container \"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08\": container with ID starting with dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08 not found: ID does not exist" Apr 24 14:26:15.423912 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.423818 2569 scope.go:117] "RemoveContainer" containerID="96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06" Apr 24 14:26:15.424098 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.424072 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06"} err="failed to get container status \"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06\": rpc error: code = NotFound desc = could not find container \"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06\": container with ID starting with 96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06 not found: ID does not exist" Apr 24 14:26:15.424188 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.424099 2569 scope.go:117] "RemoveContainer" containerID="049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c" Apr 24 14:26:15.424366 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.424346 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c"} err="failed to get container status \"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c\": rpc error: code = NotFound desc = could not find container \"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c\": container with ID starting with 049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c not found: ID does not exist" Apr 24 14:26:15.424366 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.424366 2569 scope.go:117] "RemoveContainer" containerID="e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647" Apr 24 14:26:15.424605 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.424576 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647"} err="failed to get container status \"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647\": rpc error: code = NotFound desc = could not find container \"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647\": container with ID starting with e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647 not found: ID does not exist" Apr 24 14:26:15.424698 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.424606 2569 scope.go:117] "RemoveContainer" containerID="86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64" Apr 24 14:26:15.424856 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.424831 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64"} err="failed to get container status \"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64\": rpc error: code = NotFound desc = could not find container \"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64\": container with ID starting with 86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64 not found: ID does not exist" Apr 24 14:26:15.424856 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.424855 2569 scope.go:117] "RemoveContainer" containerID="6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d" Apr 24 14:26:15.425145 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.425121 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d"} err="failed to get container status \"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d\": rpc error: code = NotFound desc = could not find container \"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d\": container with ID starting with 6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d not found: ID does not exist" Apr 24 14:26:15.425262 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.425241 2569 scope.go:117] "RemoveContainer" containerID="9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f" Apr 24 14:26:15.425563 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.425538 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f"} err="failed to get container status \"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f\": rpc error: code = NotFound desc = could not find container \"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f\": container with ID starting with 9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f not found: ID does not exist" Apr 24 14:26:15.425647 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.425566 2569 scope.go:117] "RemoveContainer" containerID="dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08" Apr 24 14:26:15.425884 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.425856 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08"} err="failed to get container status \"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08\": rpc error: code = NotFound desc = could not find container \"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08\": container with ID starting with dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08 not found: ID does not exist" Apr 24 14:26:15.425994 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.425885 2569 scope.go:117] "RemoveContainer" containerID="96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06" Apr 24 14:26:15.426188 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.426162 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06"} err="failed to get container status \"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06\": rpc error: code = NotFound desc = could not find container 
\"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06\": container with ID starting with 96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06 not found: ID does not exist" Apr 24 14:26:15.426271 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.426190 2569 scope.go:117] "RemoveContainer" containerID="049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c" Apr 24 14:26:15.426466 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.426427 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c"} err="failed to get container status \"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c\": rpc error: code = NotFound desc = could not find container \"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c\": container with ID starting with 049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c not found: ID does not exist" Apr 24 14:26:15.426466 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.426455 2569 scope.go:117] "RemoveContainer" containerID="e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647" Apr 24 14:26:15.426702 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.426671 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647"} err="failed to get container status \"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647\": rpc error: code = NotFound desc = could not find container \"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647\": container with ID starting with e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647 not found: ID does not exist" Apr 24 14:26:15.426702 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.426700 2569 scope.go:117] "RemoveContainer" containerID="86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64" Apr 24 14:26:15.426943 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.426921 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64"} err="failed to get container status \"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64\": rpc error: code = NotFound desc = could not find container \"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64\": container with ID starting with 86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64 not found: ID does not exist" Apr 24 14:26:15.427032 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.426946 2569 scope.go:117] "RemoveContainer" containerID="6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d" Apr 24 14:26:15.427237 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.427204 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d"} err="failed to get container status \"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d\": rpc error: code = NotFound desc = could not find container \"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d\": container with ID starting with 6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d not found: ID does not exist" Apr 24 14:26:15.427296 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.427239 2569 scope.go:117] 
"RemoveContainer" containerID="9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f" Apr 24 14:26:15.427508 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.427485 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f"} err="failed to get container status \"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f\": rpc error: code = NotFound desc = could not find container \"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f\": container with ID starting with 9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f not found: ID does not exist" Apr 24 14:26:15.427572 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.427510 2569 scope.go:117] "RemoveContainer" containerID="dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08" Apr 24 14:26:15.427777 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.427751 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08"} err="failed to get container status \"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08\": rpc error: code = NotFound desc = could not find container \"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08\": container with ID starting with dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08 not found: ID does not exist" Apr 24 14:26:15.427878 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.427778 2569 scope.go:117] "RemoveContainer" containerID="96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06" Apr 24 14:26:15.428022 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.428001 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06"} err="failed to get container status \"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06\": rpc error: code = NotFound desc = could not find container \"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06\": container with ID starting with 96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06 not found: ID does not exist" Apr 24 14:26:15.428093 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.428022 2569 scope.go:117] "RemoveContainer" containerID="049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c" Apr 24 14:26:15.428267 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.428246 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c"} err="failed to get container status \"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c\": rpc error: code = NotFound desc = could not find container \"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c\": container with ID starting with 049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c not found: ID does not exist" Apr 24 14:26:15.428344 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.428268 2569 scope.go:117] "RemoveContainer" containerID="e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647" Apr 24 14:26:15.428537 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.428514 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647"} err="failed to get container status \"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647\": rpc error: code = NotFound desc = could not find container \"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647\": container with ID starting with e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647 not found: ID does not exist" Apr 24 14:26:15.428537 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.428535 2569 scope.go:117] "RemoveContainer" containerID="86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64" Apr 24 14:26:15.428795 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.428777 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64"} err="failed to get container status \"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64\": rpc error: code = NotFound desc = could not find container \"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64\": container with ID starting with 86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64 not found: ID does not exist" Apr 24 14:26:15.428873 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.428796 2569 scope.go:117] "RemoveContainer" containerID="6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d" Apr 24 14:26:15.429074 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.429056 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d"} err="failed to get container status \"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d\": rpc error: code = NotFound desc = could not find container \"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d\": container with ID starting with 6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d not found: ID does not exist" Apr 24 14:26:15.429137 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.429074 2569 scope.go:117] "RemoveContainer" containerID="9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f" Apr 24 14:26:15.429327 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.429291 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f"} err="failed to get container status \"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f\": rpc error: code = NotFound desc = could not find container \"9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f\": container with ID starting with 9a3198ec242ad368c26c77c13b0c1016f6d75531d0f1ce0ab1b78cf15eb7c09f not found: ID does not exist" Apr 24 14:26:15.429394 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.429328 2569 scope.go:117] "RemoveContainer" containerID="dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08" Apr 24 14:26:15.429535 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.429518 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08"} err="failed to get container status \"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08\": rpc error: code = NotFound desc = could not find container 
\"dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08\": container with ID starting with dbb34f6def7a3afd66a9e2597315fc1350569ba1e9f5568f45e8955a06ad0d08 not found: ID does not exist" Apr 24 14:26:15.429605 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.429536 2569 scope.go:117] "RemoveContainer" containerID="96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06" Apr 24 14:26:15.429804 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.429774 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06"} err="failed to get container status \"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06\": rpc error: code = NotFound desc = could not find container \"96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06\": container with ID starting with 96b926e0feb16bad177532867a108859c626a188c93b793c3c8030ea8a719b06 not found: ID does not exist" Apr 24 14:26:15.429804 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.429799 2569 scope.go:117] "RemoveContainer" containerID="049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c" Apr 24 14:26:15.430024 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.430005 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c"} err="failed to get container status \"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c\": rpc error: code = NotFound desc = could not find container \"049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c\": container with ID starting with 049feeca0ffcdcaaccd16c053f944385a24719a2e0c8341a3e165fe47e7a728c not found: ID does not exist" Apr 24 14:26:15.430024 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.430023 2569 scope.go:117] "RemoveContainer" containerID="e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647" Apr 24 14:26:15.430340 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.430315 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647"} err="failed to get container status \"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647\": rpc error: code = NotFound desc = could not find container \"e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647\": container with ID starting with e61069b50e611967356acf7d01168966d16076edbe579200c6af495ddd7d8647 not found: ID does not exist" Apr 24 14:26:15.430417 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.430341 2569 scope.go:117] "RemoveContainer" containerID="86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64" Apr 24 14:26:15.430557 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.430541 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64"} err="failed to get container status \"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64\": rpc error: code = NotFound desc = could not find container \"86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64\": container with ID starting with 86e58bc6115a55d4672087408ec746f7edfd1833e0663f17c526f96be6236c64 not found: ID does not exist" Apr 24 14:26:15.430621 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.430557 2569 scope.go:117] 
"RemoveContainer" containerID="6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d" Apr 24 14:26:15.430795 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.430775 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d"} err="failed to get container status \"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d\": rpc error: code = NotFound desc = could not find container \"6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d\": container with ID starting with 6acc7374dc061699ab357ebf419d09b4a45ceaf0c918522d1f5f0dd89631430d not found: ID does not exist" Apr 24 14:26:15.474448 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.474407 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.474637 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.474555 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-web-config\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.474637 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.474604 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-config\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.474760 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.474673 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.474760 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.474700 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.474760 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.474731 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.474876 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.474769 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.474876 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.474831 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.474876 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.474865 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.475008 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.474908 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.475103 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.475077 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.475178 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.475146 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.475237 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.475192 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.475401 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.475372 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.475518 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.475434 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.475518 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.475493 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkvsl\" (UniqueName: \"kubernetes.io/projected/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-kube-api-access-pkvsl\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.475631 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.475555 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.475631 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.475617 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-config-out\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.576438 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576345 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.576438 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576398 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.576438 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576428 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkvsl\" (UniqueName: \"kubernetes.io/projected/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-kube-api-access-pkvsl\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.576704 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576455 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.576704 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576480 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-config-out\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
14:26:15.576704 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576516 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.576704 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576539 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-web-config\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.576704 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576563 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-config\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.576704 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576591 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.576704 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576612 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.576704 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576639 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.576704 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576667 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.576704 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576702 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.577239 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576735 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.577239 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576794 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.577239 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576841 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.577239 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576881 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.577239 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.576932 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.577483 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.577345 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.577951 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.577909 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.578200 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.578172 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.578949 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.578927 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.579648 ip-10-0-129-231 
kubenswrapper[2569]: I0424 14:26:15.579623 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-web-config\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.580104 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.579958 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-config-out\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.582771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.580352 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.582771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.580565 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.582771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.580580 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.582771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.581365 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.582771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.581774 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.582771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.582671 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.583161 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.582967 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.583161 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.583138 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.583161 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.583154 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.583910 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.583877 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-config\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.584419 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.584381 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.584637 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.584611 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkvsl\" (UniqueName: \"kubernetes.io/projected/ec9e6a83-e9de-48cd-9b7d-783eafaf3f92-kube-api-access-pkvsl\") pod \"prometheus-k8s-0\" (UID: \"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.726922 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.726868 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:15.830763 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.829488 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="458b1119-d234-4d63-972b-ba4c4df4cf92" path="/var/lib/kubelet/pods/458b1119-d234-4d63-972b-ba4c4df4cf92/volumes" Apr 24 14:26:15.867884 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:15.867852 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:26:15.871704 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:26:15.871675 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec9e6a83_e9de_48cd_9b7d_783eafaf3f92.slice/crio-d43be2e10d112127e7c37143de00f1dfd1b09974a99bed5198cb175bc1cdd485 WatchSource:0}: Error finding container d43be2e10d112127e7c37143de00f1dfd1b09974a99bed5198cb175bc1cdd485: Status 404 returned error can't find the container with id d43be2e10d112127e7c37143de00f1dfd1b09974a99bed5198cb175bc1cdd485 Apr 24 14:26:16.357043 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:16.357000 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92","Type":"ContainerStarted","Data":"179222269dc9a4bb2898e6d344e55732d41f9bf6062dffc9687e9e198212938d"} Apr 24 14:26:16.357043 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:16.357048 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92","Type":"ContainerStarted","Data":"d43be2e10d112127e7c37143de00f1dfd1b09974a99bed5198cb175bc1cdd485"} Apr 24 14:26:17.363254 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:17.363210 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" event={"ID":"cc9e438c-2d31-45f3-8009-2f9d7cbf559b","Type":"ContainerStarted","Data":"8090f0b2a9c6631849ed784487fc3ee582968e31820f630fa4fd848807ba26f9"} Apr 24 14:26:17.363254 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:17.363255 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" event={"ID":"cc9e438c-2d31-45f3-8009-2f9d7cbf559b","Type":"ContainerStarted","Data":"dc610b459271a7186634b1fa00c513b436aca8daa14c1dcad1e9f595582c4ba1"} Apr 24 14:26:17.363769 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:17.363269 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" event={"ID":"cc9e438c-2d31-45f3-8009-2f9d7cbf559b","Type":"ContainerStarted","Data":"f623fe9c886152804a3feba6a234d287b43fc85c1e5944e80a2621f66979891e"} Apr 24 14:26:17.364503 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:17.364480 2569 generic.go:358] "Generic (PLEG): container finished" podID="ec9e6a83-e9de-48cd-9b7d-783eafaf3f92" containerID="179222269dc9a4bb2898e6d344e55732d41f9bf6062dffc9687e9e198212938d" exitCode=0 Apr 24 14:26:17.364586 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:17.364549 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92","Type":"ContainerDied","Data":"179222269dc9a4bb2898e6d344e55732d41f9bf6062dffc9687e9e198212938d"} Apr 24 14:26:17.383057 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:17.383009 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/telemeter-client-64c55f559-vskpb" podStartSLOduration=1.549839944 podStartE2EDuration="3.382995578s" podCreationTimestamp="2026-04-24 14:26:14 +0000 UTC" firstStartedPulling="2026-04-24 14:26:14.615209528 +0000 UTC m=+127.409266410" lastFinishedPulling="2026-04-24 14:26:16.448365152 +0000 UTC m=+129.242422044" observedRunningTime="2026-04-24 14:26:17.381845457 +0000 UTC m=+130.175902364" watchObservedRunningTime="2026-04-24 14:26:17.382995578 +0000 UTC m=+130.177052481" Apr 24 14:26:18.370409 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:18.370365 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92","Type":"ContainerStarted","Data":"cb06dd8e597baef6b471d3c6d6885978d6398305e71fd69ad7178699d093d24e"} Apr 24 14:26:18.370409 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:18.370408 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92","Type":"ContainerStarted","Data":"29cd5e4ad5be113f83ec51a4fa65dc10d3d91eb72336d027112945d2d29f7587"} Apr 24 14:26:18.370409 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:18.370418 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92","Type":"ContainerStarted","Data":"d69be0ba69c79c7831a5921a274b02f72eef1b5c737c5185f8d0fc86111706a1"} Apr 24 14:26:18.371014 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:18.370427 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92","Type":"ContainerStarted","Data":"c2d1d45fef57b0517e474c6df2310a426c12a6b557ef545f6bbce2c714fa9df2"} Apr 24 14:26:18.371014 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:18.370435 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92","Type":"ContainerStarted","Data":"069a4cfd06b2a3ec07493b2476d2094d5e4061cf4a4154273771206706f80769"} Apr 24 14:26:18.371014 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:18.370443 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec9e6a83-e9de-48cd-9b7d-783eafaf3f92","Type":"ContainerStarted","Data":"73236ba0bae861e5519de604131a5296746e9f4a00672a7cd1d910d5fa271794"} Apr 24 14:26:18.396962 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:18.396909 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.396878902 podStartE2EDuration="3.396878902s" podCreationTimestamp="2026-04-24 14:26:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:26:18.394955039 +0000 UTC m=+131.189011939" watchObservedRunningTime="2026-04-24 14:26:18.396878902 +0000 UTC m=+131.190935802" Apr 24 14:26:20.727599 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:26:20.727542 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:15.727244 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:15.727207 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:15.742594 ip-10-0-129-231 kubenswrapper[2569]: 
Apr 24 14:27:15.742594 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:15.742567 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:16.560874 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:16.560845 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:54.715427 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:54.715390 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-jbgh4"]
Apr 24 14:27:54.718807 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:54.718787 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jbgh4"
Apr 24 14:27:54.720974 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:54.720950 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 14:27:54.725921 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:54.725877 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jbgh4"]
Apr 24 14:27:54.786149 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:54.786109 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6e81e6c3-58b8-46a4-8e92-6b8943176758-dbus\") pod \"global-pull-secret-syncer-jbgh4\" (UID: \"6e81e6c3-58b8-46a4-8e92-6b8943176758\") " pod="kube-system/global-pull-secret-syncer-jbgh4"
Apr 24 14:27:54.786321 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:54.786205 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6e81e6c3-58b8-46a4-8e92-6b8943176758-original-pull-secret\") pod \"global-pull-secret-syncer-jbgh4\" (UID: \"6e81e6c3-58b8-46a4-8e92-6b8943176758\") " pod="kube-system/global-pull-secret-syncer-jbgh4"
Apr 24 14:27:54.786321 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:54.786239 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6e81e6c3-58b8-46a4-8e92-6b8943176758-kubelet-config\") pod \"global-pull-secret-syncer-jbgh4\" (UID: \"6e81e6c3-58b8-46a4-8e92-6b8943176758\") " pod="kube-system/global-pull-secret-syncer-jbgh4"
Apr 24 14:27:54.886675 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:54.886640 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6e81e6c3-58b8-46a4-8e92-6b8943176758-kubelet-config\") pod \"global-pull-secret-syncer-jbgh4\" (UID: \"6e81e6c3-58b8-46a4-8e92-6b8943176758\") " pod="kube-system/global-pull-secret-syncer-jbgh4"
Apr 24 14:27:54.886675 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:54.886685 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6e81e6c3-58b8-46a4-8e92-6b8943176758-dbus\") pod \"global-pull-secret-syncer-jbgh4\" (UID: \"6e81e6c3-58b8-46a4-8e92-6b8943176758\") " pod="kube-system/global-pull-secret-syncer-jbgh4"
(UID: \"6e81e6c3-58b8-46a4-8e92-6b8943176758\") " pod="kube-system/global-pull-secret-syncer-jbgh4" Apr 24 14:27:54.886938 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:54.886782 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6e81e6c3-58b8-46a4-8e92-6b8943176758-kubelet-config\") pod \"global-pull-secret-syncer-jbgh4\" (UID: \"6e81e6c3-58b8-46a4-8e92-6b8943176758\") " pod="kube-system/global-pull-secret-syncer-jbgh4" Apr 24 14:27:54.887040 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:54.886955 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6e81e6c3-58b8-46a4-8e92-6b8943176758-dbus\") pod \"global-pull-secret-syncer-jbgh4\" (UID: \"6e81e6c3-58b8-46a4-8e92-6b8943176758\") " pod="kube-system/global-pull-secret-syncer-jbgh4" Apr 24 14:27:54.888954 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:54.888930 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6e81e6c3-58b8-46a4-8e92-6b8943176758-original-pull-secret\") pod \"global-pull-secret-syncer-jbgh4\" (UID: \"6e81e6c3-58b8-46a4-8e92-6b8943176758\") " pod="kube-system/global-pull-secret-syncer-jbgh4" Apr 24 14:27:55.029542 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:55.029445 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jbgh4" Apr 24 14:27:55.145780 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:55.145741 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jbgh4"] Apr 24 14:27:55.150409 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:27:55.150379 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e81e6c3_58b8_46a4_8e92_6b8943176758.slice/crio-c46d4d94ba1ba6bc247be7476f89657dd2d8f0e37dbdad55a9ed8db8278b6a09 WatchSource:0}: Error finding container c46d4d94ba1ba6bc247be7476f89657dd2d8f0e37dbdad55a9ed8db8278b6a09: Status 404 returned error can't find the container with id c46d4d94ba1ba6bc247be7476f89657dd2d8f0e37dbdad55a9ed8db8278b6a09 Apr 24 14:27:55.658333 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:55.658297 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jbgh4" event={"ID":"6e81e6c3-58b8-46a4-8e92-6b8943176758","Type":"ContainerStarted","Data":"c46d4d94ba1ba6bc247be7476f89657dd2d8f0e37dbdad55a9ed8db8278b6a09"} Apr 24 14:27:59.672020 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:59.671977 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jbgh4" event={"ID":"6e81e6c3-58b8-46a4-8e92-6b8943176758","Type":"ContainerStarted","Data":"8f6b97be92637e43492ab87e49415b28f5f737525b1c7e4095598c55e4f43e6c"} Apr 24 14:27:59.685568 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:59.685517 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-jbgh4" podStartSLOduration=1.891882853 podStartE2EDuration="5.685500428s" podCreationTimestamp="2026-04-24 14:27:54 +0000 UTC" firstStartedPulling="2026-04-24 14:27:55.152058969 +0000 UTC m=+227.946115848" lastFinishedPulling="2026-04-24 14:27:58.945676534 +0000 UTC m=+231.739733423" observedRunningTime="2026-04-24 14:27:59.684659652 +0000 UTC m=+232.478716557" watchObservedRunningTime="2026-04-24 
Apr 24 14:27:59.685568 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:27:59.685517 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-jbgh4" podStartSLOduration=1.891882853 podStartE2EDuration="5.685500428s" podCreationTimestamp="2026-04-24 14:27:54 +0000 UTC" firstStartedPulling="2026-04-24 14:27:55.152058969 +0000 UTC m=+227.946115848" lastFinishedPulling="2026-04-24 14:27:58.945676534 +0000 UTC m=+231.739733423" observedRunningTime="2026-04-24 14:27:59.684659652 +0000 UTC m=+232.478716557" watchObservedRunningTime="2026-04-24 14:27:59.685500428 +0000 UTC m=+232.479557332"
Apr 24 14:29:07.668383 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:29:07.668356 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log"
Apr 24 14:29:07.668921 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:29:07.668758 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log"
Apr 24 14:29:07.673770 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:29:07.673748 2569 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 14:31:30.333637 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.333549 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-2nnn4"]
Apr 24 14:31:30.336745 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.336728 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-2nnn4"
Apr 24 14:31:30.338882 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.338859 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 24 14:31:30.339291 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.339271 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-k57jn\""
Apr 24 14:31:30.339352 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.339271 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 24 14:31:30.339352 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.339272 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 24 14:31:30.346216 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.346164 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-2nnn4"]
Apr 24 14:31:30.351054 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.351027 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-fcvjw"]
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-fcvjw" Apr 24 14:31:30.357017 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.356996 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-wlc6k\"" Apr 24 14:31:30.357129 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.357019 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 24 14:31:30.364705 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.364681 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-fcvjw"] Apr 24 14:31:30.455158 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.455119 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k5w9\" (UniqueName: \"kubernetes.io/projected/828e9036-22ef-4b11-9e4b-23d76347c236-kube-api-access-6k5w9\") pod \"model-serving-api-86f7b4b499-2nnn4\" (UID: \"828e9036-22ef-4b11-9e4b-23d76347c236\") " pod="kserve/model-serving-api-86f7b4b499-2nnn4" Apr 24 14:31:30.455371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.455172 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfcww\" (UniqueName: \"kubernetes.io/projected/859fe824-f035-4f43-b93e-db5cf4147cca-kube-api-access-cfcww\") pod \"odh-model-controller-696fc77849-fcvjw\" (UID: \"859fe824-f035-4f43-b93e-db5cf4147cca\") " pod="kserve/odh-model-controller-696fc77849-fcvjw" Apr 24 14:31:30.455371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.455268 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859fe824-f035-4f43-b93e-db5cf4147cca-cert\") pod \"odh-model-controller-696fc77849-fcvjw\" (UID: \"859fe824-f035-4f43-b93e-db5cf4147cca\") " pod="kserve/odh-model-controller-696fc77849-fcvjw" Apr 24 14:31:30.455371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.455316 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/828e9036-22ef-4b11-9e4b-23d76347c236-tls-certs\") pod \"model-serving-api-86f7b4b499-2nnn4\" (UID: \"828e9036-22ef-4b11-9e4b-23d76347c236\") " pod="kserve/model-serving-api-86f7b4b499-2nnn4" Apr 24 14:31:30.556264 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.556227 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6k5w9\" (UniqueName: \"kubernetes.io/projected/828e9036-22ef-4b11-9e4b-23d76347c236-kube-api-access-6k5w9\") pod \"model-serving-api-86f7b4b499-2nnn4\" (UID: \"828e9036-22ef-4b11-9e4b-23d76347c236\") " pod="kserve/model-serving-api-86f7b4b499-2nnn4" Apr 24 14:31:30.556457 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.556290 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfcww\" (UniqueName: \"kubernetes.io/projected/859fe824-f035-4f43-b93e-db5cf4147cca-kube-api-access-cfcww\") pod \"odh-model-controller-696fc77849-fcvjw\" (UID: \"859fe824-f035-4f43-b93e-db5cf4147cca\") " pod="kserve/odh-model-controller-696fc77849-fcvjw" Apr 24 14:31:30.556457 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.556338 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/859fe824-f035-4f43-b93e-db5cf4147cca-cert\") pod \"odh-model-controller-696fc77849-fcvjw\" (UID: \"859fe824-f035-4f43-b93e-db5cf4147cca\") " pod="kserve/odh-model-controller-696fc77849-fcvjw" Apr 24 14:31:30.556457 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.556366 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/828e9036-22ef-4b11-9e4b-23d76347c236-tls-certs\") pod \"model-serving-api-86f7b4b499-2nnn4\" (UID: \"828e9036-22ef-4b11-9e4b-23d76347c236\") " pod="kserve/model-serving-api-86f7b4b499-2nnn4" Apr 24 14:31:30.556621 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:31:30.556485 2569 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 24 14:31:30.556621 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:31:30.556562 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/828e9036-22ef-4b11-9e4b-23d76347c236-tls-certs podName:828e9036-22ef-4b11-9e4b-23d76347c236 nodeName:}" failed. No retries permitted until 2026-04-24 14:31:31.056540684 +0000 UTC m=+443.850597564 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/828e9036-22ef-4b11-9e4b-23d76347c236-tls-certs") pod "model-serving-api-86f7b4b499-2nnn4" (UID: "828e9036-22ef-4b11-9e4b-23d76347c236") : secret "model-serving-api-tls" not found Apr 24 14:31:30.558904 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.558868 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859fe824-f035-4f43-b93e-db5cf4147cca-cert\") pod \"odh-model-controller-696fc77849-fcvjw\" (UID: \"859fe824-f035-4f43-b93e-db5cf4147cca\") " pod="kserve/odh-model-controller-696fc77849-fcvjw" Apr 24 14:31:30.566365 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.566341 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfcww\" (UniqueName: \"kubernetes.io/projected/859fe824-f035-4f43-b93e-db5cf4147cca-kube-api-access-cfcww\") pod \"odh-model-controller-696fc77849-fcvjw\" (UID: \"859fe824-f035-4f43-b93e-db5cf4147cca\") " pod="kserve/odh-model-controller-696fc77849-fcvjw" Apr 24 14:31:30.567161 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.567140 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k5w9\" (UniqueName: \"kubernetes.io/projected/828e9036-22ef-4b11-9e4b-23d76347c236-kube-api-access-6k5w9\") pod \"model-serving-api-86f7b4b499-2nnn4\" (UID: \"828e9036-22ef-4b11-9e4b-23d76347c236\") " pod="kserve/model-serving-api-86f7b4b499-2nnn4" Apr 24 14:31:30.665248 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.665219 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-fcvjw" Apr 24 14:31:30.785388 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.785209 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-fcvjw"] Apr 24 14:31:30.788045 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:31:30.788018 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod859fe824_f035_4f43_b93e_db5cf4147cca.slice/crio-5cd1c6a4ca9ef4d8c3f045c2ea5834e9b6ae0977e0df559c0a8e63f20a2a2ab3 WatchSource:0}: Error finding container 5cd1c6a4ca9ef4d8c3f045c2ea5834e9b6ae0977e0df559c0a8e63f20a2a2ab3: Status 404 returned error can't find the container with id 5cd1c6a4ca9ef4d8c3f045c2ea5834e9b6ae0977e0df559c0a8e63f20a2a2ab3 Apr 24 14:31:30.789268 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:30.789253 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:31:31.060696 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:31.060603 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/828e9036-22ef-4b11-9e4b-23d76347c236-tls-certs\") pod \"model-serving-api-86f7b4b499-2nnn4\" (UID: \"828e9036-22ef-4b11-9e4b-23d76347c236\") " pod="kserve/model-serving-api-86f7b4b499-2nnn4" Apr 24 14:31:31.063049 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:31.063030 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/828e9036-22ef-4b11-9e4b-23d76347c236-tls-certs\") pod \"model-serving-api-86f7b4b499-2nnn4\" (UID: \"828e9036-22ef-4b11-9e4b-23d76347c236\") " pod="kserve/model-serving-api-86f7b4b499-2nnn4" Apr 24 14:31:31.247825 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:31.247775 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-2nnn4" Apr 24 14:31:31.251241 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:31.251210 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-fcvjw" event={"ID":"859fe824-f035-4f43-b93e-db5cf4147cca","Type":"ContainerStarted","Data":"5cd1c6a4ca9ef4d8c3f045c2ea5834e9b6ae0977e0df559c0a8e63f20a2a2ab3"} Apr 24 14:31:31.382438 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:31.382398 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-2nnn4"] Apr 24 14:31:31.386287 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:31:31.386248 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod828e9036_22ef_4b11_9e4b_23d76347c236.slice/crio-36ea037a5c67ad79a5afbd3960fb974410799694ca96f68a2b29bc3087309e10 WatchSource:0}: Error finding container 36ea037a5c67ad79a5afbd3960fb974410799694ca96f68a2b29bc3087309e10: Status 404 returned error can't find the container with id 36ea037a5c67ad79a5afbd3960fb974410799694ca96f68a2b29bc3087309e10 Apr 24 14:31:32.257786 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:32.257608 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-2nnn4" event={"ID":"828e9036-22ef-4b11-9e4b-23d76347c236","Type":"ContainerStarted","Data":"36ea037a5c67ad79a5afbd3960fb974410799694ca96f68a2b29bc3087309e10"} Apr 24 14:31:35.269381 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:35.269336 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-fcvjw" event={"ID":"859fe824-f035-4f43-b93e-db5cf4147cca","Type":"ContainerStarted","Data":"60893be7572f6812f45a4d1ece69065d8a8343ce6256be6b61392bf29f4dba70"} Apr 24 14:31:35.269842 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:35.269511 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-fcvjw" Apr 24 14:31:35.270669 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:35.270649 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-2nnn4" event={"ID":"828e9036-22ef-4b11-9e4b-23d76347c236","Type":"ContainerStarted","Data":"afc574a499c5a8da0c18c715939a2c4f34da9dbedadef4ed168253d4c627a766"} Apr 24 14:31:35.270771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:35.270759 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-2nnn4" Apr 24 14:31:35.284650 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:35.284590 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-fcvjw" podStartSLOduration=1.4989830579999999 podStartE2EDuration="5.284572611s" podCreationTimestamp="2026-04-24 14:31:30 +0000 UTC" firstStartedPulling="2026-04-24 14:31:30.789381583 +0000 UTC m=+443.583438461" lastFinishedPulling="2026-04-24 14:31:34.574971132 +0000 UTC m=+447.369028014" observedRunningTime="2026-04-24 14:31:35.283082105 +0000 UTC m=+448.077139008" watchObservedRunningTime="2026-04-24 14:31:35.284572611 +0000 UTC m=+448.078629513" Apr 24 14:31:35.301064 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:35.301004 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-2nnn4" podStartSLOduration=2.114107133 podStartE2EDuration="5.30099123s" podCreationTimestamp="2026-04-24 14:31:30 
+0000 UTC" firstStartedPulling="2026-04-24 14:31:31.388453989 +0000 UTC m=+444.182510872" lastFinishedPulling="2026-04-24 14:31:34.575338087 +0000 UTC m=+447.369394969" observedRunningTime="2026-04-24 14:31:35.300630366 +0000 UTC m=+448.094687266" watchObservedRunningTime="2026-04-24 14:31:35.30099123 +0000 UTC m=+448.095048132" Apr 24 14:31:46.276625 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:46.276582 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-fcvjw" Apr 24 14:31:46.278619 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:46.278595 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-2nnn4" Apr 24 14:31:46.963292 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:46.963259 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-2cjh7"] Apr 24 14:31:46.969646 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:46.969621 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-2cjh7" Apr 24 14:31:46.971788 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:46.971757 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-lngv4\"" Apr 24 14:31:46.971925 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:46.971784 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 14:31:46.974355 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:46.974309 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-2cjh7"] Apr 24 14:31:46.985280 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:46.985242 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7zfw\" (UniqueName: \"kubernetes.io/projected/c386d0ac-d3e3-4ac1-b91c-c57e23d548b8-kube-api-access-p7zfw\") pod \"s3-init-2cjh7\" (UID: \"c386d0ac-d3e3-4ac1-b91c-c57e23d548b8\") " pod="kserve/s3-init-2cjh7" Apr 24 14:31:47.085952 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:47.085885 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7zfw\" (UniqueName: \"kubernetes.io/projected/c386d0ac-d3e3-4ac1-b91c-c57e23d548b8-kube-api-access-p7zfw\") pod \"s3-init-2cjh7\" (UID: \"c386d0ac-d3e3-4ac1-b91c-c57e23d548b8\") " pod="kserve/s3-init-2cjh7" Apr 24 14:31:47.094652 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:47.094628 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7zfw\" (UniqueName: \"kubernetes.io/projected/c386d0ac-d3e3-4ac1-b91c-c57e23d548b8-kube-api-access-p7zfw\") pod \"s3-init-2cjh7\" (UID: \"c386d0ac-d3e3-4ac1-b91c-c57e23d548b8\") " pod="kserve/s3-init-2cjh7" Apr 24 14:31:47.296237 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:47.296132 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-2cjh7" Apr 24 14:31:47.417791 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:47.417756 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-2cjh7"] Apr 24 14:31:47.420748 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:31:47.420715 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc386d0ac_d3e3_4ac1_b91c_c57e23d548b8.slice/crio-f80ac3b1e53f3157a70816da0cadaacf18bceca4086aa39378943776e0e809c8 WatchSource:0}: Error finding container f80ac3b1e53f3157a70816da0cadaacf18bceca4086aa39378943776e0e809c8: Status 404 returned error can't find the container with id f80ac3b1e53f3157a70816da0cadaacf18bceca4086aa39378943776e0e809c8 Apr 24 14:31:48.314940 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:48.314885 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-2cjh7" event={"ID":"c386d0ac-d3e3-4ac1-b91c-c57e23d548b8","Type":"ContainerStarted","Data":"f80ac3b1e53f3157a70816da0cadaacf18bceca4086aa39378943776e0e809c8"} Apr 24 14:31:53.332210 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:53.332171 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-2cjh7" event={"ID":"c386d0ac-d3e3-4ac1-b91c-c57e23d548b8","Type":"ContainerStarted","Data":"244d1af2c94fcf53147d38f3b064599d5c73458f5e2b677eb531b93693fbffd7"} Apr 24 14:31:53.345610 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:53.345558 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-2cjh7" podStartSLOduration=2.002777119 podStartE2EDuration="7.3455426s" podCreationTimestamp="2026-04-24 14:31:46 +0000 UTC" firstStartedPulling="2026-04-24 14:31:47.422512039 +0000 UTC m=+460.216568918" lastFinishedPulling="2026-04-24 14:31:52.765277517 +0000 UTC m=+465.559334399" observedRunningTime="2026-04-24 14:31:53.344733364 +0000 UTC m=+466.138790287" watchObservedRunningTime="2026-04-24 14:31:53.3455426 +0000 UTC m=+466.139599498" Apr 24 14:31:56.343814 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:56.343730 2569 generic.go:358] "Generic (PLEG): container finished" podID="c386d0ac-d3e3-4ac1-b91c-c57e23d548b8" containerID="244d1af2c94fcf53147d38f3b064599d5c73458f5e2b677eb531b93693fbffd7" exitCode=0 Apr 24 14:31:56.344179 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:56.343807 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-2cjh7" event={"ID":"c386d0ac-d3e3-4ac1-b91c-c57e23d548b8","Type":"ContainerDied","Data":"244d1af2c94fcf53147d38f3b064599d5c73458f5e2b677eb531b93693fbffd7"} Apr 24 14:31:57.479633 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:57.479610 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-2cjh7" Apr 24 14:31:57.565792 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:57.565742 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7zfw\" (UniqueName: \"kubernetes.io/projected/c386d0ac-d3e3-4ac1-b91c-c57e23d548b8-kube-api-access-p7zfw\") pod \"c386d0ac-d3e3-4ac1-b91c-c57e23d548b8\" (UID: \"c386d0ac-d3e3-4ac1-b91c-c57e23d548b8\") " Apr 24 14:31:57.567803 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:57.567771 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c386d0ac-d3e3-4ac1-b91c-c57e23d548b8-kube-api-access-p7zfw" (OuterVolumeSpecName: "kube-api-access-p7zfw") pod "c386d0ac-d3e3-4ac1-b91c-c57e23d548b8" (UID: "c386d0ac-d3e3-4ac1-b91c-c57e23d548b8"). InnerVolumeSpecName "kube-api-access-p7zfw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:31:57.667090 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:57.667059 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p7zfw\" (UniqueName: \"kubernetes.io/projected/c386d0ac-d3e3-4ac1-b91c-c57e23d548b8-kube-api-access-p7zfw\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:31:58.350806 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:58.350778 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-2cjh7" Apr 24 14:31:58.351001 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:58.350805 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-2cjh7" event={"ID":"c386d0ac-d3e3-4ac1-b91c-c57e23d548b8","Type":"ContainerDied","Data":"f80ac3b1e53f3157a70816da0cadaacf18bceca4086aa39378943776e0e809c8"} Apr 24 14:31:58.351001 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:58.350832 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f80ac3b1e53f3157a70816da0cadaacf18bceca4086aa39378943776e0e809c8" Apr 24 14:31:59.032624 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:59.032584 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p8477"] Apr 24 14:31:59.033094 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:59.033053 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c386d0ac-d3e3-4ac1-b91c-c57e23d548b8" containerName="s3-init" Apr 24 14:31:59.033094 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:59.033073 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c386d0ac-d3e3-4ac1-b91c-c57e23d548b8" containerName="s3-init" Apr 24 14:31:59.033205 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:59.033155 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c386d0ac-d3e3-4ac1-b91c-c57e23d548b8" containerName="s3-init" Apr 24 14:31:59.036089 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:59.036064 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p8477" Apr 24 14:31:59.038169 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:59.038142 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-lngv4\"" Apr 24 14:31:59.038285 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:59.038193 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 24 14:31:59.042465 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:59.042431 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p8477"] Apr 24 14:31:59.079099 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:59.079060 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqbpm\" (UniqueName: \"kubernetes.io/projected/6a1c184a-7524-403b-a55b-a934783576c5-kube-api-access-gqbpm\") pod \"seaweedfs-tls-custom-ddd4dbfd-p8477\" (UID: \"6a1c184a-7524-403b-a55b-a934783576c5\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p8477" Apr 24 14:31:59.079099 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:59.079100 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6a1c184a-7524-403b-a55b-a934783576c5-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-p8477\" (UID: \"6a1c184a-7524-403b-a55b-a934783576c5\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p8477" Apr 24 14:31:59.179780 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:59.179743 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqbpm\" (UniqueName: \"kubernetes.io/projected/6a1c184a-7524-403b-a55b-a934783576c5-kube-api-access-gqbpm\") pod \"seaweedfs-tls-custom-ddd4dbfd-p8477\" (UID: \"6a1c184a-7524-403b-a55b-a934783576c5\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p8477" Apr 24 14:31:59.179780 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:59.179779 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6a1c184a-7524-403b-a55b-a934783576c5-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-p8477\" (UID: \"6a1c184a-7524-403b-a55b-a934783576c5\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p8477" Apr 24 14:31:59.180138 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:59.180123 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6a1c184a-7524-403b-a55b-a934783576c5-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-p8477\" (UID: \"6a1c184a-7524-403b-a55b-a934783576c5\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p8477" Apr 24 14:31:59.188286 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:59.188266 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqbpm\" (UniqueName: \"kubernetes.io/projected/6a1c184a-7524-403b-a55b-a934783576c5-kube-api-access-gqbpm\") pod \"seaweedfs-tls-custom-ddd4dbfd-p8477\" (UID: \"6a1c184a-7524-403b-a55b-a934783576c5\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p8477" Apr 24 14:31:59.346238 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:59.346148 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p8477" Apr 24 14:31:59.463819 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:31:59.463790 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p8477"] Apr 24 14:31:59.466517 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:31:59.466484 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a1c184a_7524_403b_a55b_a934783576c5.slice/crio-714dc6e551e869b3dc9066ffc8d6b98c987a03f2c6a7b420dbe21e15e6b41460 WatchSource:0}: Error finding container 714dc6e551e869b3dc9066ffc8d6b98c987a03f2c6a7b420dbe21e15e6b41460: Status 404 returned error can't find the container with id 714dc6e551e869b3dc9066ffc8d6b98c987a03f2c6a7b420dbe21e15e6b41460 Apr 24 14:32:00.358233 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:00.358194 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p8477" event={"ID":"6a1c184a-7524-403b-a55b-a934783576c5","Type":"ContainerStarted","Data":"714dc6e551e869b3dc9066ffc8d6b98c987a03f2c6a7b420dbe21e15e6b41460"} Apr 24 14:32:02.366493 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:02.366406 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p8477" event={"ID":"6a1c184a-7524-403b-a55b-a934783576c5","Type":"ContainerStarted","Data":"2a9af68e4f8f7520395b591b0726d8cb3d7299b2b4c8ddc25e7d62550429971c"} Apr 24 14:32:02.382945 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:02.382879 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p8477" podStartSLOduration=0.879180909 podStartE2EDuration="3.382864966s" podCreationTimestamp="2026-04-24 14:31:59 +0000 UTC" firstStartedPulling="2026-04-24 14:31:59.467888865 +0000 UTC m=+472.261945751" lastFinishedPulling="2026-04-24 14:32:01.971572916 +0000 UTC m=+474.765629808" observedRunningTime="2026-04-24 14:32:02.380980275 +0000 UTC m=+475.175037200" watchObservedRunningTime="2026-04-24 14:32:02.382864966 +0000 UTC m=+475.176921866" Apr 24 14:32:03.664350 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:03.664308 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p8477"] Apr 24 14:32:04.371821 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:04.371764 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p8477" podUID="6a1c184a-7524-403b-a55b-a934783576c5" containerName="seaweedfs-tls-custom" containerID="cri-o://2a9af68e4f8f7520395b591b0726d8cb3d7299b2b4c8ddc25e7d62550429971c" gracePeriod=30 Apr 24 14:32:05.621241 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:05.621216 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p8477" Apr 24 14:32:05.734511 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:05.734480 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6a1c184a-7524-403b-a55b-a934783576c5-data\") pod \"6a1c184a-7524-403b-a55b-a934783576c5\" (UID: \"6a1c184a-7524-403b-a55b-a934783576c5\") " Apr 24 14:32:05.734693 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:05.734530 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqbpm\" (UniqueName: \"kubernetes.io/projected/6a1c184a-7524-403b-a55b-a934783576c5-kube-api-access-gqbpm\") pod \"6a1c184a-7524-403b-a55b-a934783576c5\" (UID: \"6a1c184a-7524-403b-a55b-a934783576c5\") " Apr 24 14:32:05.735768 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:05.735741 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a1c184a-7524-403b-a55b-a934783576c5-data" (OuterVolumeSpecName: "data") pod "6a1c184a-7524-403b-a55b-a934783576c5" (UID: "6a1c184a-7524-403b-a55b-a934783576c5"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:32:05.736497 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:05.736470 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a1c184a-7524-403b-a55b-a934783576c5-kube-api-access-gqbpm" (OuterVolumeSpecName: "kube-api-access-gqbpm") pod "6a1c184a-7524-403b-a55b-a934783576c5" (UID: "6a1c184a-7524-403b-a55b-a934783576c5"). InnerVolumeSpecName "kube-api-access-gqbpm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:32:05.835064 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:05.835042 2569 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6a1c184a-7524-403b-a55b-a934783576c5-data\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:32:05.835064 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:05.835066 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gqbpm\" (UniqueName: \"kubernetes.io/projected/6a1c184a-7524-403b-a55b-a934783576c5-kube-api-access-gqbpm\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:32:06.377970 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:06.377934 2569 generic.go:358] "Generic (PLEG): container finished" podID="6a1c184a-7524-403b-a55b-a934783576c5" containerID="2a9af68e4f8f7520395b591b0726d8cb3d7299b2b4c8ddc25e7d62550429971c" exitCode=0 Apr 24 14:32:06.378142 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:06.377998 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p8477" Apr 24 14:32:06.378142 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:06.378021 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p8477" event={"ID":"6a1c184a-7524-403b-a55b-a934783576c5","Type":"ContainerDied","Data":"2a9af68e4f8f7520395b591b0726d8cb3d7299b2b4c8ddc25e7d62550429971c"} Apr 24 14:32:06.378142 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:06.378062 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p8477" event={"ID":"6a1c184a-7524-403b-a55b-a934783576c5","Type":"ContainerDied","Data":"714dc6e551e869b3dc9066ffc8d6b98c987a03f2c6a7b420dbe21e15e6b41460"} Apr 24 14:32:06.378142 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:06.378082 2569 scope.go:117] "RemoveContainer" containerID="2a9af68e4f8f7520395b591b0726d8cb3d7299b2b4c8ddc25e7d62550429971c" Apr 24 14:32:06.386932 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:06.386860 2569 scope.go:117] "RemoveContainer" containerID="2a9af68e4f8f7520395b591b0726d8cb3d7299b2b4c8ddc25e7d62550429971c" Apr 24 14:32:06.387165 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:32:06.387144 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a9af68e4f8f7520395b591b0726d8cb3d7299b2b4c8ddc25e7d62550429971c\": container with ID starting with 2a9af68e4f8f7520395b591b0726d8cb3d7299b2b4c8ddc25e7d62550429971c not found: ID does not exist" containerID="2a9af68e4f8f7520395b591b0726d8cb3d7299b2b4c8ddc25e7d62550429971c" Apr 24 14:32:06.387230 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:06.387173 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a9af68e4f8f7520395b591b0726d8cb3d7299b2b4c8ddc25e7d62550429971c"} err="failed to get container status \"2a9af68e4f8f7520395b591b0726d8cb3d7299b2b4c8ddc25e7d62550429971c\": rpc error: code = NotFound desc = could not find container \"2a9af68e4f8f7520395b591b0726d8cb3d7299b2b4c8ddc25e7d62550429971c\": container with ID starting with 2a9af68e4f8f7520395b591b0726d8cb3d7299b2b4c8ddc25e7d62550429971c not found: ID does not exist" Apr 24 14:32:06.392009 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:06.391987 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p8477"] Apr 24 14:32:06.395438 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:06.395418 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p8477"] Apr 24 14:32:07.827078 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:07.827041 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a1c184a-7524-403b-a55b-a934783576c5" path="/var/lib/kubelet/pods/6a1c184a-7524-403b-a55b-a934783576c5/volumes" Apr 24 14:32:08.007991 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:08.007908 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-z859z"] Apr 24 14:32:08.008240 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:08.008227 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a1c184a-7524-403b-a55b-a934783576c5" containerName="seaweedfs-tls-custom" Apr 24 14:32:08.008290 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:08.008241 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1c184a-7524-403b-a55b-a934783576c5" containerName="seaweedfs-tls-custom" Apr 24 14:32:08.008290 ip-10-0-129-231 
Apr 24 14:32:08.008290 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:08.008285 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a1c184a-7524-403b-a55b-a934783576c5" containerName="seaweedfs-tls-custom"
Apr 24 14:32:08.012665 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:08.012649 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-z859z"
Apr 24 14:32:08.015104 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:08.015081 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-lngv4\""
Apr 24 14:32:08.015702 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:08.015082 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 24 14:32:08.017025 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:08.017000 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-z859z"]
Apr 24 14:32:08.053691 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:08.053652 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdvtx\" (UniqueName: \"kubernetes.io/projected/07fc62cc-dc14-4433-b588-31a1a40d5d27-kube-api-access-fdvtx\") pod \"s3-tls-init-custom-z859z\" (UID: \"07fc62cc-dc14-4433-b588-31a1a40d5d27\") " pod="kserve/s3-tls-init-custom-z859z"
Apr 24 14:32:08.154641 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:08.154601 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdvtx\" (UniqueName: \"kubernetes.io/projected/07fc62cc-dc14-4433-b588-31a1a40d5d27-kube-api-access-fdvtx\") pod \"s3-tls-init-custom-z859z\" (UID: \"07fc62cc-dc14-4433-b588-31a1a40d5d27\") " pod="kserve/s3-tls-init-custom-z859z"
Apr 24 14:32:08.163823 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:08.163797 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdvtx\" (UniqueName: \"kubernetes.io/projected/07fc62cc-dc14-4433-b588-31a1a40d5d27-kube-api-access-fdvtx\") pod \"s3-tls-init-custom-z859z\" (UID: \"07fc62cc-dc14-4433-b588-31a1a40d5d27\") " pod="kserve/s3-tls-init-custom-z859z"
Need to start a new one" pod="kserve/s3-tls-init-custom-z859z" Apr 24 14:32:08.455295 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:08.455258 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-z859z"] Apr 24 14:32:08.458234 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:32:08.458203 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07fc62cc_dc14_4433_b588_31a1a40d5d27.slice/crio-1fc99ecfbc0ad956c22ec1e95beca49bbaa08fafede380374b35d0b2c5e2535a WatchSource:0}: Error finding container 1fc99ecfbc0ad956c22ec1e95beca49bbaa08fafede380374b35d0b2c5e2535a: Status 404 returned error can't find the container with id 1fc99ecfbc0ad956c22ec1e95beca49bbaa08fafede380374b35d0b2c5e2535a Apr 24 14:32:09.389944 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:09.389888 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-z859z" event={"ID":"07fc62cc-dc14-4433-b588-31a1a40d5d27","Type":"ContainerStarted","Data":"26eb7fe57c1cfbdf1d26aae1749a1f290f91abd6a19660ef0c27d6742971ab6d"} Apr 24 14:32:09.389944 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:09.389948 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-z859z" event={"ID":"07fc62cc-dc14-4433-b588-31a1a40d5d27","Type":"ContainerStarted","Data":"1fc99ecfbc0ad956c22ec1e95beca49bbaa08fafede380374b35d0b2c5e2535a"} Apr 24 14:32:09.413646 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:09.413590 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-z859z" podStartSLOduration=2.413572972 podStartE2EDuration="2.413572972s" podCreationTimestamp="2026-04-24 14:32:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:32:09.412252719 +0000 UTC m=+482.206309619" watchObservedRunningTime="2026-04-24 14:32:09.413572972 +0000 UTC m=+482.207629873" Apr 24 14:32:13.402525 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:13.402490 2569 generic.go:358] "Generic (PLEG): container finished" podID="07fc62cc-dc14-4433-b588-31a1a40d5d27" containerID="26eb7fe57c1cfbdf1d26aae1749a1f290f91abd6a19660ef0c27d6742971ab6d" exitCode=0 Apr 24 14:32:13.402941 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:13.402551 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-z859z" event={"ID":"07fc62cc-dc14-4433-b588-31a1a40d5d27","Type":"ContainerDied","Data":"26eb7fe57c1cfbdf1d26aae1749a1f290f91abd6a19660ef0c27d6742971ab6d"} Apr 24 14:32:14.532040 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:14.532017 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-z859z" Apr 24 14:32:14.609792 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:14.609760 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdvtx\" (UniqueName: \"kubernetes.io/projected/07fc62cc-dc14-4433-b588-31a1a40d5d27-kube-api-access-fdvtx\") pod \"07fc62cc-dc14-4433-b588-31a1a40d5d27\" (UID: \"07fc62cc-dc14-4433-b588-31a1a40d5d27\") " Apr 24 14:32:14.611807 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:14.611784 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07fc62cc-dc14-4433-b588-31a1a40d5d27-kube-api-access-fdvtx" (OuterVolumeSpecName: "kube-api-access-fdvtx") pod "07fc62cc-dc14-4433-b588-31a1a40d5d27" (UID: "07fc62cc-dc14-4433-b588-31a1a40d5d27"). InnerVolumeSpecName "kube-api-access-fdvtx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:32:14.710506 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:14.710465 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fdvtx\" (UniqueName: \"kubernetes.io/projected/07fc62cc-dc14-4433-b588-31a1a40d5d27-kube-api-access-fdvtx\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:32:15.408855 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:15.408818 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-z859z" event={"ID":"07fc62cc-dc14-4433-b588-31a1a40d5d27","Type":"ContainerDied","Data":"1fc99ecfbc0ad956c22ec1e95beca49bbaa08fafede380374b35d0b2c5e2535a"} Apr 24 14:32:15.408855 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:15.408854 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fc99ecfbc0ad956c22ec1e95beca49bbaa08fafede380374b35d0b2c5e2535a" Apr 24 14:32:15.408855 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:15.408825 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-z859z" Apr 24 14:32:16.013938 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.013909 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-lstx2"] Apr 24 14:32:16.014298 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.014212 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07fc62cc-dc14-4433-b588-31a1a40d5d27" containerName="s3-tls-init-custom" Apr 24 14:32:16.014298 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.014222 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="07fc62cc-dc14-4433-b588-31a1a40d5d27" containerName="s3-tls-init-custom" Apr 24 14:32:16.014298 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.014266 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="07fc62cc-dc14-4433-b588-31a1a40d5d27" containerName="s3-tls-init-custom" Apr 24 14:32:16.017299 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.017283 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-lstx2" Apr 24 14:32:16.030152 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.030128 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 24 14:32:16.030715 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.030702 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 24 14:32:16.031103 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.031087 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-lngv4\"" Apr 24 14:32:16.051723 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.051691 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-lstx2"] Apr 24 14:32:16.121230 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.121199 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa-data\") pod \"seaweedfs-tls-serving-7fd5766db9-lstx2\" (UID: \"dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-lstx2" Apr 24 14:32:16.121412 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.121266 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-lstx2\" (UID: \"dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-lstx2" Apr 24 14:32:16.121412 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.121364 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brb98\" (UniqueName: \"kubernetes.io/projected/dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa-kube-api-access-brb98\") pod \"seaweedfs-tls-serving-7fd5766db9-lstx2\" (UID: \"dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-lstx2" Apr 24 14:32:16.222555 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.222514 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-lstx2\" (UID: \"dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-lstx2" Apr 24 14:32:16.222729 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.222611 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brb98\" (UniqueName: \"kubernetes.io/projected/dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa-kube-api-access-brb98\") pod \"seaweedfs-tls-serving-7fd5766db9-lstx2\" (UID: \"dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-lstx2" Apr 24 14:32:16.222729 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.222658 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa-data\") pod \"seaweedfs-tls-serving-7fd5766db9-lstx2\" (UID: \"dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-lstx2" Apr 24 14:32:16.223109 ip-10-0-129-231 kubenswrapper[2569]: 
Apr 24 14:32:16.223109 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.223090 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa-data\") pod \"seaweedfs-tls-serving-7fd5766db9-lstx2\" (UID: \"dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-lstx2"
Apr 24 14:32:16.225093 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.225072 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-lstx2\" (UID: \"dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-lstx2"
Apr 24 14:32:16.238475 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.238446 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brb98\" (UniqueName: \"kubernetes.io/projected/dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa-kube-api-access-brb98\") pod \"seaweedfs-tls-serving-7fd5766db9-lstx2\" (UID: \"dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-lstx2"
Apr 24 14:32:16.326054 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.325961 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-lstx2"
Apr 24 14:32:16.447808 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:16.447777 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-lstx2"]
Apr 24 14:32:16.451094 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:32:16.451060 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddedfe7b4_3fd9_43f5_8aad_cb4ee3ddc9aa.slice/crio-6394c718118be92f4dfacd23f46734ca339a395f50bbb04073784fbd316ad669 WatchSource:0}: Error finding container 6394c718118be92f4dfacd23f46734ca339a395f50bbb04073784fbd316ad669: Status 404 returned error can't find the container with id 6394c718118be92f4dfacd23f46734ca339a395f50bbb04073784fbd316ad669
Apr 24 14:32:17.418107 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:17.418054 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-lstx2" event={"ID":"dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa","Type":"ContainerStarted","Data":"05dfa6842f1d1f0adadd16af58b5329f3a7797456dd47d9b838ea4968a9f55fa"}
Apr 24 14:32:17.418606 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:17.418117 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-lstx2" event={"ID":"dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa","Type":"ContainerStarted","Data":"6394c718118be92f4dfacd23f46734ca339a395f50bbb04073784fbd316ad669"}
Apr 24 14:32:17.433581 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:17.433530 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-lstx2" podStartSLOduration=2.173667715 podStartE2EDuration="2.433514826s" podCreationTimestamp="2026-04-24 14:32:15 +0000 UTC" firstStartedPulling="2026-04-24 14:32:16.452489342 +0000 UTC m=+489.246546222" lastFinishedPulling="2026-04-24 14:32:16.71233645 +0000 UTC m=+489.506393333" observedRunningTime="2026-04-24 14:32:17.431993672 +0000 UTC m=+490.226050573" watchObservedRunningTime="2026-04-24 14:32:17.433514826 +0000 UTC m=+490.227571744"
14:32:17.956034 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-rgjhp"] Apr 24 14:32:17.960472 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:17.960452 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-rgjhp" Apr 24 14:32:17.966038 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:17.966013 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-rgjhp"] Apr 24 14:32:18.037115 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:18.037068 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqqwd\" (UniqueName: \"kubernetes.io/projected/e6a4b37e-7c77-44e6-9bd3-994938f6dba6-kube-api-access-gqqwd\") pod \"s3-tls-init-serving-rgjhp\" (UID: \"e6a4b37e-7c77-44e6-9bd3-994938f6dba6\") " pod="kserve/s3-tls-init-serving-rgjhp" Apr 24 14:32:18.137848 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:18.137806 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqqwd\" (UniqueName: \"kubernetes.io/projected/e6a4b37e-7c77-44e6-9bd3-994938f6dba6-kube-api-access-gqqwd\") pod \"s3-tls-init-serving-rgjhp\" (UID: \"e6a4b37e-7c77-44e6-9bd3-994938f6dba6\") " pod="kserve/s3-tls-init-serving-rgjhp" Apr 24 14:32:18.145444 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:18.145422 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqqwd\" (UniqueName: \"kubernetes.io/projected/e6a4b37e-7c77-44e6-9bd3-994938f6dba6-kube-api-access-gqqwd\") pod \"s3-tls-init-serving-rgjhp\" (UID: \"e6a4b37e-7c77-44e6-9bd3-994938f6dba6\") " pod="kserve/s3-tls-init-serving-rgjhp" Apr 24 14:32:18.286113 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:18.286024 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-rgjhp" Apr 24 14:32:18.409126 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:18.409092 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-rgjhp"] Apr 24 14:32:18.412155 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:32:18.412124 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6a4b37e_7c77_44e6_9bd3_994938f6dba6.slice/crio-21dc7272398a263e62d7e40a787547ff28738b6fc16f1c2615158d56d9462f23 WatchSource:0}: Error finding container 21dc7272398a263e62d7e40a787547ff28738b6fc16f1c2615158d56d9462f23: Status 404 returned error can't find the container with id 21dc7272398a263e62d7e40a787547ff28738b6fc16f1c2615158d56d9462f23 Apr 24 14:32:18.421721 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:18.421697 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-rgjhp" event={"ID":"e6a4b37e-7c77-44e6-9bd3-994938f6dba6","Type":"ContainerStarted","Data":"21dc7272398a263e62d7e40a787547ff28738b6fc16f1c2615158d56d9462f23"} Apr 24 14:32:19.425762 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:19.425729 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-rgjhp" event={"ID":"e6a4b37e-7c77-44e6-9bd3-994938f6dba6","Type":"ContainerStarted","Data":"6eb93b2b4fcd67c5efe7de820b6323323e01763b4fe91f7c68b29bcdfb11f321"} Apr 24 14:32:19.444022 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:19.443965 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-rgjhp" podStartSLOduration=2.443943301 podStartE2EDuration="2.443943301s" podCreationTimestamp="2026-04-24 14:32:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:32:19.442604899 +0000 UTC m=+492.236661801" watchObservedRunningTime="2026-04-24 14:32:19.443943301 +0000 UTC m=+492.238000204" Apr 24 14:32:23.439332 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:23.439299 2569 generic.go:358] "Generic (PLEG): container finished" podID="e6a4b37e-7c77-44e6-9bd3-994938f6dba6" containerID="6eb93b2b4fcd67c5efe7de820b6323323e01763b4fe91f7c68b29bcdfb11f321" exitCode=0 Apr 24 14:32:23.439683 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:23.439370 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-rgjhp" event={"ID":"e6a4b37e-7c77-44e6-9bd3-994938f6dba6","Type":"ContainerDied","Data":"6eb93b2b4fcd67c5efe7de820b6323323e01763b4fe91f7c68b29bcdfb11f321"} Apr 24 14:32:24.575638 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:24.575615 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-rgjhp" Apr 24 14:32:24.694518 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:24.694434 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqqwd\" (UniqueName: \"kubernetes.io/projected/e6a4b37e-7c77-44e6-9bd3-994938f6dba6-kube-api-access-gqqwd\") pod \"e6a4b37e-7c77-44e6-9bd3-994938f6dba6\" (UID: \"e6a4b37e-7c77-44e6-9bd3-994938f6dba6\") " Apr 24 14:32:24.696415 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:24.696382 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6a4b37e-7c77-44e6-9bd3-994938f6dba6-kube-api-access-gqqwd" (OuterVolumeSpecName: "kube-api-access-gqqwd") pod "e6a4b37e-7c77-44e6-9bd3-994938f6dba6" (UID: "e6a4b37e-7c77-44e6-9bd3-994938f6dba6"). InnerVolumeSpecName "kube-api-access-gqqwd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:32:24.795202 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:24.795162 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gqqwd\" (UniqueName: \"kubernetes.io/projected/e6a4b37e-7c77-44e6-9bd3-994938f6dba6-kube-api-access-gqqwd\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:32:25.446697 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:25.446665 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-rgjhp" Apr 24 14:32:25.446876 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:25.446669 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-rgjhp" event={"ID":"e6a4b37e-7c77-44e6-9bd3-994938f6dba6","Type":"ContainerDied","Data":"21dc7272398a263e62d7e40a787547ff28738b6fc16f1c2615158d56d9462f23"} Apr 24 14:32:25.446876 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:25.446767 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21dc7272398a263e62d7e40a787547ff28738b6fc16f1c2615158d56d9462f23" Apr 24 14:32:36.303911 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:36.303847 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp"] Apr 24 14:32:36.304376 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:36.304202 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6a4b37e-7c77-44e6-9bd3-994938f6dba6" containerName="s3-tls-init-serving" Apr 24 14:32:36.304376 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:36.304216 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a4b37e-7c77-44e6-9bd3-994938f6dba6" containerName="s3-tls-init-serving" Apr 24 14:32:36.304376 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:36.304273 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6a4b37e-7c77-44e6-9bd3-994938f6dba6" containerName="s3-tls-init-serving" Apr 24 14:32:36.308737 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:36.308719 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" Apr 24 14:32:36.310733 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:36.310715 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5qb2v\"" Apr 24 14:32:36.314185 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:36.314149 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp"] Apr 24 14:32:36.402835 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:36.402794 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10621777-0fdb-41fd-8126-87bc46a928a9-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp\" (UID: \"10621777-0fdb-41fd-8126-87bc46a928a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" Apr 24 14:32:36.503968 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:36.503916 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10621777-0fdb-41fd-8126-87bc46a928a9-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp\" (UID: \"10621777-0fdb-41fd-8126-87bc46a928a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" Apr 24 14:32:36.504246 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:36.504227 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10621777-0fdb-41fd-8126-87bc46a928a9-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp\" (UID: \"10621777-0fdb-41fd-8126-87bc46a928a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" Apr 24 14:32:36.620645 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:36.620556 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" Apr 24 14:32:36.741792 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:36.741749 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp"] Apr 24 14:32:36.744608 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:32:36.744577 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10621777_0fdb_41fd_8126_87bc46a928a9.slice/crio-f6583253587d377500ec6b3ec9805c008d488df512ce9805cf11cd77403e4c32 WatchSource:0}: Error finding container f6583253587d377500ec6b3ec9805c008d488df512ce9805cf11cd77403e4c32: Status 404 returned error can't find the container with id f6583253587d377500ec6b3ec9805c008d488df512ce9805cf11cd77403e4c32 Apr 24 14:32:37.481671 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:37.481624 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" event={"ID":"10621777-0fdb-41fd-8126-87bc46a928a9","Type":"ContainerStarted","Data":"f6583253587d377500ec6b3ec9805c008d488df512ce9805cf11cd77403e4c32"} Apr 24 14:32:40.492469 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:40.492424 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" event={"ID":"10621777-0fdb-41fd-8126-87bc46a928a9","Type":"ContainerStarted","Data":"a96725f44f3eaee73eaa01a49e119e346b7b88a7042022daf97d906cdf73e609"} Apr 24 14:32:43.503338 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:43.503307 2569 generic.go:358] "Generic (PLEG): container finished" podID="10621777-0fdb-41fd-8126-87bc46a928a9" containerID="a96725f44f3eaee73eaa01a49e119e346b7b88a7042022daf97d906cdf73e609" exitCode=0 Apr 24 14:32:43.503707 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:43.503359 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" event={"ID":"10621777-0fdb-41fd-8126-87bc46a928a9","Type":"ContainerDied","Data":"a96725f44f3eaee73eaa01a49e119e346b7b88a7042022daf97d906cdf73e609"} Apr 24 14:32:56.556826 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:32:56.556791 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" event={"ID":"10621777-0fdb-41fd-8126-87bc46a928a9","Type":"ContainerStarted","Data":"f6331c0cf21eab9c3ec0ca1809c71b877e8e8b684a4854698627f545905e100e"} Apr 24 14:33:00.571135 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:00.571091 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" event={"ID":"10621777-0fdb-41fd-8126-87bc46a928a9","Type":"ContainerStarted","Data":"1a9a04bb5b23a5dc43ff7b5bb798012fd3b38dc005302e75d75483971acfb8ff"} Apr 24 14:33:00.571586 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:00.571317 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" Apr 24 14:33:00.571586 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:00.571347 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" Apr 24 14:33:00.572824 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:00.572769 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 14:33:00.573446 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:00.573421 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:33:00.595286 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:00.595233 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podStartSLOduration=1.711173496 podStartE2EDuration="24.595220837s" podCreationTimestamp="2026-04-24 14:32:36 +0000 UTC" firstStartedPulling="2026-04-24 14:32:36.746341084 +0000 UTC m=+509.540397966" lastFinishedPulling="2026-04-24 14:32:59.630388425 +0000 UTC m=+532.424445307" observedRunningTime="2026-04-24 14:33:00.593247184 +0000 UTC m=+533.387304084" watchObservedRunningTime="2026-04-24 14:33:00.595220837 +0000 UTC m=+533.389277738" Apr 24 14:33:01.575381 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:01.575332 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 14:33:01.575797 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:01.575624 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:33:11.575995 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:11.575956 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 14:33:11.576489 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:11.576449 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:33:21.575539 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:21.575478 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 14:33:21.575998 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:21.575969 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:33:31.575639 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:31.575590 2569 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 14:33:31.576138 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:31.576032 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:33:41.575854 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:41.575804 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 14:33:41.576304 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:41.576282 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:33:51.575612 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:51.575564 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 14:33:51.576099 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:33:51.576055 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:34:01.575879 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:01.575828 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 14:34:01.576368 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:01.576247 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:34:07.690272 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:07.690243 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 14:34:07.692888 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:07.692859 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 14:34:11.576031 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:11.575999 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" Apr 24 14:34:11.576406 
ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:11.576062 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" Apr 24 14:34:21.414937 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:21.414886 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp"] Apr 24 14:34:21.415458 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:21.415242 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="kserve-container" containerID="cri-o://f6331c0cf21eab9c3ec0ca1809c71b877e8e8b684a4854698627f545905e100e" gracePeriod=30 Apr 24 14:34:21.415458 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:21.415304 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="agent" containerID="cri-o://1a9a04bb5b23a5dc43ff7b5bb798012fd3b38dc005302e75d75483971acfb8ff" gracePeriod=30 Apr 24 14:34:21.487362 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:21.487325 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl"] Apr 24 14:34:21.490871 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:21.490851 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" Apr 24 14:34:21.499318 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:21.499278 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl"] Apr 24 14:34:21.530088 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:21.530056 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c28a7590-d921-44ed-b9d4-2bb7388dfcae-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl\" (UID: \"c28a7590-d921-44ed-b9d4-2bb7388dfcae\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" Apr 24 14:34:21.575830 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:21.575788 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 14:34:21.576142 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:21.576114 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:34:21.631246 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:21.631208 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c28a7590-d921-44ed-b9d4-2bb7388dfcae-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl\" (UID: \"c28a7590-d921-44ed-b9d4-2bb7388dfcae\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" Apr 24 14:34:21.631600 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:21.631579 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c28a7590-d921-44ed-b9d4-2bb7388dfcae-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl\" (UID: \"c28a7590-d921-44ed-b9d4-2bb7388dfcae\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" Apr 24 14:34:21.802742 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:21.802662 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" Apr 24 14:34:21.924494 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:21.924453 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl"] Apr 24 14:34:21.927947 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:34:21.927918 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc28a7590_d921_44ed_b9d4_2bb7388dfcae.slice/crio-89475455be9e90b640a3bf99d5f090ca32805666663ba2cf495b2d431e99e8f3 WatchSource:0}: Error finding container 89475455be9e90b640a3bf99d5f090ca32805666663ba2cf495b2d431e99e8f3: Status 404 returned error can't find the container with id 89475455be9e90b640a3bf99d5f090ca32805666663ba2cf495b2d431e99e8f3 Apr 24 14:34:22.830666 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:22.830627 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" event={"ID":"c28a7590-d921-44ed-b9d4-2bb7388dfcae","Type":"ContainerStarted","Data":"09c9e5138b60cc38724d44681cdc7e91efa527104abf617e4e4afe7a303a00a7"} Apr 24 14:34:22.830666 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:22.830667 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" event={"ID":"c28a7590-d921-44ed-b9d4-2bb7388dfcae","Type":"ContainerStarted","Data":"89475455be9e90b640a3bf99d5f090ca32805666663ba2cf495b2d431e99e8f3"} Apr 24 14:34:25.840001 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:25.839965 2569 generic.go:358] "Generic (PLEG): container finished" podID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerID="09c9e5138b60cc38724d44681cdc7e91efa527104abf617e4e4afe7a303a00a7" exitCode=0 Apr 24 14:34:25.840375 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:25.840035 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" event={"ID":"c28a7590-d921-44ed-b9d4-2bb7388dfcae","Type":"ContainerDied","Data":"09c9e5138b60cc38724d44681cdc7e91efa527104abf617e4e4afe7a303a00a7"} Apr 24 14:34:26.845008 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:26.844923 2569 generic.go:358] "Generic (PLEG): container finished" podID="10621777-0fdb-41fd-8126-87bc46a928a9" containerID="f6331c0cf21eab9c3ec0ca1809c71b877e8e8b684a4854698627f545905e100e" exitCode=0 Apr 24 14:34:26.845410 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:26.845000 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" 
event={"ID":"10621777-0fdb-41fd-8126-87bc46a928a9","Type":"ContainerDied","Data":"f6331c0cf21eab9c3ec0ca1809c71b877e8e8b684a4854698627f545905e100e"} Apr 24 14:34:26.846925 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:26.846885 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" event={"ID":"c28a7590-d921-44ed-b9d4-2bb7388dfcae","Type":"ContainerStarted","Data":"9f7ce3fdf1bfa872d0b22df7d4878100c7a4cf009e0d7c2773f7bb28c5f0b30c"} Apr 24 14:34:26.847045 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:26.846933 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" event={"ID":"c28a7590-d921-44ed-b9d4-2bb7388dfcae","Type":"ContainerStarted","Data":"aa51794ddae611099f335d94a0133734d3426a89e94a1b0d8736b77be4079cc1"} Apr 24 14:34:26.847266 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:26.847224 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" Apr 24 14:34:26.848669 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:26.848643 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:5000: connect: connection refused" Apr 24 14:34:26.862979 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:26.862924 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podStartSLOduration=5.862906384 podStartE2EDuration="5.862906384s" podCreationTimestamp="2026-04-24 14:34:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:34:26.860721951 +0000 UTC m=+619.654778873" watchObservedRunningTime="2026-04-24 14:34:26.862906384 +0000 UTC m=+619.656963277" Apr 24 14:34:27.850094 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:27.850065 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" Apr 24 14:34:27.850515 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:27.850175 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:5000: connect: connection refused" Apr 24 14:34:27.851065 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:27.851043 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:34:28.853997 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:28.853948 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:5000: connect: connection refused" Apr 24 14:34:28.854388 ip-10-0-129-231 kubenswrapper[2569]: I0424 
14:34:28.854285 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:34:31.575413 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:31.575363 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 14:34:31.575831 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:31.575709 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:34:38.854854 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:38.854811 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:5000: connect: connection refused" Apr 24 14:34:38.855300 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:38.855277 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:34:41.575464 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:41.575412 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 14:34:41.575905 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:41.575560 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" Apr 24 14:34:41.575905 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:41.575807 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:34:41.576007 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:41.575988 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" Apr 24 14:34:48.854497 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:48.854444 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:5000: connect: connection refused" Apr 24 14:34:48.854984 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:48.854940 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" 
podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:34:51.560462 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.560437 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" Apr 24 14:34:51.699089 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.698991 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10621777-0fdb-41fd-8126-87bc46a928a9-kserve-provision-location\") pod \"10621777-0fdb-41fd-8126-87bc46a928a9\" (UID: \"10621777-0fdb-41fd-8126-87bc46a928a9\") " Apr 24 14:34:51.699349 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.699325 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10621777-0fdb-41fd-8126-87bc46a928a9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "10621777-0fdb-41fd-8126-87bc46a928a9" (UID: "10621777-0fdb-41fd-8126-87bc46a928a9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:34:51.800613 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.800579 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10621777-0fdb-41fd-8126-87bc46a928a9-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:34:51.926098 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.926060 2569 generic.go:358] "Generic (PLEG): container finished" podID="10621777-0fdb-41fd-8126-87bc46a928a9" containerID="1a9a04bb5b23a5dc43ff7b5bb798012fd3b38dc005302e75d75483971acfb8ff" exitCode=0 Apr 24 14:34:51.926266 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.926130 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" event={"ID":"10621777-0fdb-41fd-8126-87bc46a928a9","Type":"ContainerDied","Data":"1a9a04bb5b23a5dc43ff7b5bb798012fd3b38dc005302e75d75483971acfb8ff"} Apr 24 14:34:51.926266 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.926156 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" Apr 24 14:34:51.926266 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.926167 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp" event={"ID":"10621777-0fdb-41fd-8126-87bc46a928a9","Type":"ContainerDied","Data":"f6583253587d377500ec6b3ec9805c008d488df512ce9805cf11cd77403e4c32"} Apr 24 14:34:51.926266 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.926187 2569 scope.go:117] "RemoveContainer" containerID="1a9a04bb5b23a5dc43ff7b5bb798012fd3b38dc005302e75d75483971acfb8ff" Apr 24 14:34:51.934120 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.934100 2569 scope.go:117] "RemoveContainer" containerID="f6331c0cf21eab9c3ec0ca1809c71b877e8e8b684a4854698627f545905e100e" Apr 24 14:34:51.941771 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.941745 2569 scope.go:117] "RemoveContainer" containerID="a96725f44f3eaee73eaa01a49e119e346b7b88a7042022daf97d906cdf73e609" Apr 24 14:34:51.941922 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.941878 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp"] Apr 24 14:34:51.945409 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.945387 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-788dff5dc7-g7jcp"] Apr 24 14:34:51.950020 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.950003 2569 scope.go:117] "RemoveContainer" containerID="1a9a04bb5b23a5dc43ff7b5bb798012fd3b38dc005302e75d75483971acfb8ff" Apr 24 14:34:51.950307 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:34:51.950284 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a9a04bb5b23a5dc43ff7b5bb798012fd3b38dc005302e75d75483971acfb8ff\": container with ID starting with 1a9a04bb5b23a5dc43ff7b5bb798012fd3b38dc005302e75d75483971acfb8ff not found: ID does not exist" containerID="1a9a04bb5b23a5dc43ff7b5bb798012fd3b38dc005302e75d75483971acfb8ff" Apr 24 14:34:51.950394 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.950316 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a9a04bb5b23a5dc43ff7b5bb798012fd3b38dc005302e75d75483971acfb8ff"} err="failed to get container status \"1a9a04bb5b23a5dc43ff7b5bb798012fd3b38dc005302e75d75483971acfb8ff\": rpc error: code = NotFound desc = could not find container \"1a9a04bb5b23a5dc43ff7b5bb798012fd3b38dc005302e75d75483971acfb8ff\": container with ID starting with 1a9a04bb5b23a5dc43ff7b5bb798012fd3b38dc005302e75d75483971acfb8ff not found: ID does not exist" Apr 24 14:34:51.950394 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.950337 2569 scope.go:117] "RemoveContainer" containerID="f6331c0cf21eab9c3ec0ca1809c71b877e8e8b684a4854698627f545905e100e" Apr 24 14:34:51.950580 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:34:51.950564 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6331c0cf21eab9c3ec0ca1809c71b877e8e8b684a4854698627f545905e100e\": container with ID starting with f6331c0cf21eab9c3ec0ca1809c71b877e8e8b684a4854698627f545905e100e not found: ID does not exist" containerID="f6331c0cf21eab9c3ec0ca1809c71b877e8e8b684a4854698627f545905e100e" Apr 24 14:34:51.950623 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.950584 2569 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6331c0cf21eab9c3ec0ca1809c71b877e8e8b684a4854698627f545905e100e"} err="failed to get container status \"f6331c0cf21eab9c3ec0ca1809c71b877e8e8b684a4854698627f545905e100e\": rpc error: code = NotFound desc = could not find container \"f6331c0cf21eab9c3ec0ca1809c71b877e8e8b684a4854698627f545905e100e\": container with ID starting with f6331c0cf21eab9c3ec0ca1809c71b877e8e8b684a4854698627f545905e100e not found: ID does not exist" Apr 24 14:34:51.950623 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.950597 2569 scope.go:117] "RemoveContainer" containerID="a96725f44f3eaee73eaa01a49e119e346b7b88a7042022daf97d906cdf73e609" Apr 24 14:34:51.950774 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:34:51.950755 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a96725f44f3eaee73eaa01a49e119e346b7b88a7042022daf97d906cdf73e609\": container with ID starting with a96725f44f3eaee73eaa01a49e119e346b7b88a7042022daf97d906cdf73e609 not found: ID does not exist" containerID="a96725f44f3eaee73eaa01a49e119e346b7b88a7042022daf97d906cdf73e609" Apr 24 14:34:51.950819 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:51.950778 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a96725f44f3eaee73eaa01a49e119e346b7b88a7042022daf97d906cdf73e609"} err="failed to get container status \"a96725f44f3eaee73eaa01a49e119e346b7b88a7042022daf97d906cdf73e609\": rpc error: code = NotFound desc = could not find container \"a96725f44f3eaee73eaa01a49e119e346b7b88a7042022daf97d906cdf73e609\": container with ID starting with a96725f44f3eaee73eaa01a49e119e346b7b88a7042022daf97d906cdf73e609 not found: ID does not exist" Apr 24 14:34:53.831745 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:53.831711 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" path="/var/lib/kubelet/pods/10621777-0fdb-41fd-8126-87bc46a928a9/volumes" Apr 24 14:34:58.854696 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:58.854653 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:5000: connect: connection refused" Apr 24 14:34:58.855203 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:34:58.855177 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:35:08.854013 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:08.853956 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:5000: connect: connection refused" Apr 24 14:35:08.854470 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:08.854417 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="agent" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Apr 24 14:35:18.854531 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:18.854477 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:5000: connect: connection refused" Apr 24 14:35:18.855043 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:18.854860 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:35:28.854132 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:28.854075 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:5000: connect: connection refused" Apr 24 14:35:28.854691 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:28.854522 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:35:38.854854 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:38.854820 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" Apr 24 14:35:38.855477 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:38.854976 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" Apr 24 14:35:46.618439 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:46.618403 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl"] Apr 24 14:35:46.618927 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:46.618666 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="kserve-container" containerID="cri-o://aa51794ddae611099f335d94a0133734d3426a89e94a1b0d8736b77be4079cc1" gracePeriod=30 Apr 24 14:35:46.618927 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:46.618720 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="agent" containerID="cri-o://9f7ce3fdf1bfa872d0b22df7d4878100c7a4cf009e0d7c2773f7bb28c5f0b30c" gracePeriod=30 Apr 24 14:35:48.854084 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:48.854030 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:5000: connect: connection refused" Apr 24 14:35:48.854480 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:48.854339 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:35:51.105211 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:51.105179 2569 generic.go:358] "Generic (PLEG): container finished" podID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerID="aa51794ddae611099f335d94a0133734d3426a89e94a1b0d8736b77be4079cc1" exitCode=0 Apr 24 14:35:51.105585 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:51.105249 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" event={"ID":"c28a7590-d921-44ed-b9d4-2bb7388dfcae","Type":"ContainerDied","Data":"aa51794ddae611099f335d94a0133734d3426a89e94a1b0d8736b77be4079cc1"} Apr 24 14:35:56.682351 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:56.682275 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq"] Apr 24 14:35:56.682783 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:56.682611 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="storage-initializer" Apr 24 14:35:56.682783 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:56.682624 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="storage-initializer" Apr 24 14:35:56.682783 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:56.682642 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="agent" Apr 24 14:35:56.682783 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:56.682648 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="agent" Apr 24 14:35:56.682783 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:56.682663 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="kserve-container" Apr 24 14:35:56.682783 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:56.682669 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="kserve-container" Apr 24 14:35:56.682783 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:56.682715 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="kserve-container" Apr 24 14:35:56.682783 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:56.682722 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="10621777-0fdb-41fd-8126-87bc46a928a9" containerName="agent" Apr 24 14:35:56.686020 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:56.686003 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" Apr 24 14:35:56.693495 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:56.693472 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq"] Apr 24 14:35:56.745936 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:56.745871 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2312064e-52cb-42e7-9d5d-3b0cc0a99a16-kserve-provision-location\") pod \"isvc-logger-predictor-544f9f4cf8-rskqq\" (UID: \"2312064e-52cb-42e7-9d5d-3b0cc0a99a16\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" Apr 24 14:35:56.847260 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:56.847220 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2312064e-52cb-42e7-9d5d-3b0cc0a99a16-kserve-provision-location\") pod \"isvc-logger-predictor-544f9f4cf8-rskqq\" (UID: \"2312064e-52cb-42e7-9d5d-3b0cc0a99a16\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" Apr 24 14:35:56.847607 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:56.847588 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2312064e-52cb-42e7-9d5d-3b0cc0a99a16-kserve-provision-location\") pod \"isvc-logger-predictor-544f9f4cf8-rskqq\" (UID: \"2312064e-52cb-42e7-9d5d-3b0cc0a99a16\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" Apr 24 14:35:56.997543 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:56.997439 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq"
Apr 24 14:35:57.124943 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:57.124751 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq"]
Apr 24 14:35:57.127630 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:35:57.127603 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2312064e_52cb_42e7_9d5d_3b0cc0a99a16.slice/crio-fbe05ee21c7aa6f054a1818f9685d3d814134d6fe1272b97cda7b0bd4d9a4f02 WatchSource:0}: Error finding container fbe05ee21c7aa6f054a1818f9685d3d814134d6fe1272b97cda7b0bd4d9a4f02: Status 404 returned error can't find the container with id fbe05ee21c7aa6f054a1818f9685d3d814134d6fe1272b97cda7b0bd4d9a4f02
Apr 24 14:35:58.129244 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:58.129209 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" event={"ID":"2312064e-52cb-42e7-9d5d-3b0cc0a99a16","Type":"ContainerStarted","Data":"aeed430c5b807a927765d99b92a5f794b7e010b4ec3bc47a3d8d4cfaf985f44e"}
Apr 24 14:35:58.129244 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:58.129246 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" event={"ID":"2312064e-52cb-42e7-9d5d-3b0cc0a99a16","Type":"ContainerStarted","Data":"fbe05ee21c7aa6f054a1818f9685d3d814134d6fe1272b97cda7b0bd4d9a4f02"}
Apr 24 14:35:58.854526 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:58.854474 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:5000: connect: connection refused"
Apr 24 14:35:58.854864 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:35:58.854834 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:36:01.139500 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:01.139417 2569 generic.go:358] "Generic (PLEG): container finished" podID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerID="aeed430c5b807a927765d99b92a5f794b7e010b4ec3bc47a3d8d4cfaf985f44e" exitCode=0
Apr 24 14:36:01.139835 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:01.139490 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" event={"ID":"2312064e-52cb-42e7-9d5d-3b0cc0a99a16","Type":"ContainerDied","Data":"aeed430c5b807a927765d99b92a5f794b7e010b4ec3bc47a3d8d4cfaf985f44e"}
Apr 24 14:36:02.144776 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:02.144740 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" event={"ID":"2312064e-52cb-42e7-9d5d-3b0cc0a99a16","Type":"ContainerStarted","Data":"2ef325bf9c2c5951cc5257b8afdae8e493d2dc9fc0c3730315a4adb9d1cbb8b0"}
Apr 24 14:36:02.145172 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:02.144784 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" event={"ID":"2312064e-52cb-42e7-9d5d-3b0cc0a99a16","Type":"ContainerStarted","Data":"cb7172159326321022026e09972c98a86a4f81c8a9a825e7b1a208009dea5963"}
Apr 24 14:36:02.145172 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:02.145144 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq"
Apr 24 14:36:02.146402 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:02.146375 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 24 14:36:02.174987 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:02.174938 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podStartSLOduration=6.174922682 podStartE2EDuration="6.174922682s" podCreationTimestamp="2026-04-24 14:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:36:02.174093695 +0000 UTC m=+714.968150596" watchObservedRunningTime="2026-04-24 14:36:02.174922682 +0000 UTC m=+714.968979583"
Apr 24 14:36:03.147585 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:03.147559 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq"
Apr 24 14:36:03.148100 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:03.147673 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 24 14:36:03.148687 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:03.148640 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:36:04.150944 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:04.150875 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 24 14:36:04.151357 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:04.151329 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:36:08.854845 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:08.854769 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:5000: connect: connection refused"
Apr 24 14:36:08.855280 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:08.854969 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl"
Apr 24 14:36:08.855280 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:08.855039 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:36:08.855280 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:08.855151 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl"
Apr 24 14:36:14.151562 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:14.151512 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 24 14:36:14.152041 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:14.151985 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:36:16.762882 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:16.762855 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl"
Apr 24 14:36:16.918506 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:16.918470 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c28a7590-d921-44ed-b9d4-2bb7388dfcae-kserve-provision-location\") pod \"c28a7590-d921-44ed-b9d4-2bb7388dfcae\" (UID: \"c28a7590-d921-44ed-b9d4-2bb7388dfcae\") "
Apr 24 14:36:16.918777 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:16.918752 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c28a7590-d921-44ed-b9d4-2bb7388dfcae-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c28a7590-d921-44ed-b9d4-2bb7388dfcae" (UID: "c28a7590-d921-44ed-b9d4-2bb7388dfcae"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:36:17.019472 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:17.019436 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c28a7590-d921-44ed-b9d4-2bb7388dfcae-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 14:36:17.191403 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:17.191318 2569 generic.go:358] "Generic (PLEG): container finished" podID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerID="9f7ce3fdf1bfa872d0b22df7d4878100c7a4cf009e0d7c2773f7bb28c5f0b30c" exitCode=0
Apr 24 14:36:17.191542 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:17.191401 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" event={"ID":"c28a7590-d921-44ed-b9d4-2bb7388dfcae","Type":"ContainerDied","Data":"9f7ce3fdf1bfa872d0b22df7d4878100c7a4cf009e0d7c2773f7bb28c5f0b30c"}
Apr 24 14:36:17.191542 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:17.191423 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl"
Apr 24 14:36:17.191542 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:17.191439 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl" event={"ID":"c28a7590-d921-44ed-b9d4-2bb7388dfcae","Type":"ContainerDied","Data":"89475455be9e90b640a3bf99d5f090ca32805666663ba2cf495b2d431e99e8f3"}
Apr 24 14:36:17.191542 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:17.191455 2569 scope.go:117] "RemoveContainer" containerID="9f7ce3fdf1bfa872d0b22df7d4878100c7a4cf009e0d7c2773f7bb28c5f0b30c"
Apr 24 14:36:17.199590 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:17.199334 2569 scope.go:117] "RemoveContainer" containerID="aa51794ddae611099f335d94a0133734d3426a89e94a1b0d8736b77be4079cc1"
Apr 24 14:36:17.206606 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:17.206584 2569 scope.go:117] "RemoveContainer" containerID="09c9e5138b60cc38724d44681cdc7e91efa527104abf617e4e4afe7a303a00a7"
Apr 24 14:36:17.212768 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:17.212741 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl"]
Apr 24 14:36:17.214494 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:17.214461 2569 scope.go:117] "RemoveContainer" containerID="9f7ce3fdf1bfa872d0b22df7d4878100c7a4cf009e0d7c2773f7bb28c5f0b30c"
Apr 24 14:36:17.215228 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:36:17.215202 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f7ce3fdf1bfa872d0b22df7d4878100c7a4cf009e0d7c2773f7bb28c5f0b30c\": container with ID starting with 9f7ce3fdf1bfa872d0b22df7d4878100c7a4cf009e0d7c2773f7bb28c5f0b30c not found: ID does not exist" containerID="9f7ce3fdf1bfa872d0b22df7d4878100c7a4cf009e0d7c2773f7bb28c5f0b30c"
Apr 24 14:36:17.215399 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:17.215366 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f7ce3fdf1bfa872d0b22df7d4878100c7a4cf009e0d7c2773f7bb28c5f0b30c"} err="failed to get container status \"9f7ce3fdf1bfa872d0b22df7d4878100c7a4cf009e0d7c2773f7bb28c5f0b30c\": rpc error: code = NotFound desc = could not find container \"9f7ce3fdf1bfa872d0b22df7d4878100c7a4cf009e0d7c2773f7bb28c5f0b30c\": container with ID starting with 9f7ce3fdf1bfa872d0b22df7d4878100c7a4cf009e0d7c2773f7bb28c5f0b30c not found: ID does not exist"
Apr 24 14:36:17.215502 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:17.215402 2569 scope.go:117] "RemoveContainer" containerID="aa51794ddae611099f335d94a0133734d3426a89e94a1b0d8736b77be4079cc1"
Apr 24 14:36:17.215738 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:36:17.215719 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa51794ddae611099f335d94a0133734d3426a89e94a1b0d8736b77be4079cc1\": container with ID starting with aa51794ddae611099f335d94a0133734d3426a89e94a1b0d8736b77be4079cc1 not found: ID does not exist" containerID="aa51794ddae611099f335d94a0133734d3426a89e94a1b0d8736b77be4079cc1"
Apr 24 14:36:17.215798 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:17.215746 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa51794ddae611099f335d94a0133734d3426a89e94a1b0d8736b77be4079cc1"} err="failed to get container status \"aa51794ddae611099f335d94a0133734d3426a89e94a1b0d8736b77be4079cc1\": rpc error: code = NotFound desc = could not find container \"aa51794ddae611099f335d94a0133734d3426a89e94a1b0d8736b77be4079cc1\": container with ID starting with aa51794ddae611099f335d94a0133734d3426a89e94a1b0d8736b77be4079cc1 not found: ID does not exist"
Apr 24 14:36:17.215798 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:17.215768 2569 scope.go:117] "RemoveContainer" containerID="09c9e5138b60cc38724d44681cdc7e91efa527104abf617e4e4afe7a303a00a7"
Apr 24 14:36:17.216069 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:36:17.216047 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09c9e5138b60cc38724d44681cdc7e91efa527104abf617e4e4afe7a303a00a7\": container with ID starting with 09c9e5138b60cc38724d44681cdc7e91efa527104abf617e4e4afe7a303a00a7 not found: ID does not exist" containerID="09c9e5138b60cc38724d44681cdc7e91efa527104abf617e4e4afe7a303a00a7"
Apr 24 14:36:17.216165 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:17.216075 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c9e5138b60cc38724d44681cdc7e91efa527104abf617e4e4afe7a303a00a7"} err="failed to get container status \"09c9e5138b60cc38724d44681cdc7e91efa527104abf617e4e4afe7a303a00a7\": rpc error: code = NotFound desc = could not find container \"09c9e5138b60cc38724d44681cdc7e91efa527104abf617e4e4afe7a303a00a7\": container with ID starting with 09c9e5138b60cc38724d44681cdc7e91efa527104abf617e4e4afe7a303a00a7 not found: ID does not exist"
Apr 24 14:36:17.216787 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:17.216772 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-9757ddd9d-hjlzl"]
Apr 24 14:36:17.827601 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:17.827568 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" path="/var/lib/kubelet/pods/c28a7590-d921-44ed-b9d4-2bb7388dfcae/volumes"
Apr 24 14:36:24.151084 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:24.151037 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 24 14:36:24.151541 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:24.151454 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:36:34.151730 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:34.151686 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 24 14:36:34.152284 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:34.152260 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:36:44.151680 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:44.151637 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 24 14:36:44.152264 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:44.152095 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:36:54.151370 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:54.151313 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 24 14:36:54.151876 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:36:54.151758 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:37:04.151061 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:04.151009 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 24 14:37:04.151581 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:04.151542 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:37:14.152144 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:14.152089 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq"
Apr 24 14:37:14.152714 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:14.152554 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq"
Apr 24 14:37:21.910495 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:21.910462 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq"]
Apr 24 14:37:21.911027 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:21.910754 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="kserve-container" containerID="cri-o://cb7172159326321022026e09972c98a86a4f81c8a9a825e7b1a208009dea5963" gracePeriod=30
Apr 24 14:37:21.911027 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:21.910847 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="agent" containerID="cri-o://2ef325bf9c2c5951cc5257b8afdae8e493d2dc9fc0c3730315a4adb9d1cbb8b0" gracePeriod=30
Apr 24 14:37:21.919546 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:21.919515 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v"]
Apr 24 14:37:21.919858 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:21.919846 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="kserve-container"
Apr 24 14:37:21.919926 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:21.919861 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="kserve-container"
Apr 24 14:37:21.919926 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:21.919870 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="agent"
Apr 24 14:37:21.919926 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:21.919876 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="agent"
Apr 24 14:37:21.919926 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:21.919888 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="storage-initializer"
Apr 24 14:37:21.919926 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:21.919920 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="storage-initializer"
Apr 24 14:37:21.920078 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:21.919984 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="agent"
Apr 24 14:37:21.920078 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:21.919996 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c28a7590-d921-44ed-b9d4-2bb7388dfcae" containerName="kserve-container"
Apr 24 14:37:21.924195 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:21.924177 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v"
Apr 24 14:37:21.933008 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:21.932986 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v"]
Apr 24 14:37:22.073410 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:22.073369 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2a0e37d-83d1-4642-9063-27c3f18e9ed0-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-kr52v\" (UID: \"c2a0e37d-83d1-4642-9063-27c3f18e9ed0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v"
Apr 24 14:37:22.173844 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:22.173740 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2a0e37d-83d1-4642-9063-27c3f18e9ed0-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-kr52v\" (UID: \"c2a0e37d-83d1-4642-9063-27c3f18e9ed0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v"
Apr 24 14:37:22.174196 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:22.174174 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2a0e37d-83d1-4642-9063-27c3f18e9ed0-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-kr52v\" (UID: \"c2a0e37d-83d1-4642-9063-27c3f18e9ed0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v"
Apr 24 14:37:22.234861 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:22.234821 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v"
Apr 24 14:37:22.352538 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:22.352504 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v"]
Apr 24 14:37:22.359253 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:22.359233 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 14:37:22.386046 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:22.386016 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" event={"ID":"c2a0e37d-83d1-4642-9063-27c3f18e9ed0","Type":"ContainerStarted","Data":"4928f25c0ead25b5e0927c6c8142360b5df1567d2c7513a884f2f129f8133f0e"}
Apr 24 14:37:23.389969 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:23.389930 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" event={"ID":"c2a0e37d-83d1-4642-9063-27c3f18e9ed0","Type":"ContainerStarted","Data":"142b881cdf0d842bc9050b94de46cd07ca6df3ea97bab5f83c80acec23b89797"}
Apr 24 14:37:24.151910 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:24.151856 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 24 14:37:24.153491 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:24.153467 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:37:26.400700 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:26.400672 2569 generic.go:358] "Generic (PLEG): container finished" podID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerID="142b881cdf0d842bc9050b94de46cd07ca6df3ea97bab5f83c80acec23b89797" exitCode=0
Apr 24 14:37:26.401039 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:26.400747 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" event={"ID":"c2a0e37d-83d1-4642-9063-27c3f18e9ed0","Type":"ContainerDied","Data":"142b881cdf0d842bc9050b94de46cd07ca6df3ea97bab5f83c80acec23b89797"}
Apr 24 14:37:27.407126 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:27.407090 2569 generic.go:358] "Generic (PLEG): container finished" podID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerID="cb7172159326321022026e09972c98a86a4f81c8a9a825e7b1a208009dea5963" exitCode=0
Apr 24 14:37:27.407687 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:27.407164 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" event={"ID":"2312064e-52cb-42e7-9d5d-3b0cc0a99a16","Type":"ContainerDied","Data":"cb7172159326321022026e09972c98a86a4f81c8a9a825e7b1a208009dea5963"}
Apr 24 14:37:33.429989 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:33.429948 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" event={"ID":"c2a0e37d-83d1-4642-9063-27c3f18e9ed0","Type":"ContainerStarted","Data":"7da4eb856c80e95e2043027429691626ddde7fac30d294de32684642b4d04897"}
Apr 24 14:37:33.430423 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:33.430242 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v"
Apr 24 14:37:33.431498 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:33.431475 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" podUID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 14:37:33.445382 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:33.445329 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" podStartSLOduration=6.168861741 podStartE2EDuration="12.445309754s" podCreationTimestamp="2026-04-24 14:37:21 +0000 UTC" firstStartedPulling="2026-04-24 14:37:26.402019418 +0000 UTC m=+799.196076297" lastFinishedPulling="2026-04-24 14:37:32.678467427 +0000 UTC m=+805.472524310" observedRunningTime="2026-04-24 14:37:33.443726241 +0000 UTC m=+806.237783141" watchObservedRunningTime="2026-04-24 14:37:33.445309754 +0000 UTC m=+806.239366656"
Apr 24 14:37:34.151530 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:34.151482 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 24 14:37:34.153258 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:34.153223 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:37:34.433803 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:34.433768 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" podUID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 14:37:44.151777 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:44.151729 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 24 14:37:44.152203 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:44.151875 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq"
Apr 24 14:37:44.153332 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:44.153301 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:37:44.153432 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:44.153394 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq"
Apr 24 14:37:44.434282 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:44.434239 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" podUID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 14:37:52.094298 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.094275 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq"
Apr 24 14:37:52.229466 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.229376 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2312064e-52cb-42e7-9d5d-3b0cc0a99a16-kserve-provision-location\") pod \"2312064e-52cb-42e7-9d5d-3b0cc0a99a16\" (UID: \"2312064e-52cb-42e7-9d5d-3b0cc0a99a16\") "
Apr 24 14:37:52.229730 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.229706 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2312064e-52cb-42e7-9d5d-3b0cc0a99a16-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2312064e-52cb-42e7-9d5d-3b0cc0a99a16" (UID: "2312064e-52cb-42e7-9d5d-3b0cc0a99a16"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:37:52.330466 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.330426 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2312064e-52cb-42e7-9d5d-3b0cc0a99a16-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 14:37:52.486774 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.486683 2569 generic.go:358] "Generic (PLEG): container finished" podID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerID="2ef325bf9c2c5951cc5257b8afdae8e493d2dc9fc0c3730315a4adb9d1cbb8b0" exitCode=137
Apr 24 14:37:52.486774 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.486760 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq"
Apr 24 14:37:52.487006 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.486773 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" event={"ID":"2312064e-52cb-42e7-9d5d-3b0cc0a99a16","Type":"ContainerDied","Data":"2ef325bf9c2c5951cc5257b8afdae8e493d2dc9fc0c3730315a4adb9d1cbb8b0"}
Apr 24 14:37:52.487006 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.486821 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq" event={"ID":"2312064e-52cb-42e7-9d5d-3b0cc0a99a16","Type":"ContainerDied","Data":"fbe05ee21c7aa6f054a1818f9685d3d814134d6fe1272b97cda7b0bd4d9a4f02"}
Apr 24 14:37:52.487006 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.486841 2569 scope.go:117] "RemoveContainer" containerID="2ef325bf9c2c5951cc5257b8afdae8e493d2dc9fc0c3730315a4adb9d1cbb8b0"
Apr 24 14:37:52.494569 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.494554 2569 scope.go:117] "RemoveContainer" containerID="cb7172159326321022026e09972c98a86a4f81c8a9a825e7b1a208009dea5963"
Apr 24 14:37:52.501475 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.501459 2569 scope.go:117] "RemoveContainer" containerID="aeed430c5b807a927765d99b92a5f794b7e010b4ec3bc47a3d8d4cfaf985f44e"
Apr 24 14:37:52.508615 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.508592 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq"]
Apr 24 14:37:52.508968 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.508948 2569 scope.go:117] "RemoveContainer" containerID="2ef325bf9c2c5951cc5257b8afdae8e493d2dc9fc0c3730315a4adb9d1cbb8b0"
Apr 24 14:37:52.509252 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:37:52.509235 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ef325bf9c2c5951cc5257b8afdae8e493d2dc9fc0c3730315a4adb9d1cbb8b0\": container with ID starting with 2ef325bf9c2c5951cc5257b8afdae8e493d2dc9fc0c3730315a4adb9d1cbb8b0 not found: ID does not exist" containerID="2ef325bf9c2c5951cc5257b8afdae8e493d2dc9fc0c3730315a4adb9d1cbb8b0"
Apr 24 14:37:52.509309 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.509259 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef325bf9c2c5951cc5257b8afdae8e493d2dc9fc0c3730315a4adb9d1cbb8b0"} err="failed to get container status \"2ef325bf9c2c5951cc5257b8afdae8e493d2dc9fc0c3730315a4adb9d1cbb8b0\": rpc error: code = NotFound desc = could not find container \"2ef325bf9c2c5951cc5257b8afdae8e493d2dc9fc0c3730315a4adb9d1cbb8b0\": container with ID starting with 2ef325bf9c2c5951cc5257b8afdae8e493d2dc9fc0c3730315a4adb9d1cbb8b0 not found: ID does not exist"
Apr 24 14:37:52.509309 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.509276 2569 scope.go:117] "RemoveContainer" containerID="cb7172159326321022026e09972c98a86a4f81c8a9a825e7b1a208009dea5963"
Apr 24 14:37:52.509493 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:37:52.509479 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb7172159326321022026e09972c98a86a4f81c8a9a825e7b1a208009dea5963\": container with ID starting with cb7172159326321022026e09972c98a86a4f81c8a9a825e7b1a208009dea5963 not found: ID does not exist" containerID="cb7172159326321022026e09972c98a86a4f81c8a9a825e7b1a208009dea5963"
Apr 24 14:37:52.509556 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.509495 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7172159326321022026e09972c98a86a4f81c8a9a825e7b1a208009dea5963"} err="failed to get container status \"cb7172159326321022026e09972c98a86a4f81c8a9a825e7b1a208009dea5963\": rpc error: code = NotFound desc = could not find container \"cb7172159326321022026e09972c98a86a4f81c8a9a825e7b1a208009dea5963\": container with ID starting with cb7172159326321022026e09972c98a86a4f81c8a9a825e7b1a208009dea5963 not found: ID does not exist"
Apr 24 14:37:52.509556 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.509509 2569 scope.go:117] "RemoveContainer" containerID="aeed430c5b807a927765d99b92a5f794b7e010b4ec3bc47a3d8d4cfaf985f44e"
Apr 24 14:37:52.509695 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:37:52.509680 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeed430c5b807a927765d99b92a5f794b7e010b4ec3bc47a3d8d4cfaf985f44e\": container with ID starting with aeed430c5b807a927765d99b92a5f794b7e010b4ec3bc47a3d8d4cfaf985f44e not found: ID does not exist" containerID="aeed430c5b807a927765d99b92a5f794b7e010b4ec3bc47a3d8d4cfaf985f44e"
Apr 24 14:37:52.509735 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.509696 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeed430c5b807a927765d99b92a5f794b7e010b4ec3bc47a3d8d4cfaf985f44e"} err="failed to get container status \"aeed430c5b807a927765d99b92a5f794b7e010b4ec3bc47a3d8d4cfaf985f44e\": rpc error: code = NotFound desc = could not find container \"aeed430c5b807a927765d99b92a5f794b7e010b4ec3bc47a3d8d4cfaf985f44e\": container with ID starting with aeed430c5b807a927765d99b92a5f794b7e010b4ec3bc47a3d8d4cfaf985f44e not found: ID does not exist"
Apr 24 14:37:52.512176 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:52.512154 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-544f9f4cf8-rskqq"]
Apr 24 14:37:53.827316 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:53.827283 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" path="/var/lib/kubelet/pods/2312064e-52cb-42e7-9d5d-3b0cc0a99a16/volumes"
Apr 24 14:37:54.434118 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:37:54.434073 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" podUID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 14:38:04.434077 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:38:04.434026 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" podUID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 14:38:14.434717 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:38:14.434662 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" podUID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 14:38:24.434236 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:38:24.434191 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" podUID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 14:38:34.434176 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:38:34.434128 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" podUID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 14:38:44.433983 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:38:44.433934 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" podUID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 14:38:53.827038 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:38:53.827010 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v"
Apr 24 14:39:02.235181 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.235098 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v"]
Apr 24 14:39:02.235631 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.235361 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" podUID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerName="kserve-container" containerID="cri-o://7da4eb856c80e95e2043027429691626ddde7fac30d294de32684642b4d04897" gracePeriod=30
Apr 24 14:39:02.301347 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.301322 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w"]
Apr 24 14:39:02.301650 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.301639 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="storage-initializer"
Apr 24 14:39:02.301699 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.301652 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="storage-initializer"
Apr 24 14:39:02.301699 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.301661 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="agent"
Apr 24 14:39:02.301699 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.301667 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="agent"
Apr 24 14:39:02.301699 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.301676 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="kserve-container"
Apr 24 14:39:02.301699 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.301682 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="kserve-container"
Apr 24 14:39:02.301848 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.301729 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="agent"
Apr 24 14:39:02.301848 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.301739 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2312064e-52cb-42e7-9d5d-3b0cc0a99a16" containerName="kserve-container"
Apr 24 14:39:02.304681 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.304667 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w"
Apr 24 14:39:02.311576 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.311555 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w"]
Apr 24 14:39:02.407022 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.406966 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e2e0810-ef31-4af9-8c70-26bc333b4e62-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w\" (UID: \"8e2e0810-ef31-4af9-8c70-26bc333b4e62\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w"
Apr 24 14:39:02.508082 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.507981 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e2e0810-ef31-4af9-8c70-26bc333b4e62-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w\" (UID: \"8e2e0810-ef31-4af9-8c70-26bc333b4e62\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w"
Apr 24 14:39:02.508367 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.508347 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e2e0810-ef31-4af9-8c70-26bc333b4e62-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w\" (UID: \"8e2e0810-ef31-4af9-8c70-26bc333b4e62\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w"
Apr 24 14:39:02.617519 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.617470 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w"
Apr 24 14:39:02.734199 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:02.734164 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w"]
Apr 24 14:39:02.737479 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:39:02.737449 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e2e0810_ef31_4af9_8c70_26bc333b4e62.slice/crio-8f5e1f969ce7f674f3dd03ed0cb6b53bb6894cd53e1b383c78918ee92dc5ca01 WatchSource:0}: Error finding container 8f5e1f969ce7f674f3dd03ed0cb6b53bb6894cd53e1b383c78918ee92dc5ca01: Status 404 returned error can't find the container with id 8f5e1f969ce7f674f3dd03ed0cb6b53bb6894cd53e1b383c78918ee92dc5ca01
Apr 24 14:39:03.705132 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:03.705096 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w" event={"ID":"8e2e0810-ef31-4af9-8c70-26bc333b4e62","Type":"ContainerStarted","Data":"e9af85f066d3d6a576469cea7239dcc66540870b8ad9f33b76f57a06f27f440a"}
Apr 24 14:39:03.705132 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:03.705133 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w" event={"ID":"8e2e0810-ef31-4af9-8c70-26bc333b4e62","Type":"ContainerStarted","Data":"8f5e1f969ce7f674f3dd03ed0cb6b53bb6894cd53e1b383c78918ee92dc5ca01"}
Apr 24 14:39:03.824111 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:03.824063 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" podUID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 14:39:06.715678 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:06.715636 2569 generic.go:358] "Generic (PLEG): container finished" podID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" containerID="e9af85f066d3d6a576469cea7239dcc66540870b8ad9f33b76f57a06f27f440a" exitCode=0
Apr 24 14:39:06.716116 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:06.715707 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w" event={"ID":"8e2e0810-ef31-4af9-8c70-26bc333b4e62","Type":"ContainerDied","Data":"e9af85f066d3d6a576469cea7239dcc66540870b8ad9f33b76f57a06f27f440a"}
Apr 24 14:39:07.474796 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.474772 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v"
Apr 24 14:39:07.550593 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.550559 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2a0e37d-83d1-4642-9063-27c3f18e9ed0-kserve-provision-location\") pod \"c2a0e37d-83d1-4642-9063-27c3f18e9ed0\" (UID: \"c2a0e37d-83d1-4642-9063-27c3f18e9ed0\") "
Apr 24 14:39:07.550929 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.550883 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2a0e37d-83d1-4642-9063-27c3f18e9ed0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c2a0e37d-83d1-4642-9063-27c3f18e9ed0" (UID: "c2a0e37d-83d1-4642-9063-27c3f18e9ed0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:39:07.652084 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.652000 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2a0e37d-83d1-4642-9063-27c3f18e9ed0-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 14:39:07.712291 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.712262 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log"
Apr 24 14:39:07.716131 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.716108 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log"
Apr 24 14:39:07.719061 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.719021 2569 generic.go:358] "Generic (PLEG): container finished" podID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerID="7da4eb856c80e95e2043027429691626ddde7fac30d294de32684642b4d04897" exitCode=0
Apr 24 14:39:07.719061 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.719052 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" event={"ID":"c2a0e37d-83d1-4642-9063-27c3f18e9ed0","Type":"ContainerDied","Data":"7da4eb856c80e95e2043027429691626ddde7fac30d294de32684642b4d04897"}
Apr 24 14:39:07.719276 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.719097 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v"
Apr 24 14:39:07.719276 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.719110 2569 scope.go:117] "RemoveContainer" containerID="7da4eb856c80e95e2043027429691626ddde7fac30d294de32684642b4d04897"
Apr 24 14:39:07.719276 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.719095 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v" event={"ID":"c2a0e37d-83d1-4642-9063-27c3f18e9ed0","Type":"ContainerDied","Data":"4928f25c0ead25b5e0927c6c8142360b5df1567d2c7513a884f2f129f8133f0e"}
Apr 24 14:39:07.720886 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.720862 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w" event={"ID":"8e2e0810-ef31-4af9-8c70-26bc333b4e62","Type":"ContainerStarted","Data":"64853e2ae20c0157c9c7a2533c4f119bf16d23370e95eb317e2090941f160f12"}
Apr 24 14:39:07.721207 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.721184 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w"
Apr 24 14:39:07.722921 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.722880 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w" podUID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 24 14:39:07.728376 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.728360 2569 scope.go:117] "RemoveContainer" containerID="142b881cdf0d842bc9050b94de46cd07ca6df3ea97bab5f83c80acec23b89797"
Apr 24 14:39:07.736964 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.736876 2569 scope.go:117] "RemoveContainer" containerID="7da4eb856c80e95e2043027429691626ddde7fac30d294de32684642b4d04897"
Apr 24 14:39:07.740401 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.738881 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w" podStartSLOduration=5.738863827 podStartE2EDuration="5.738863827s" podCreationTimestamp="2026-04-24 14:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:39:07.736447657 +0000 UTC m=+900.530504559" watchObservedRunningTime="2026-04-24 14:39:07.738863827 +0000 UTC m=+900.532920729"
Apr 24 14:39:07.740806 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:39:07.740633 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7da4eb856c80e95e2043027429691626ddde7fac30d294de32684642b4d04897\": container with ID starting with 7da4eb856c80e95e2043027429691626ddde7fac30d294de32684642b4d04897 not found: ID does not exist" containerID="7da4eb856c80e95e2043027429691626ddde7fac30d294de32684642b4d04897"
Apr 24 14:39:07.740935 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.740817 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7da4eb856c80e95e2043027429691626ddde7fac30d294de32684642b4d04897"} err="failed to get container status \"7da4eb856c80e95e2043027429691626ddde7fac30d294de32684642b4d04897\": rpc error: code = NotFound desc = could not find container \"7da4eb856c80e95e2043027429691626ddde7fac30d294de32684642b4d04897\": container with ID starting with 7da4eb856c80e95e2043027429691626ddde7fac30d294de32684642b4d04897 not found: ID does not exist"
Apr 24 14:39:07.740935 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.740837 2569 scope.go:117] "RemoveContainer" containerID="142b881cdf0d842bc9050b94de46cd07ca6df3ea97bab5f83c80acec23b89797"
Apr 24 14:39:07.741109 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:39:07.741090 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"142b881cdf0d842bc9050b94de46cd07ca6df3ea97bab5f83c80acec23b89797\": container with ID starting with 142b881cdf0d842bc9050b94de46cd07ca6df3ea97bab5f83c80acec23b89797 not found: ID does not exist" containerID="142b881cdf0d842bc9050b94de46cd07ca6df3ea97bab5f83c80acec23b89797"
Apr 24 14:39:07.741163 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.741112 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"142b881cdf0d842bc9050b94de46cd07ca6df3ea97bab5f83c80acec23b89797"} err="failed to get container status \"142b881cdf0d842bc9050b94de46cd07ca6df3ea97bab5f83c80acec23b89797\": rpc error: code = NotFound desc = could not find container \"142b881cdf0d842bc9050b94de46cd07ca6df3ea97bab5f83c80acec23b89797\": container with ID starting with 142b881cdf0d842bc9050b94de46cd07ca6df3ea97bab5f83c80acec23b89797 not found: ID does not exist"
Apr 24 14:39:07.747916 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.747878 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v"]
Apr 24 14:39:07.751601 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.751579 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-kr52v"]
Apr 24 14:39:07.827003 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:07.826972 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" path="/var/lib/kubelet/pods/c2a0e37d-83d1-4642-9063-27c3f18e9ed0/volumes"
Apr 24 14:39:08.725213 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:08.725169 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w" podUID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 24 14:39:18.725837 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:18.725778 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w" podUID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 24 14:39:28.726013 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:28.725965 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w" podUID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 24 14:39:38.725228 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:38.725180 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w" podUID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 24 14:39:48.725570 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:48.725523 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w" podUID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 24 14:39:58.725817 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:39:58.725767 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w" podUID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 24 14:40:08.726063 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:08.726017 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w" podUID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 24 14:40:18.726002 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:18.725952 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w" podUID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 24 14:40:28.727217 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:28.727128 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w"
Apr 24 14:40:32.627540 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:32.627502 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w"]
Apr 24 14:40:32.627950 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:32.627843 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w" podUID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" containerName="kserve-container" containerID="cri-o://64853e2ae20c0157c9c7a2533c4f119bf16d23370e95eb317e2090941f160f12" gracePeriod=30
Apr 24 14:40:32.694402 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:32.694367 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz"]
Apr 24 14:40:32.694788 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:32.694771 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerName="kserve-container"
Apr 24 14:40:32.694873 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:32.694791 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerName="kserve-container"
Apr 24 14:40:32.694873 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:32.694810 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerName="storage-initializer"
Apr 24 14:40:32.694873 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:32.694818 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerName="storage-initializer"
Apr 24 14:40:32.695067 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:32.694910 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2a0e37d-83d1-4642-9063-27c3f18e9ed0" containerName="kserve-container"
Apr 24 14:40:32.698152 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:32.698129 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz"
Apr 24 14:40:32.708302 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:32.708273 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz"]
Apr 24 14:40:32.771098 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:32.771064 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/659cdb4e-9a64-4444-8b73-cc320e4e27d2-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz\" (UID: \"659cdb4e-9a64-4444-8b73-cc320e4e27d2\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz"
Apr 24 14:40:32.872553 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:32.872512 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/659cdb4e-9a64-4444-8b73-cc320e4e27d2-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz\" (UID: \"659cdb4e-9a64-4444-8b73-cc320e4e27d2\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz"
Apr 24 14:40:32.872973 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:32.872950 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/659cdb4e-9a64-4444-8b73-cc320e4e27d2-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz\" (UID: \"659cdb4e-9a64-4444-8b73-cc320e4e27d2\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz"
Apr 24 14:40:33.009299 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:33.009269 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz"
Apr 24 14:40:33.127820 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:33.127780 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz"]
Apr 24 14:40:33.130877 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:40:33.130851 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod659cdb4e_9a64_4444_8b73_cc320e4e27d2.slice/crio-0a3e3727f47f1bbe557903f9b876dc9cdbcc25676466fc6db899a92f8c2b44c1 WatchSource:0}: Error finding container 0a3e3727f47f1bbe557903f9b876dc9cdbcc25676466fc6db899a92f8c2b44c1: Status 404 returned error can't find the container with id 0a3e3727f47f1bbe557903f9b876dc9cdbcc25676466fc6db899a92f8c2b44c1
Apr 24 14:40:33.976880 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:33.976842 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz" event={"ID":"659cdb4e-9a64-4444-8b73-cc320e4e27d2","Type":"ContainerStarted","Data":"6e4b71c0c58a210858923c8bdc6e7f8ab2f3d44d0588c9fa30298f235dc842d1"}
Apr 24 14:40:33.976880 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:33.976884 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz" event={"ID":"659cdb4e-9a64-4444-8b73-cc320e4e27d2","Type":"ContainerStarted","Data":"0a3e3727f47f1bbe557903f9b876dc9cdbcc25676466fc6db899a92f8c2b44c1"}
Apr 24 14:40:36.987285 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:36.987246 2569 generic.go:358] "Generic (PLEG): container finished" podID="659cdb4e-9a64-4444-8b73-cc320e4e27d2" containerID="6e4b71c0c58a210858923c8bdc6e7f8ab2f3d44d0588c9fa30298f235dc842d1" exitCode=0
Apr 24 14:40:36.987677 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:36.987321 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz" event={"ID":"659cdb4e-9a64-4444-8b73-cc320e4e27d2","Type":"ContainerDied","Data":"6e4b71c0c58a210858923c8bdc6e7f8ab2f3d44d0588c9fa30298f235dc842d1"}
Apr 24 14:40:37.506726 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:37.506672 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w"
Apr 24 14:40:37.614004 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:37.613412 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e2e0810-ef31-4af9-8c70-26bc333b4e62-kserve-provision-location\") pod \"8e2e0810-ef31-4af9-8c70-26bc333b4e62\" (UID: \"8e2e0810-ef31-4af9-8c70-26bc333b4e62\") "
Apr 24 14:40:37.614436 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:37.614378 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e2e0810-ef31-4af9-8c70-26bc333b4e62-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8e2e0810-ef31-4af9-8c70-26bc333b4e62" (UID: "8e2e0810-ef31-4af9-8c70-26bc333b4e62"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:40:37.715023 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:37.714960 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e2e0810-ef31-4af9-8c70-26bc333b4e62-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 14:40:37.996950 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:37.996514 2569 generic.go:358] "Generic (PLEG): container finished" podID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" containerID="64853e2ae20c0157c9c7a2533c4f119bf16d23370e95eb317e2090941f160f12" exitCode=0
Apr 24 14:40:37.996950 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:37.996641 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w" event={"ID":"8e2e0810-ef31-4af9-8c70-26bc333b4e62","Type":"ContainerDied","Data":"64853e2ae20c0157c9c7a2533c4f119bf16d23370e95eb317e2090941f160f12"}
Apr 24 14:40:37.996950 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:37.996671 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w" event={"ID":"8e2e0810-ef31-4af9-8c70-26bc333b4e62","Type":"ContainerDied","Data":"8f5e1f969ce7f674f3dd03ed0cb6b53bb6894cd53e1b383c78918ee92dc5ca01"}
Apr 24 14:40:37.996950 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:37.996693 2569 scope.go:117] "RemoveContainer" containerID="64853e2ae20c0157c9c7a2533c4f119bf16d23370e95eb317e2090941f160f12"
Apr 24 14:40:37.996950 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:37.996854 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w"
Apr 24 14:40:38.013923 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:38.013879 2569 scope.go:117] "RemoveContainer" containerID="e9af85f066d3d6a576469cea7239dcc66540870b8ad9f33b76f57a06f27f440a"
Apr 24 14:40:38.017524 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:38.017493 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w"]
Apr 24 14:40:38.018980 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:38.018955 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-mmt4w"]
Apr 24 14:40:38.029696 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:38.029577 2569 scope.go:117] "RemoveContainer" containerID="64853e2ae20c0157c9c7a2533c4f119bf16d23370e95eb317e2090941f160f12"
Apr 24 14:40:38.030082 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:40:38.029934 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64853e2ae20c0157c9c7a2533c4f119bf16d23370e95eb317e2090941f160f12\": container with ID starting with 64853e2ae20c0157c9c7a2533c4f119bf16d23370e95eb317e2090941f160f12 not found: ID does not exist" containerID="64853e2ae20c0157c9c7a2533c4f119bf16d23370e95eb317e2090941f160f12"
Apr 24 14:40:38.030082 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:38.029967 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64853e2ae20c0157c9c7a2533c4f119bf16d23370e95eb317e2090941f160f12"} err="failed to get container status \"64853e2ae20c0157c9c7a2533c4f119bf16d23370e95eb317e2090941f160f12\": rpc error: code = NotFound desc = could not find container
\"64853e2ae20c0157c9c7a2533c4f119bf16d23370e95eb317e2090941f160f12\": container with ID starting with 64853e2ae20c0157c9c7a2533c4f119bf16d23370e95eb317e2090941f160f12 not found: ID does not exist" Apr 24 14:40:38.030082 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:38.029992 2569 scope.go:117] "RemoveContainer" containerID="e9af85f066d3d6a576469cea7239dcc66540870b8ad9f33b76f57a06f27f440a" Apr 24 14:40:38.030432 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:40:38.030356 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9af85f066d3d6a576469cea7239dcc66540870b8ad9f33b76f57a06f27f440a\": container with ID starting with e9af85f066d3d6a576469cea7239dcc66540870b8ad9f33b76f57a06f27f440a not found: ID does not exist" containerID="e9af85f066d3d6a576469cea7239dcc66540870b8ad9f33b76f57a06f27f440a" Apr 24 14:40:38.030432 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:38.030385 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9af85f066d3d6a576469cea7239dcc66540870b8ad9f33b76f57a06f27f440a"} err="failed to get container status \"e9af85f066d3d6a576469cea7239dcc66540870b8ad9f33b76f57a06f27f440a\": rpc error: code = NotFound desc = could not find container \"e9af85f066d3d6a576469cea7239dcc66540870b8ad9f33b76f57a06f27f440a\": container with ID starting with e9af85f066d3d6a576469cea7239dcc66540870b8ad9f33b76f57a06f27f440a not found: ID does not exist" Apr 24 14:40:39.828072 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:40:39.828034 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" path="/var/lib/kubelet/pods/8e2e0810-ef31-4af9-8c70-26bc333b4e62/volumes" Apr 24 14:43:00.466869 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:00.466826 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz" event={"ID":"659cdb4e-9a64-4444-8b73-cc320e4e27d2","Type":"ContainerStarted","Data":"6d4fe60e31055e52757e8afbb5e7ea8da865e07836fc50d149ce1fc3dd6e6d41"} Apr 24 14:43:00.467399 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:00.467043 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz" Apr 24 14:43:00.490004 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:00.489953 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz" podStartSLOduration=5.901626412 podStartE2EDuration="2m28.489938684s" podCreationTimestamp="2026-04-24 14:40:32 +0000 UTC" firstStartedPulling="2026-04-24 14:40:36.988326568 +0000 UTC m=+989.782383447" lastFinishedPulling="2026-04-24 14:42:59.576638827 +0000 UTC m=+1132.370695719" observedRunningTime="2026-04-24 14:43:00.487364763 +0000 UTC m=+1133.281421689" watchObservedRunningTime="2026-04-24 14:43:00.489938684 +0000 UTC m=+1133.283995584" Apr 24 14:43:31.475964 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:31.475875 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz" Apr 24 14:43:32.882103 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:32.882069 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz"] Apr 24 14:43:32.882489 ip-10-0-129-231 kubenswrapper[2569]: I0424 
14:43:32.882299 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz" podUID="659cdb4e-9a64-4444-8b73-cc320e4e27d2" containerName="kserve-container" containerID="cri-o://6d4fe60e31055e52757e8afbb5e7ea8da865e07836fc50d149ce1fc3dd6e6d41" gracePeriod=30 Apr 24 14:43:32.954676 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:32.954638 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5"] Apr 24 14:43:32.954979 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:32.954966 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" containerName="storage-initializer" Apr 24 14:43:32.955039 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:32.954980 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" containerName="storage-initializer" Apr 24 14:43:32.955039 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:32.954998 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" containerName="kserve-container" Apr 24 14:43:32.955039 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:32.955004 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" containerName="kserve-container" Apr 24 14:43:32.955165 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:32.955056 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e2e0810-ef31-4af9-8c70-26bc333b4e62" containerName="kserve-container" Apr 24 14:43:32.969880 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:32.969851 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" Apr 24 14:43:32.972404 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:32.972376 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5"] Apr 24 14:43:33.107641 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:33.107602 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3d0e467-11d4-48c8-b11a-6a283ed87c62-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5\" (UID: \"b3d0e467-11d4-48c8-b11a-6a283ed87c62\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" Apr 24 14:43:33.209096 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:33.209058 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3d0e467-11d4-48c8-b11a-6a283ed87c62-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5\" (UID: \"b3d0e467-11d4-48c8-b11a-6a283ed87c62\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" Apr 24 14:43:33.209491 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:33.209465 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3d0e467-11d4-48c8-b11a-6a283ed87c62-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5\" (UID: \"b3d0e467-11d4-48c8-b11a-6a283ed87c62\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" Apr 24 14:43:33.281874 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:33.281831 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" Apr 24 14:43:33.406336 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:33.406310 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5"] Apr 24 14:43:33.409118 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:43:33.409082 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3d0e467_11d4_48c8_b11a_6a283ed87c62.slice/crio-c65d8ab53dfc2ed70b4b71d2bd08f394d8568ff88ac0e77d0524a0dff5bb76db WatchSource:0}: Error finding container c65d8ab53dfc2ed70b4b71d2bd08f394d8568ff88ac0e77d0524a0dff5bb76db: Status 404 returned error can't find the container with id c65d8ab53dfc2ed70b4b71d2bd08f394d8568ff88ac0e77d0524a0dff5bb76db Apr 24 14:43:33.411488 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:33.411472 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:43:33.563332 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:33.563283 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" event={"ID":"b3d0e467-11d4-48c8-b11a-6a283ed87c62","Type":"ContainerStarted","Data":"e229ff9412c60f0beb3b153bcc79624537dd7a8e747df2db9f4fd0cf1f59f0e8"} Apr 24 14:43:33.563332 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:33.563325 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" event={"ID":"b3d0e467-11d4-48c8-b11a-6a283ed87c62","Type":"ContainerStarted","Data":"c65d8ab53dfc2ed70b4b71d2bd08f394d8568ff88ac0e77d0524a0dff5bb76db"} Apr 24 14:43:34.046055 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:34.046028 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz" Apr 24 14:43:34.117169 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:34.117082 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/659cdb4e-9a64-4444-8b73-cc320e4e27d2-kserve-provision-location\") pod \"659cdb4e-9a64-4444-8b73-cc320e4e27d2\" (UID: \"659cdb4e-9a64-4444-8b73-cc320e4e27d2\") " Apr 24 14:43:34.117382 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:34.117360 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/659cdb4e-9a64-4444-8b73-cc320e4e27d2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "659cdb4e-9a64-4444-8b73-cc320e4e27d2" (UID: "659cdb4e-9a64-4444-8b73-cc320e4e27d2"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:43:34.217674 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:34.217638 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/659cdb4e-9a64-4444-8b73-cc320e4e27d2-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:43:34.567987 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:34.567948 2569 generic.go:358] "Generic (PLEG): container finished" podID="659cdb4e-9a64-4444-8b73-cc320e4e27d2" containerID="6d4fe60e31055e52757e8afbb5e7ea8da865e07836fc50d149ce1fc3dd6e6d41" exitCode=0 Apr 24 14:43:34.568184 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:34.567992 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz" event={"ID":"659cdb4e-9a64-4444-8b73-cc320e4e27d2","Type":"ContainerDied","Data":"6d4fe60e31055e52757e8afbb5e7ea8da865e07836fc50d149ce1fc3dd6e6d41"} Apr 24 14:43:34.568184 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:34.568014 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz" Apr 24 14:43:34.568184 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:34.568033 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz" event={"ID":"659cdb4e-9a64-4444-8b73-cc320e4e27d2","Type":"ContainerDied","Data":"0a3e3727f47f1bbe557903f9b876dc9cdbcc25676466fc6db899a92f8c2b44c1"} Apr 24 14:43:34.568184 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:34.568054 2569 scope.go:117] "RemoveContainer" containerID="6d4fe60e31055e52757e8afbb5e7ea8da865e07836fc50d149ce1fc3dd6e6d41" Apr 24 14:43:34.576574 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:34.576557 2569 scope.go:117] "RemoveContainer" containerID="6e4b71c0c58a210858923c8bdc6e7f8ab2f3d44d0588c9fa30298f235dc842d1" Apr 24 14:43:34.583629 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:34.583612 2569 scope.go:117] "RemoveContainer" containerID="6d4fe60e31055e52757e8afbb5e7ea8da865e07836fc50d149ce1fc3dd6e6d41" Apr 24 14:43:34.583905 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:43:34.583877 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d4fe60e31055e52757e8afbb5e7ea8da865e07836fc50d149ce1fc3dd6e6d41\": container with ID starting with 6d4fe60e31055e52757e8afbb5e7ea8da865e07836fc50d149ce1fc3dd6e6d41 not found: ID does not exist" containerID="6d4fe60e31055e52757e8afbb5e7ea8da865e07836fc50d149ce1fc3dd6e6d41" Apr 24 14:43:34.583968 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:34.583917 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4fe60e31055e52757e8afbb5e7ea8da865e07836fc50d149ce1fc3dd6e6d41"} err="failed to get container status \"6d4fe60e31055e52757e8afbb5e7ea8da865e07836fc50d149ce1fc3dd6e6d41\": rpc error: code = NotFound desc = could not find container \"6d4fe60e31055e52757e8afbb5e7ea8da865e07836fc50d149ce1fc3dd6e6d41\": container with ID starting with 6d4fe60e31055e52757e8afbb5e7ea8da865e07836fc50d149ce1fc3dd6e6d41 not found: ID does not exist" Apr 24 14:43:34.583968 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:34.583935 2569 scope.go:117] "RemoveContainer" containerID="6e4b71c0c58a210858923c8bdc6e7f8ab2f3d44d0588c9fa30298f235dc842d1" Apr 24 14:43:34.584150 
ip-10-0-129-231 kubenswrapper[2569]: E0424 14:43:34.584135 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e4b71c0c58a210858923c8bdc6e7f8ab2f3d44d0588c9fa30298f235dc842d1\": container with ID starting with 6e4b71c0c58a210858923c8bdc6e7f8ab2f3d44d0588c9fa30298f235dc842d1 not found: ID does not exist" containerID="6e4b71c0c58a210858923c8bdc6e7f8ab2f3d44d0588c9fa30298f235dc842d1" Apr 24 14:43:34.584194 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:34.584155 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e4b71c0c58a210858923c8bdc6e7f8ab2f3d44d0588c9fa30298f235dc842d1"} err="failed to get container status \"6e4b71c0c58a210858923c8bdc6e7f8ab2f3d44d0588c9fa30298f235dc842d1\": rpc error: code = NotFound desc = could not find container \"6e4b71c0c58a210858923c8bdc6e7f8ab2f3d44d0588c9fa30298f235dc842d1\": container with ID starting with 6e4b71c0c58a210858923c8bdc6e7f8ab2f3d44d0588c9fa30298f235dc842d1 not found: ID does not exist" Apr 24 14:43:34.594448 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:34.594419 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz"] Apr 24 14:43:34.596329 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:34.596305 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-xhnmz"] Apr 24 14:43:35.828160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:35.828132 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="659cdb4e-9a64-4444-8b73-cc320e4e27d2" path="/var/lib/kubelet/pods/659cdb4e-9a64-4444-8b73-cc320e4e27d2/volumes" Apr 24 14:43:37.579056 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:37.579022 2569 generic.go:358] "Generic (PLEG): container finished" podID="b3d0e467-11d4-48c8-b11a-6a283ed87c62" containerID="e229ff9412c60f0beb3b153bcc79624537dd7a8e747df2db9f4fd0cf1f59f0e8" exitCode=0 Apr 24 14:43:37.579557 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:37.579111 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" event={"ID":"b3d0e467-11d4-48c8-b11a-6a283ed87c62","Type":"ContainerDied","Data":"e229ff9412c60f0beb3b153bcc79624537dd7a8e747df2db9f4fd0cf1f59f0e8"} Apr 24 14:43:38.584160 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:38.584124 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" event={"ID":"b3d0e467-11d4-48c8-b11a-6a283ed87c62","Type":"ContainerStarted","Data":"3ae69b4d6431e63f6f89524d208299e78a8a6709ee8fa50f5f58fc501ce4e4b4"} Apr 24 14:43:38.584614 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:38.584466 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" Apr 24 14:43:38.585788 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:38.585752 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" podUID="b3d0e467-11d4-48c8-b11a-6a283ed87c62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 14:43:38.599374 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:38.599316 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" podStartSLOduration=6.599295034 podStartE2EDuration="6.599295034s" podCreationTimestamp="2026-04-24 14:43:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:43:38.597490913 +0000 UTC m=+1171.391547815" watchObservedRunningTime="2026-04-24 14:43:38.599295034 +0000 UTC m=+1171.393351939" Apr 24 14:43:39.587721 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:39.587683 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" podUID="b3d0e467-11d4-48c8-b11a-6a283ed87c62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 14:43:49.589792 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:49.589757 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" Apr 24 14:43:53.017249 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.017218 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5"] Apr 24 14:43:53.017645 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.017450 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" podUID="b3d0e467-11d4-48c8-b11a-6a283ed87c62" containerName="kserve-container" containerID="cri-o://3ae69b4d6431e63f6f89524d208299e78a8a6709ee8fa50f5f58fc501ce4e4b4" gracePeriod=30 Apr 24 14:43:53.059592 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.059559 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px"] Apr 24 14:43:53.059904 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.059879 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="659cdb4e-9a64-4444-8b73-cc320e4e27d2" containerName="storage-initializer" Apr 24 14:43:53.059959 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.059911 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="659cdb4e-9a64-4444-8b73-cc320e4e27d2" containerName="storage-initializer" Apr 24 14:43:53.059959 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.059924 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="659cdb4e-9a64-4444-8b73-cc320e4e27d2" containerName="kserve-container" Apr 24 14:43:53.059959 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.059929 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="659cdb4e-9a64-4444-8b73-cc320e4e27d2" containerName="kserve-container" Apr 24 14:43:53.060068 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.059980 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="659cdb4e-9a64-4444-8b73-cc320e4e27d2" containerName="kserve-container" Apr 24 14:43:53.063712 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.063687 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px" Apr 24 14:43:53.071582 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.071556 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px"] Apr 24 14:43:53.181589 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.181548 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/781db27e-0c1c-4441-af98-f3dfb7854834-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px\" (UID: \"781db27e-0c1c-4441-af98-f3dfb7854834\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px" Apr 24 14:43:53.283200 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.283104 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/781db27e-0c1c-4441-af98-f3dfb7854834-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px\" (UID: \"781db27e-0c1c-4441-af98-f3dfb7854834\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px" Apr 24 14:43:53.283531 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.283508 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/781db27e-0c1c-4441-af98-f3dfb7854834-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px\" (UID: \"781db27e-0c1c-4441-af98-f3dfb7854834\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px" Apr 24 14:43:53.375087 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.375026 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px" Apr 24 14:43:53.510750 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.510724 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px"] Apr 24 14:43:53.513481 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:43:53.513443 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod781db27e_0c1c_4441_af98_f3dfb7854834.slice/crio-b4623c970803c371c8ba72e55d616492615df8ffd3a17c93678ff1e7b5689260 WatchSource:0}: Error finding container b4623c970803c371c8ba72e55d616492615df8ffd3a17c93678ff1e7b5689260: Status 404 returned error can't find the container with id b4623c970803c371c8ba72e55d616492615df8ffd3a17c93678ff1e7b5689260 Apr 24 14:43:53.630075 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.630043 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px" event={"ID":"781db27e-0c1c-4441-af98-f3dfb7854834","Type":"ContainerStarted","Data":"b4623c970803c371c8ba72e55d616492615df8ffd3a17c93678ff1e7b5689260"} Apr 24 14:43:53.755704 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.755676 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" Apr 24 14:43:53.888769 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.888678 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3d0e467-11d4-48c8-b11a-6a283ed87c62-kserve-provision-location\") pod \"b3d0e467-11d4-48c8-b11a-6a283ed87c62\" (UID: \"b3d0e467-11d4-48c8-b11a-6a283ed87c62\") " Apr 24 14:43:53.889053 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.889029 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d0e467-11d4-48c8-b11a-6a283ed87c62-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b3d0e467-11d4-48c8-b11a-6a283ed87c62" (UID: "b3d0e467-11d4-48c8-b11a-6a283ed87c62"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:43:53.989302 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:53.989269 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3d0e467-11d4-48c8-b11a-6a283ed87c62-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:43:54.634768 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:54.634730 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px" event={"ID":"781db27e-0c1c-4441-af98-f3dfb7854834","Type":"ContainerStarted","Data":"525230515f24ba29239b6bece5177fb79c4659e20dbd0bd29dbed2a9993a8d55"} Apr 24 14:43:54.636261 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:54.636232 2569 generic.go:358] "Generic (PLEG): container finished" podID="b3d0e467-11d4-48c8-b11a-6a283ed87c62" containerID="3ae69b4d6431e63f6f89524d208299e78a8a6709ee8fa50f5f58fc501ce4e4b4" exitCode=0 Apr 24 14:43:54.636379 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:54.636288 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" Apr 24 14:43:54.636379 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:54.636316 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" event={"ID":"b3d0e467-11d4-48c8-b11a-6a283ed87c62","Type":"ContainerDied","Data":"3ae69b4d6431e63f6f89524d208299e78a8a6709ee8fa50f5f58fc501ce4e4b4"} Apr 24 14:43:54.636379 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:54.636350 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5" event={"ID":"b3d0e467-11d4-48c8-b11a-6a283ed87c62","Type":"ContainerDied","Data":"c65d8ab53dfc2ed70b4b71d2bd08f394d8568ff88ac0e77d0524a0dff5bb76db"} Apr 24 14:43:54.636379 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:54.636367 2569 scope.go:117] "RemoveContainer" containerID="3ae69b4d6431e63f6f89524d208299e78a8a6709ee8fa50f5f58fc501ce4e4b4" Apr 24 14:43:54.645524 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:54.645505 2569 scope.go:117] "RemoveContainer" containerID="e229ff9412c60f0beb3b153bcc79624537dd7a8e747df2db9f4fd0cf1f59f0e8" Apr 24 14:43:54.653295 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:54.653277 2569 scope.go:117] "RemoveContainer" containerID="3ae69b4d6431e63f6f89524d208299e78a8a6709ee8fa50f5f58fc501ce4e4b4" Apr 24 14:43:54.653608 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:43:54.653588 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ae69b4d6431e63f6f89524d208299e78a8a6709ee8fa50f5f58fc501ce4e4b4\": container with ID starting with 3ae69b4d6431e63f6f89524d208299e78a8a6709ee8fa50f5f58fc501ce4e4b4 not found: ID does not exist" containerID="3ae69b4d6431e63f6f89524d208299e78a8a6709ee8fa50f5f58fc501ce4e4b4" Apr 24 14:43:54.653694 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:54.653620 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae69b4d6431e63f6f89524d208299e78a8a6709ee8fa50f5f58fc501ce4e4b4"} err="failed to get container status \"3ae69b4d6431e63f6f89524d208299e78a8a6709ee8fa50f5f58fc501ce4e4b4\": rpc error: code = NotFound desc = could not find container \"3ae69b4d6431e63f6f89524d208299e78a8a6709ee8fa50f5f58fc501ce4e4b4\": container with ID starting with 3ae69b4d6431e63f6f89524d208299e78a8a6709ee8fa50f5f58fc501ce4e4b4 not found: ID does not exist" Apr 24 14:43:54.653694 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:54.653646 2569 scope.go:117] "RemoveContainer" containerID="e229ff9412c60f0beb3b153bcc79624537dd7a8e747df2db9f4fd0cf1f59f0e8" Apr 24 14:43:54.653977 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:43:54.653954 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e229ff9412c60f0beb3b153bcc79624537dd7a8e747df2db9f4fd0cf1f59f0e8\": container with ID starting with e229ff9412c60f0beb3b153bcc79624537dd7a8e747df2db9f4fd0cf1f59f0e8 not found: ID does not exist" containerID="e229ff9412c60f0beb3b153bcc79624537dd7a8e747df2db9f4fd0cf1f59f0e8" Apr 24 14:43:54.654031 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:54.653984 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e229ff9412c60f0beb3b153bcc79624537dd7a8e747df2db9f4fd0cf1f59f0e8"} err="failed to get container status \"e229ff9412c60f0beb3b153bcc79624537dd7a8e747df2db9f4fd0cf1f59f0e8\": 
rpc error: code = NotFound desc = could not find container \"e229ff9412c60f0beb3b153bcc79624537dd7a8e747df2db9f4fd0cf1f59f0e8\": container with ID starting with e229ff9412c60f0beb3b153bcc79624537dd7a8e747df2db9f4fd0cf1f59f0e8 not found: ID does not exist" Apr 24 14:43:54.663234 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:54.663209 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5"] Apr 24 14:43:54.666203 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:54.666181 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-5mqk5"] Apr 24 14:43:55.827829 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:55.827793 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d0e467-11d4-48c8-b11a-6a283ed87c62" path="/var/lib/kubelet/pods/b3d0e467-11d4-48c8-b11a-6a283ed87c62/volumes" Apr 24 14:43:58.651658 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:58.651615 2569 generic.go:358] "Generic (PLEG): container finished" podID="781db27e-0c1c-4441-af98-f3dfb7854834" containerID="525230515f24ba29239b6bece5177fb79c4659e20dbd0bd29dbed2a9993a8d55" exitCode=0 Apr 24 14:43:58.652073 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:58.651691 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px" event={"ID":"781db27e-0c1c-4441-af98-f3dfb7854834","Type":"ContainerDied","Data":"525230515f24ba29239b6bece5177fb79c4659e20dbd0bd29dbed2a9993a8d55"} Apr 24 14:43:59.656834 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:59.656800 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px" event={"ID":"781db27e-0c1c-4441-af98-f3dfb7854834","Type":"ContainerStarted","Data":"24b11bdaf6a25d75360a69dcef3f45fd1752a72ec9ca7406adf9f8fbd491796b"} Apr 24 14:43:59.657237 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:59.657044 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px" Apr 24 14:43:59.672788 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:43:59.672743 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px" podStartSLOduration=6.672731089 podStartE2EDuration="6.672731089s" podCreationTimestamp="2026-04-24 14:43:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:43:59.671844602 +0000 UTC m=+1192.465901503" watchObservedRunningTime="2026-04-24 14:43:59.672731089 +0000 UTC m=+1192.466787990" Apr 24 14:44:07.733706 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:07.733677 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 14:44:07.739550 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:07.739527 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 14:44:30.664906 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:30.664870 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px" Apr 24 
14:44:33.166336 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:33.166300 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px"] Apr 24 14:44:33.166723 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:33.166583 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px" podUID="781db27e-0c1c-4441-af98-f3dfb7854834" containerName="kserve-container" containerID="cri-o://24b11bdaf6a25d75360a69dcef3f45fd1752a72ec9ca7406adf9f8fbd491796b" gracePeriod=30 Apr 24 14:44:33.216675 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:33.216633 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57"] Apr 24 14:44:33.217066 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:33.217048 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3d0e467-11d4-48c8-b11a-6a283ed87c62" containerName="kserve-container" Apr 24 14:44:33.217171 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:33.217068 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d0e467-11d4-48c8-b11a-6a283ed87c62" containerName="kserve-container" Apr 24 14:44:33.217171 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:33.217089 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3d0e467-11d4-48c8-b11a-6a283ed87c62" containerName="storage-initializer" Apr 24 14:44:33.217171 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:33.217097 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d0e467-11d4-48c8-b11a-6a283ed87c62" containerName="storage-initializer" Apr 24 14:44:33.217330 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:33.217177 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3d0e467-11d4-48c8-b11a-6a283ed87c62" containerName="kserve-container" Apr 24 14:44:33.220370 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:33.220346 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" Apr 24 14:44:33.230393 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:33.230363 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57"] Apr 24 14:44:33.311669 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:33.311623 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc0bcdec-8377-4462-9910-6378e0563184-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-7c84c59f84-2jx57\" (UID: \"cc0bcdec-8377-4462-9910-6378e0563184\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" Apr 24 14:44:33.413093 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:33.413050 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc0bcdec-8377-4462-9910-6378e0563184-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-7c84c59f84-2jx57\" (UID: \"cc0bcdec-8377-4462-9910-6378e0563184\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" Apr 24 14:44:33.413481 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:33.413459 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc0bcdec-8377-4462-9910-6378e0563184-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-7c84c59f84-2jx57\" (UID: \"cc0bcdec-8377-4462-9910-6378e0563184\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" Apr 24 14:44:33.532689 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:33.532591 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" Apr 24 14:44:33.661216 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:33.661182 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57"] Apr 24 14:44:33.664479 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:44:33.664440 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc0bcdec_8377_4462_9910_6378e0563184.slice/crio-987ec72cfb98939c5990b3c2ebd118ae2832cef2fb8dd5d73d8ea5088669c2da WatchSource:0}: Error finding container 987ec72cfb98939c5990b3c2ebd118ae2832cef2fb8dd5d73d8ea5088669c2da: Status 404 returned error can't find the container with id 987ec72cfb98939c5990b3c2ebd118ae2832cef2fb8dd5d73d8ea5088669c2da Apr 24 14:44:33.763057 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:33.763021 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" event={"ID":"cc0bcdec-8377-4462-9910-6378e0563184","Type":"ContainerStarted","Data":"86253a4fafc4bed5ac3246f9abe7bcff3ff8a11818420d20255b42b271948990"} Apr 24 14:44:33.763057 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:33.763060 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" event={"ID":"cc0bcdec-8377-4462-9910-6378e0563184","Type":"ContainerStarted","Data":"987ec72cfb98939c5990b3c2ebd118ae2832cef2fb8dd5d73d8ea5088669c2da"} Apr 24 14:44:34.507633 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:34.507608 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px" Apr 24 14:44:34.627699 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:34.627618 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/781db27e-0c1c-4441-af98-f3dfb7854834-kserve-provision-location\") pod \"781db27e-0c1c-4441-af98-f3dfb7854834\" (UID: \"781db27e-0c1c-4441-af98-f3dfb7854834\") " Apr 24 14:44:34.628016 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:34.627991 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/781db27e-0c1c-4441-af98-f3dfb7854834-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "781db27e-0c1c-4441-af98-f3dfb7854834" (UID: "781db27e-0c1c-4441-af98-f3dfb7854834"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:44:34.728924 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:34.728856 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/781db27e-0c1c-4441-af98-f3dfb7854834-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:44:34.767016 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:34.766981 2569 generic.go:358] "Generic (PLEG): container finished" podID="781db27e-0c1c-4441-af98-f3dfb7854834" containerID="24b11bdaf6a25d75360a69dcef3f45fd1752a72ec9ca7406adf9f8fbd491796b" exitCode=0 Apr 24 14:44:34.767182 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:34.767048 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px" Apr 24 14:44:34.767182 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:34.767068 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px" event={"ID":"781db27e-0c1c-4441-af98-f3dfb7854834","Type":"ContainerDied","Data":"24b11bdaf6a25d75360a69dcef3f45fd1752a72ec9ca7406adf9f8fbd491796b"} Apr 24 14:44:34.767182 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:34.767118 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px" event={"ID":"781db27e-0c1c-4441-af98-f3dfb7854834","Type":"ContainerDied","Data":"b4623c970803c371c8ba72e55d616492615df8ffd3a17c93678ff1e7b5689260"} Apr 24 14:44:34.767182 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:34.767134 2569 scope.go:117] "RemoveContainer" containerID="24b11bdaf6a25d75360a69dcef3f45fd1752a72ec9ca7406adf9f8fbd491796b" Apr 24 14:44:34.775172 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:34.775153 2569 scope.go:117] "RemoveContainer" containerID="525230515f24ba29239b6bece5177fb79c4659e20dbd0bd29dbed2a9993a8d55" Apr 24 14:44:34.782494 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:34.782475 2569 scope.go:117] "RemoveContainer" containerID="24b11bdaf6a25d75360a69dcef3f45fd1752a72ec9ca7406adf9f8fbd491796b" Apr 24 14:44:34.782749 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:44:34.782732 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24b11bdaf6a25d75360a69dcef3f45fd1752a72ec9ca7406adf9f8fbd491796b\": container with ID starting with 24b11bdaf6a25d75360a69dcef3f45fd1752a72ec9ca7406adf9f8fbd491796b not found: ID does not exist" containerID="24b11bdaf6a25d75360a69dcef3f45fd1752a72ec9ca7406adf9f8fbd491796b" Apr 24 14:44:34.782789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:34.782760 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b11bdaf6a25d75360a69dcef3f45fd1752a72ec9ca7406adf9f8fbd491796b"} err="failed to get container status \"24b11bdaf6a25d75360a69dcef3f45fd1752a72ec9ca7406adf9f8fbd491796b\": rpc error: code = NotFound desc = could not find container \"24b11bdaf6a25d75360a69dcef3f45fd1752a72ec9ca7406adf9f8fbd491796b\": container with ID starting with 24b11bdaf6a25d75360a69dcef3f45fd1752a72ec9ca7406adf9f8fbd491796b not found: ID does not exist" Apr 24 14:44:34.782789 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:34.782778 2569 scope.go:117] "RemoveContainer" containerID="525230515f24ba29239b6bece5177fb79c4659e20dbd0bd29dbed2a9993a8d55" Apr 24 14:44:34.783040 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:44:34.783020 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"525230515f24ba29239b6bece5177fb79c4659e20dbd0bd29dbed2a9993a8d55\": container with ID starting with 525230515f24ba29239b6bece5177fb79c4659e20dbd0bd29dbed2a9993a8d55 not found: ID does not exist" containerID="525230515f24ba29239b6bece5177fb79c4659e20dbd0bd29dbed2a9993a8d55" Apr 24 14:44:34.783083 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:34.783047 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"525230515f24ba29239b6bece5177fb79c4659e20dbd0bd29dbed2a9993a8d55"} err="failed to get container status \"525230515f24ba29239b6bece5177fb79c4659e20dbd0bd29dbed2a9993a8d55\": rpc 
error: code = NotFound desc = could not find container \"525230515f24ba29239b6bece5177fb79c4659e20dbd0bd29dbed2a9993a8d55\": container with ID starting with 525230515f24ba29239b6bece5177fb79c4659e20dbd0bd29dbed2a9993a8d55 not found: ID does not exist"
Apr 24 14:44:34.786991 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:34.786958 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px"]
Apr 24 14:44:34.789930 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:34.789884 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-s68px"]
Apr 24 14:44:35.827843 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:35.827810 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="781db27e-0c1c-4441-af98-f3dfb7854834" path="/var/lib/kubelet/pods/781db27e-0c1c-4441-af98-f3dfb7854834/volumes"
Apr 24 14:44:37.778134 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:37.778099 2569 generic.go:358] "Generic (PLEG): container finished" podID="cc0bcdec-8377-4462-9910-6378e0563184" containerID="86253a4fafc4bed5ac3246f9abe7bcff3ff8a11818420d20255b42b271948990" exitCode=0
Apr 24 14:44:37.778534 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:37.778171 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" event={"ID":"cc0bcdec-8377-4462-9910-6378e0563184","Type":"ContainerDied","Data":"86253a4fafc4bed5ac3246f9abe7bcff3ff8a11818420d20255b42b271948990"}
Apr 24 14:44:38.783748 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:38.783710 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" event={"ID":"cc0bcdec-8377-4462-9910-6378e0563184","Type":"ContainerStarted","Data":"bfcfde49ae64f5179ca46757191389696d4ad680e0f3e3a298d5e60fcdb7e943"}
Apr 24 14:44:41.795069 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:41.794984 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" event={"ID":"cc0bcdec-8377-4462-9910-6378e0563184","Type":"ContainerStarted","Data":"59f38608381942c30550e496ddf770ecc2bc290f7c7113e583e7ad1333191a51"}
Apr 24 14:44:41.795418 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:41.795186 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57"
Apr 24 14:44:41.812420 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:41.812373 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" podStartSLOduration=5.879929257 podStartE2EDuration="8.812360213s" podCreationTimestamp="2026-04-24 14:44:33 +0000 UTC" firstStartedPulling="2026-04-24 14:44:37.838949143 +0000 UTC m=+1230.633006022" lastFinishedPulling="2026-04-24 14:44:40.771380098 +0000 UTC m=+1233.565436978" observedRunningTime="2026-04-24 14:44:41.81014372 +0000 UTC m=+1234.604200620" watchObservedRunningTime="2026-04-24 14:44:41.812360213 +0000 UTC m=+1234.606417115"
Apr 24 14:44:42.798317 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:44:42.798285 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57"
Apr 24 14:45:13.802522 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:13.802486 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57"
Apr 24 14:45:43.803599 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:43.803568 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57"
Apr 24 14:45:53.290370 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:53.290335 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57"]
Apr 24 14:45:53.290795 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:53.290673 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" podUID="cc0bcdec-8377-4462-9910-6378e0563184" containerName="kserve-container" containerID="cri-o://bfcfde49ae64f5179ca46757191389696d4ad680e0f3e3a298d5e60fcdb7e943" gracePeriod=30
Apr 24 14:45:53.290875 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:53.290762 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" podUID="cc0bcdec-8377-4462-9910-6378e0563184" containerName="kserve-agent" containerID="cri-o://59f38608381942c30550e496ddf770ecc2bc290f7c7113e583e7ad1333191a51" gracePeriod=30
Apr 24 14:45:53.334570 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:53.334531 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp"]
Apr 24 14:45:53.334878 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:53.334864 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="781db27e-0c1c-4441-af98-f3dfb7854834" containerName="kserve-container"
Apr 24 14:45:53.334966 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:53.334880 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="781db27e-0c1c-4441-af98-f3dfb7854834" containerName="kserve-container"
Apr 24 14:45:53.334966 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:53.334889 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="781db27e-0c1c-4441-af98-f3dfb7854834" containerName="storage-initializer"
Apr 24 14:45:53.334966 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:53.334908 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="781db27e-0c1c-4441-af98-f3dfb7854834" containerName="storage-initializer"
Apr 24 14:45:53.335077 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:53.334974 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="781db27e-0c1c-4441-af98-f3dfb7854834" containerName="kserve-container"
Apr 24 14:45:53.338065 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:53.338049 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp"
Apr 24 14:45:53.345010 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:53.344977 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp"]
Apr 24 14:45:53.490036 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:53.489997 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0690c263-853f-4742-bcf8-729c3da01606-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-zhpxp\" (UID: \"0690c263-853f-4742-bcf8-729c3da01606\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp"
Apr 24 14:45:53.591289 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:53.591195 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0690c263-853f-4742-bcf8-729c3da01606-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-zhpxp\" (UID: \"0690c263-853f-4742-bcf8-729c3da01606\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp"
Apr 24 14:45:53.591573 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:53.591552 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0690c263-853f-4742-bcf8-729c3da01606-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-zhpxp\" (UID: \"0690c263-853f-4742-bcf8-729c3da01606\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp"
Apr 24 14:45:53.649972 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:53.649941 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp"
Apr 24 14:45:53.771101 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:53.770938 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp"]
Apr 24 14:45:53.773751 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:45:53.773717 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0690c263_853f_4742_bcf8_729c3da01606.slice/crio-117cb38ef4d94385aa111fb15b1d427d754d42387ca06f932772d3a9648fb3d7 WatchSource:0}: Error finding container 117cb38ef4d94385aa111fb15b1d427d754d42387ca06f932772d3a9648fb3d7: Status 404 returned error can't find the container with id 117cb38ef4d94385aa111fb15b1d427d754d42387ca06f932772d3a9648fb3d7
Apr 24 14:45:53.801828 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:53.801797 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" podUID="cc0bcdec-8377-4462-9910-6378e0563184" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.31:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 14:45:54.008859 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:54.008824 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp" event={"ID":"0690c263-853f-4742-bcf8-729c3da01606","Type":"ContainerStarted","Data":"ff74fe53d9f65a8d8f6624e65bf2bf3f5eb04b9cc4c5f80c77758b730a709903"}
Apr 24 14:45:54.008859 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:54.008862 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp" event={"ID":"0690c263-853f-4742-bcf8-729c3da01606","Type":"ContainerStarted","Data":"117cb38ef4d94385aa111fb15b1d427d754d42387ca06f932772d3a9648fb3d7"}
Apr 24 14:45:56.020288 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:56.020255 2569 generic.go:358] "Generic (PLEG): container finished" podID="cc0bcdec-8377-4462-9910-6378e0563184" containerID="bfcfde49ae64f5179ca46757191389696d4ad680e0f3e3a298d5e60fcdb7e943" exitCode=0
Apr 24 14:45:56.020652 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:56.020329 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" event={"ID":"cc0bcdec-8377-4462-9910-6378e0563184","Type":"ContainerDied","Data":"bfcfde49ae64f5179ca46757191389696d4ad680e0f3e3a298d5e60fcdb7e943"}
Apr 24 14:45:59.031432 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:59.031395 2569 generic.go:358] "Generic (PLEG): container finished" podID="0690c263-853f-4742-bcf8-729c3da01606" containerID="ff74fe53d9f65a8d8f6624e65bf2bf3f5eb04b9cc4c5f80c77758b730a709903" exitCode=0
Apr 24 14:45:59.031834 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:45:59.031470 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp" event={"ID":"0690c263-853f-4742-bcf8-729c3da01606","Type":"ContainerDied","Data":"ff74fe53d9f65a8d8f6624e65bf2bf3f5eb04b9cc4c5f80c77758b730a709903"}
Apr 24 14:46:03.801339 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:03.801288 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" podUID="cc0bcdec-8377-4462-9910-6378e0563184" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.31:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 14:46:11.073216 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:11.073177 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp" event={"ID":"0690c263-853f-4742-bcf8-729c3da01606","Type":"ContainerStarted","Data":"f63e7795a24c44fe0b9ce9d8c613063ae7ef2841a82d07aa5a13c92babe8c395"}
Apr 24 14:46:11.073680 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:11.073479 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp"
Apr 24 14:46:11.074852 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:11.074819 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp" podUID="0690c263-853f-4742-bcf8-729c3da01606" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 14:46:11.088622 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:11.088577 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp" podStartSLOduration=6.815597922 podStartE2EDuration="18.088561814s" podCreationTimestamp="2026-04-24 14:45:53 +0000 UTC" firstStartedPulling="2026-04-24 14:45:59.032689151 +0000 UTC m=+1311.826746030" lastFinishedPulling="2026-04-24 14:46:10.305653039 +0000 UTC m=+1323.099709922" observedRunningTime="2026-04-24 14:46:11.087399984 +0000 UTC m=+1323.881456884" watchObservedRunningTime="2026-04-24 14:46:11.088561814 +0000 UTC m=+1323.882618715"
Apr 24 14:46:12.076431 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:12.076390 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp" podUID="0690c263-853f-4742-bcf8-729c3da01606" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 14:46:13.801335 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:13.801287 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" podUID="cc0bcdec-8377-4462-9910-6378e0563184" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.31:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 24 14:46:13.801730 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:13.801421 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57"
Apr 24 14:46:22.076785 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:22.076729 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp" podUID="0690c263-853f-4742-bcf8-729c3da01606" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 14:46:23.484336 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:23.484309 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57"
Apr 24 14:46:23.547653 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:23.547565 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc0bcdec-8377-4462-9910-6378e0563184-kserve-provision-location\") pod \"cc0bcdec-8377-4462-9910-6378e0563184\" (UID: \"cc0bcdec-8377-4462-9910-6378e0563184\") "
Apr 24 14:46:23.547906 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:23.547865 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc0bcdec-8377-4462-9910-6378e0563184-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cc0bcdec-8377-4462-9910-6378e0563184" (UID: "cc0bcdec-8377-4462-9910-6378e0563184"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:46:23.648294 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:23.648258 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc0bcdec-8377-4462-9910-6378e0563184-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 14:46:24.112863 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:24.112770 2569 generic.go:358] "Generic (PLEG): container finished" podID="cc0bcdec-8377-4462-9910-6378e0563184" containerID="59f38608381942c30550e496ddf770ecc2bc290f7c7113e583e7ad1333191a51" exitCode=0
Apr 24 14:46:24.112863 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:24.112829 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" event={"ID":"cc0bcdec-8377-4462-9910-6378e0563184","Type":"ContainerDied","Data":"59f38608381942c30550e496ddf770ecc2bc290f7c7113e583e7ad1333191a51"}
Apr 24 14:46:24.113097 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:24.112866 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57"
Apr 24 14:46:24.113097 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:24.112879 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57" event={"ID":"cc0bcdec-8377-4462-9910-6378e0563184","Type":"ContainerDied","Data":"987ec72cfb98939c5990b3c2ebd118ae2832cef2fb8dd5d73d8ea5088669c2da"}
Apr 24 14:46:24.113097 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:24.112925 2569 scope.go:117] "RemoveContainer" containerID="59f38608381942c30550e496ddf770ecc2bc290f7c7113e583e7ad1333191a51"
Apr 24 14:46:24.120677 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:24.120656 2569 scope.go:117] "RemoveContainer" containerID="bfcfde49ae64f5179ca46757191389696d4ad680e0f3e3a298d5e60fcdb7e943"
Apr 24 14:46:24.128244 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:24.128208 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57"]
Apr 24 14:46:24.128731 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:24.128705 2569 scope.go:117] "RemoveContainer" containerID="86253a4fafc4bed5ac3246f9abe7bcff3ff8a11818420d20255b42b271948990"
Apr 24 14:46:24.129938 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:24.129914 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-7c84c59f84-2jx57"]
Apr 24 14:46:24.135999 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:24.135968 2569 scope.go:117] "RemoveContainer" containerID="59f38608381942c30550e496ddf770ecc2bc290f7c7113e583e7ad1333191a51"
Apr 24 14:46:24.136272 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:46:24.136249 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59f38608381942c30550e496ddf770ecc2bc290f7c7113e583e7ad1333191a51\": container with ID starting with 59f38608381942c30550e496ddf770ecc2bc290f7c7113e583e7ad1333191a51 not found: ID does not exist" containerID="59f38608381942c30550e496ddf770ecc2bc290f7c7113e583e7ad1333191a51"
Apr 24 14:46:24.136329 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:24.136281 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f38608381942c30550e496ddf770ecc2bc290f7c7113e583e7ad1333191a51"} err="failed to get container status \"59f38608381942c30550e496ddf770ecc2bc290f7c7113e583e7ad1333191a51\": rpc error: code = NotFound desc = could not find container \"59f38608381942c30550e496ddf770ecc2bc290f7c7113e583e7ad1333191a51\": container with ID starting with 59f38608381942c30550e496ddf770ecc2bc290f7c7113e583e7ad1333191a51 not found: ID does not exist"
Apr 24 14:46:24.136329 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:24.136300 2569 scope.go:117] "RemoveContainer" containerID="bfcfde49ae64f5179ca46757191389696d4ad680e0f3e3a298d5e60fcdb7e943"
Apr 24 14:46:24.136540 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:46:24.136515 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfcfde49ae64f5179ca46757191389696d4ad680e0f3e3a298d5e60fcdb7e943\": container with ID starting with bfcfde49ae64f5179ca46757191389696d4ad680e0f3e3a298d5e60fcdb7e943 not found: ID does not exist" containerID="bfcfde49ae64f5179ca46757191389696d4ad680e0f3e3a298d5e60fcdb7e943"
Apr 24 14:46:24.136614 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:24.136546 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfcfde49ae64f5179ca46757191389696d4ad680e0f3e3a298d5e60fcdb7e943"} err="failed to get container status \"bfcfde49ae64f5179ca46757191389696d4ad680e0f3e3a298d5e60fcdb7e943\": rpc error: code = NotFound desc = could not find container \"bfcfde49ae64f5179ca46757191389696d4ad680e0f3e3a298d5e60fcdb7e943\": container with ID starting with bfcfde49ae64f5179ca46757191389696d4ad680e0f3e3a298d5e60fcdb7e943 not found: ID does not exist"
Apr 24 14:46:24.136614 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:24.136564 2569 scope.go:117] "RemoveContainer" containerID="86253a4fafc4bed5ac3246f9abe7bcff3ff8a11818420d20255b42b271948990"
Apr 24 14:46:24.136792 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:46:24.136774 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86253a4fafc4bed5ac3246f9abe7bcff3ff8a11818420d20255b42b271948990\": container with ID starting with 86253a4fafc4bed5ac3246f9abe7bcff3ff8a11818420d20255b42b271948990 not found: ID does not exist" containerID="86253a4fafc4bed5ac3246f9abe7bcff3ff8a11818420d20255b42b271948990"
Apr 24 14:46:24.136849 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:24.136799 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86253a4fafc4bed5ac3246f9abe7bcff3ff8a11818420d20255b42b271948990"} err="failed to get container status \"86253a4fafc4bed5ac3246f9abe7bcff3ff8a11818420d20255b42b271948990\": rpc error: code = NotFound desc = could not find container \"86253a4fafc4bed5ac3246f9abe7bcff3ff8a11818420d20255b42b271948990\": container with ID starting with 86253a4fafc4bed5ac3246f9abe7bcff3ff8a11818420d20255b42b271948990 not found: ID does not exist"
Apr 24 14:46:25.827702 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:25.827670 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc0bcdec-8377-4462-9910-6378e0563184" path="/var/lib/kubelet/pods/cc0bcdec-8377-4462-9910-6378e0563184/volumes"
Apr 24 14:46:32.076591 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:32.076498 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp" podUID="0690c263-853f-4742-bcf8-729c3da01606" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 14:46:42.076647 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:42.076596 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp" podUID="0690c263-853f-4742-bcf8-729c3da01606" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 14:46:52.077464 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:46:52.077412 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp" podUID="0690c263-853f-4742-bcf8-729c3da01606" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 14:47:02.077992 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:02.077957 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp"
Apr 24 14:47:04.823368 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:04.823331 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp"]
Apr 24 14:47:04.823735 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:04.823569 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp" podUID="0690c263-853f-4742-bcf8-729c3da01606" containerName="kserve-container" containerID="cri-o://f63e7795a24c44fe0b9ce9d8c613063ae7ef2841a82d07aa5a13c92babe8c395" gracePeriod=30
Apr 24 14:47:04.894424 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:04.894390 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2"]
Apr 24 14:47:04.894713 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:04.894701 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc0bcdec-8377-4462-9910-6378e0563184" containerName="storage-initializer"
Apr 24 14:47:04.894761 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:04.894714 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0bcdec-8377-4462-9910-6378e0563184" containerName="storage-initializer"
Apr 24 14:47:04.894761 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:04.894736 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc0bcdec-8377-4462-9910-6378e0563184" containerName="kserve-container"
Apr 24 14:47:04.894761 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:04.894742 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0bcdec-8377-4462-9910-6378e0563184" containerName="kserve-container"
Apr 24 14:47:04.894761 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:04.894749 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc0bcdec-8377-4462-9910-6378e0563184" containerName="kserve-agent"
Apr 24 14:47:04.894761 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:04.894755 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0bcdec-8377-4462-9910-6378e0563184" containerName="kserve-agent"
Apr 24 14:47:04.894986 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:04.894805 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc0bcdec-8377-4462-9910-6378e0563184" containerName="kserve-container"
Apr 24 14:47:04.894986 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:04.894813 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc0bcdec-8377-4462-9910-6378e0563184" containerName="kserve-agent"
Apr 24 14:47:04.897709 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:04.897691 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2"
Apr 24 14:47:04.903608 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:04.903580 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2"]
Apr 24 14:47:04.991025 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:04.990991 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83e23000-c9f7-4c06-8b5c-36b47d4cac6d-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-2fwr2\" (UID: \"83e23000-c9f7-4c06-8b5c-36b47d4cac6d\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2"
Apr 24 14:47:05.092396 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:05.092291 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83e23000-c9f7-4c06-8b5c-36b47d4cac6d-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-2fwr2\" (UID: \"83e23000-c9f7-4c06-8b5c-36b47d4cac6d\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2"
Apr 24 14:47:05.092734 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:05.092711 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83e23000-c9f7-4c06-8b5c-36b47d4cac6d-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-2fwr2\" (UID: \"83e23000-c9f7-4c06-8b5c-36b47d4cac6d\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2"
Apr 24 14:47:05.209702 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:05.209647 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2"
Apr 24 14:47:05.327394 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:05.327368 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2"]
Apr 24 14:47:05.330074 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:47:05.330043 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83e23000_c9f7_4c06_8b5c_36b47d4cac6d.slice/crio-ed7167022de74fef907e49688c559c68e85bef59e0a7f198940893dfa9c3ff02 WatchSource:0}: Error finding container ed7167022de74fef907e49688c559c68e85bef59e0a7f198940893dfa9c3ff02: Status 404 returned error can't find the container with id ed7167022de74fef907e49688c559c68e85bef59e0a7f198940893dfa9c3ff02
Apr 24 14:47:06.243795 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:06.243750 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2" event={"ID":"83e23000-c9f7-4c06-8b5c-36b47d4cac6d","Type":"ContainerStarted","Data":"02cc257f9a5bd8ecf090e328ea4dac39f8cc31a1e1f19d60334b0a5837622938"}
Apr 24 14:47:06.243795 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:06.243791 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2" event={"ID":"83e23000-c9f7-4c06-8b5c-36b47d4cac6d","Type":"ContainerStarted","Data":"ed7167022de74fef907e49688c559c68e85bef59e0a7f198940893dfa9c3ff02"}
Apr 24 14:47:07.871643 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:07.871620 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp"
Apr 24 14:47:07.918809 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:07.918779 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0690c263-853f-4742-bcf8-729c3da01606-kserve-provision-location\") pod \"0690c263-853f-4742-bcf8-729c3da01606\" (UID: \"0690c263-853f-4742-bcf8-729c3da01606\") "
Apr 24 14:47:07.929163 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:07.929071 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0690c263-853f-4742-bcf8-729c3da01606-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0690c263-853f-4742-bcf8-729c3da01606" (UID: "0690c263-853f-4742-bcf8-729c3da01606"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:47:08.019532 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:08.019491 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0690c263-853f-4742-bcf8-729c3da01606-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 14:47:08.251176 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:08.251075 2569 generic.go:358] "Generic (PLEG): container finished" podID="0690c263-853f-4742-bcf8-729c3da01606" containerID="f63e7795a24c44fe0b9ce9d8c613063ae7ef2841a82d07aa5a13c92babe8c395" exitCode=0
Apr 24 14:47:08.251176 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:08.251153 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp"
Apr 24 14:47:08.251176 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:08.251156 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp" event={"ID":"0690c263-853f-4742-bcf8-729c3da01606","Type":"ContainerDied","Data":"f63e7795a24c44fe0b9ce9d8c613063ae7ef2841a82d07aa5a13c92babe8c395"}
Apr 24 14:47:08.251403 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:08.251195 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp" event={"ID":"0690c263-853f-4742-bcf8-729c3da01606","Type":"ContainerDied","Data":"117cb38ef4d94385aa111fb15b1d427d754d42387ca06f932772d3a9648fb3d7"}
Apr 24 14:47:08.251403 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:08.251212 2569 scope.go:117] "RemoveContainer" containerID="f63e7795a24c44fe0b9ce9d8c613063ae7ef2841a82d07aa5a13c92babe8c395"
Apr 24 14:47:08.259665 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:08.259647 2569 scope.go:117] "RemoveContainer" containerID="ff74fe53d9f65a8d8f6624e65bf2bf3f5eb04b9cc4c5f80c77758b730a709903"
Apr 24 14:47:08.266770 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:08.266750 2569 scope.go:117] "RemoveContainer" containerID="f63e7795a24c44fe0b9ce9d8c613063ae7ef2841a82d07aa5a13c92babe8c395"
Apr 24 14:47:08.267060 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:47:08.267042 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f63e7795a24c44fe0b9ce9d8c613063ae7ef2841a82d07aa5a13c92babe8c395\": container with ID starting with f63e7795a24c44fe0b9ce9d8c613063ae7ef2841a82d07aa5a13c92babe8c395 not found: ID does not exist" containerID="f63e7795a24c44fe0b9ce9d8c613063ae7ef2841a82d07aa5a13c92babe8c395"
Apr 24 14:47:08.267111 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:08.267070 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f63e7795a24c44fe0b9ce9d8c613063ae7ef2841a82d07aa5a13c92babe8c395"} err="failed to get container status \"f63e7795a24c44fe0b9ce9d8c613063ae7ef2841a82d07aa5a13c92babe8c395\": rpc error: code = NotFound desc = could not find container \"f63e7795a24c44fe0b9ce9d8c613063ae7ef2841a82d07aa5a13c92babe8c395\": container with ID starting with f63e7795a24c44fe0b9ce9d8c613063ae7ef2841a82d07aa5a13c92babe8c395 not found: ID does not exist"
Apr 24 14:47:08.267111 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:08.267087 2569 scope.go:117] "RemoveContainer" containerID="ff74fe53d9f65a8d8f6624e65bf2bf3f5eb04b9cc4c5f80c77758b730a709903"
Apr 24 14:47:08.267330 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:47:08.267312 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff74fe53d9f65a8d8f6624e65bf2bf3f5eb04b9cc4c5f80c77758b730a709903\": container with ID starting with ff74fe53d9f65a8d8f6624e65bf2bf3f5eb04b9cc4c5f80c77758b730a709903 not found: ID does not exist" containerID="ff74fe53d9f65a8d8f6624e65bf2bf3f5eb04b9cc4c5f80c77758b730a709903"
Apr 24 14:47:08.267380 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:08.267334 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff74fe53d9f65a8d8f6624e65bf2bf3f5eb04b9cc4c5f80c77758b730a709903"} err="failed to get container status \"ff74fe53d9f65a8d8f6624e65bf2bf3f5eb04b9cc4c5f80c77758b730a709903\": rpc error: code = NotFound desc = could not find container \"ff74fe53d9f65a8d8f6624e65bf2bf3f5eb04b9cc4c5f80c77758b730a709903\": container with ID starting with ff74fe53d9f65a8d8f6624e65bf2bf3f5eb04b9cc4c5f80c77758b730a709903 not found: ID does not exist"
Apr 24 14:47:08.270792 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:08.270768 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp"]
Apr 24 14:47:08.275868 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:08.275846 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zhpxp"]
Apr 24 14:47:09.828618 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:09.828580 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0690c263-853f-4742-bcf8-729c3da01606" path="/var/lib/kubelet/pods/0690c263-853f-4742-bcf8-729c3da01606/volumes"
Apr 24 14:47:11.262326 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:11.262292 2569 generic.go:358] "Generic (PLEG): container finished" podID="83e23000-c9f7-4c06-8b5c-36b47d4cac6d" containerID="02cc257f9a5bd8ecf090e328ea4dac39f8cc31a1e1f19d60334b0a5837622938" exitCode=0
Apr 24 14:47:11.262732 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:11.262380 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2" event={"ID":"83e23000-c9f7-4c06-8b5c-36b47d4cac6d","Type":"ContainerDied","Data":"02cc257f9a5bd8ecf090e328ea4dac39f8cc31a1e1f19d60334b0a5837622938"}
Apr 24 14:47:12.266319 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:12.266285 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2" event={"ID":"83e23000-c9f7-4c06-8b5c-36b47d4cac6d","Type":"ContainerStarted","Data":"0ed8b9bdd6b1731f85e19410d119519dabc451f3e2885a1b9bbe480bfec636d2"}
Apr 24 14:47:12.266802 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:12.266650 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2"
Apr 24 14:47:12.267875 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:12.267848 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2" podUID="83e23000-c9f7-4c06-8b5c-36b47d4cac6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 14:47:12.282644 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:12.282590 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2" podStartSLOduration=8.282574913 podStartE2EDuration="8.282574913s" podCreationTimestamp="2026-04-24 14:47:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:47:12.281430064 +0000 UTC m=+1385.075486965" watchObservedRunningTime="2026-04-24 14:47:12.282574913 +0000 UTC m=+1385.076631813"
Apr 24 14:47:13.269555 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:13.269512 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2" podUID="83e23000-c9f7-4c06-8b5c-36b47d4cac6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 14:47:23.270350 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:23.270309 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2" podUID="83e23000-c9f7-4c06-8b5c-36b47d4cac6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 14:47:33.270134 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:33.270075 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2" podUID="83e23000-c9f7-4c06-8b5c-36b47d4cac6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 14:47:43.269992 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:43.269942 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2" podUID="83e23000-c9f7-4c06-8b5c-36b47d4cac6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 14:47:53.270579 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:47:53.270533 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2" podUID="83e23000-c9f7-4c06-8b5c-36b47d4cac6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 14:48:03.271035 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:03.270955 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2"
Apr 24 14:48:06.337361 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:06.337326 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2"]
Apr 24 14:48:06.337845 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:06.337574 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2" podUID="83e23000-c9f7-4c06-8b5c-36b47d4cac6d" containerName="kserve-container" containerID="cri-o://0ed8b9bdd6b1731f85e19410d119519dabc451f3e2885a1b9bbe480bfec636d2" gracePeriod=30
Apr 24 14:48:06.387747 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:06.387707 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz"]
Apr 24 14:48:06.388073 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:06.388059 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0690c263-853f-4742-bcf8-729c3da01606" containerName="kserve-container"
Apr 24 14:48:06.388123 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:06.388075 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0690c263-853f-4742-bcf8-729c3da01606" containerName="kserve-container"
Apr 24 14:48:06.388123 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:06.388100 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0690c263-853f-4742-bcf8-729c3da01606" containerName="storage-initializer"
Apr 24 14:48:06.388123 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:06.388106 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0690c263-853f-4742-bcf8-729c3da01606" containerName="storage-initializer"
Apr 24 14:48:06.388214 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:06.388157 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0690c263-853f-4742-bcf8-729c3da01606" containerName="kserve-container"
Apr 24 14:48:06.390905 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:06.390875 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz"
Apr 24 14:48:06.401862 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:06.401835 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz"]
Apr 24 14:48:06.512258 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:06.512214 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8bace2c-d211-4a76-8fd5-2e865efc8e13-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz\" (UID: \"b8bace2c-d211-4a76-8fd5-2e865efc8e13\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz"
Apr 24 14:48:06.613551 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:06.613452 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8bace2c-d211-4a76-8fd5-2e865efc8e13-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz\" (UID: \"b8bace2c-d211-4a76-8fd5-2e865efc8e13\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz"
Apr 24 14:48:06.613846 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:06.613824 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8bace2c-d211-4a76-8fd5-2e865efc8e13-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz\" (UID: \"b8bace2c-d211-4a76-8fd5-2e865efc8e13\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz"
Apr 24 14:48:06.702215 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:06.702177 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz"
Apr 24 14:48:06.830256 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:06.830231 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz"]
Apr 24 14:48:06.832954 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:48:06.832923 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8bace2c_d211_4a76_8fd5_2e865efc8e13.slice/crio-2862c3ba839239ee9f42af3412e5b43120d2ea2c74c9b39744d4ebb2d0bd164b WatchSource:0}: Error finding container 2862c3ba839239ee9f42af3412e5b43120d2ea2c74c9b39744d4ebb2d0bd164b: Status 404 returned error can't find the container with id 2862c3ba839239ee9f42af3412e5b43120d2ea2c74c9b39744d4ebb2d0bd164b
Apr 24 14:48:07.439884 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:07.439847 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz" event={"ID":"b8bace2c-d211-4a76-8fd5-2e865efc8e13","Type":"ContainerStarted","Data":"b2e54b5fea7453a70aab1dbfc4a3d9a5a0c59a7820388fa255932de85bb5c39c"}
Apr 24 14:48:07.440279 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:07.439888 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz" event={"ID":"b8bace2c-d211-4a76-8fd5-2e865efc8e13","Type":"ContainerStarted","Data":"2862c3ba839239ee9f42af3412e5b43120d2ea2c74c9b39744d4ebb2d0bd164b"}
Apr 24 14:48:09.379422 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:09.379397 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2"
Apr 24 14:48:09.437095 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:09.437001 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83e23000-c9f7-4c06-8b5c-36b47d4cac6d-kserve-provision-location\") pod \"83e23000-c9f7-4c06-8b5c-36b47d4cac6d\" (UID: \"83e23000-c9f7-4c06-8b5c-36b47d4cac6d\") "
Apr 24 14:48:09.446974 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:09.446940 2569 generic.go:358] "Generic (PLEG): container finished" podID="83e23000-c9f7-4c06-8b5c-36b47d4cac6d" containerID="0ed8b9bdd6b1731f85e19410d119519dabc451f3e2885a1b9bbe480bfec636d2" exitCode=0
Apr 24 14:48:09.447158 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:09.447014 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2"
Apr 24 14:48:09.447158 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:09.447036 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2" event={"ID":"83e23000-c9f7-4c06-8b5c-36b47d4cac6d","Type":"ContainerDied","Data":"0ed8b9bdd6b1731f85e19410d119519dabc451f3e2885a1b9bbe480bfec636d2"}
Apr 24 14:48:09.447158 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:09.447084 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2" event={"ID":"83e23000-c9f7-4c06-8b5c-36b47d4cac6d","Type":"ContainerDied","Data":"ed7167022de74fef907e49688c559c68e85bef59e0a7f198940893dfa9c3ff02"}
Apr 24 14:48:09.447158 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:09.447105 2569 scope.go:117] "RemoveContainer" containerID="0ed8b9bdd6b1731f85e19410d119519dabc451f3e2885a1b9bbe480bfec636d2"
Apr 24 14:48:09.447908 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:09.447866 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83e23000-c9f7-4c06-8b5c-36b47d4cac6d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "83e23000-c9f7-4c06-8b5c-36b47d4cac6d" (UID: "83e23000-c9f7-4c06-8b5c-36b47d4cac6d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:48:09.454935 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:09.454911 2569 scope.go:117] "RemoveContainer" containerID="02cc257f9a5bd8ecf090e328ea4dac39f8cc31a1e1f19d60334b0a5837622938"
Apr 24 14:48:09.462013 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:09.461991 2569 scope.go:117] "RemoveContainer" containerID="0ed8b9bdd6b1731f85e19410d119519dabc451f3e2885a1b9bbe480bfec636d2"
Apr 24 14:48:09.462301 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:48:09.462284 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed8b9bdd6b1731f85e19410d119519dabc451f3e2885a1b9bbe480bfec636d2\": container with ID starting with 0ed8b9bdd6b1731f85e19410d119519dabc451f3e2885a1b9bbe480bfec636d2 not found: ID does not exist" containerID="0ed8b9bdd6b1731f85e19410d119519dabc451f3e2885a1b9bbe480bfec636d2"
Apr 24 14:48:09.462370 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:09.462314 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed8b9bdd6b1731f85e19410d119519dabc451f3e2885a1b9bbe480bfec636d2"} err="failed to get container status \"0ed8b9bdd6b1731f85e19410d119519dabc451f3e2885a1b9bbe480bfec636d2\": rpc error: code = NotFound desc = could not find container \"0ed8b9bdd6b1731f85e19410d119519dabc451f3e2885a1b9bbe480bfec636d2\": container with ID starting with 0ed8b9bdd6b1731f85e19410d119519dabc451f3e2885a1b9bbe480bfec636d2 not found: ID does not exist"
Apr 24 14:48:09.462370 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:09.462332 2569 scope.go:117] "RemoveContainer" containerID="02cc257f9a5bd8ecf090e328ea4dac39f8cc31a1e1f19d60334b0a5837622938"
Apr 24 14:48:09.462599 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:48:09.462578 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02cc257f9a5bd8ecf090e328ea4dac39f8cc31a1e1f19d60334b0a5837622938\": container with ID starting with 02cc257f9a5bd8ecf090e328ea4dac39f8cc31a1e1f19d60334b0a5837622938 not found: ID does not exist" containerID="02cc257f9a5bd8ecf090e328ea4dac39f8cc31a1e1f19d60334b0a5837622938"
Apr 24 14:48:09.462648 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:09.462604 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02cc257f9a5bd8ecf090e328ea4dac39f8cc31a1e1f19d60334b0a5837622938"} err="failed to get container status \"02cc257f9a5bd8ecf090e328ea4dac39f8cc31a1e1f19d60334b0a5837622938\": rpc error: code = NotFound desc = could not find container \"02cc257f9a5bd8ecf090e328ea4dac39f8cc31a1e1f19d60334b0a5837622938\": container with ID starting with 02cc257f9a5bd8ecf090e328ea4dac39f8cc31a1e1f19d60334b0a5837622938 not found: ID does not exist"
Apr 24 14:48:09.538277 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:09.538233 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83e23000-c9f7-4c06-8b5c-36b47d4cac6d-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 14:48:09.769683 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:09.769651 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2"]
Apr 24 14:48:09.774458 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:09.774433 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-2fwr2"]
Apr 24 14:48:09.828070 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:09.828042 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83e23000-c9f7-4c06-8b5c-36b47d4cac6d" path="/var/lib/kubelet/pods/83e23000-c9f7-4c06-8b5c-36b47d4cac6d/volumes"
Apr 24 14:48:11.454445 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:11.454411 2569 generic.go:358] "Generic (PLEG): container finished" podID="b8bace2c-d211-4a76-8fd5-2e865efc8e13" containerID="b2e54b5fea7453a70aab1dbfc4a3d9a5a0c59a7820388fa255932de85bb5c39c" exitCode=0
Apr 24 14:48:11.454851 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:11.454482 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz" event={"ID":"b8bace2c-d211-4a76-8fd5-2e865efc8e13","Type":"ContainerDied","Data":"b2e54b5fea7453a70aab1dbfc4a3d9a5a0c59a7820388fa255932de85bb5c39c"}
Apr 24 14:48:12.459371 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:12.459336 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz" event={"ID":"b8bace2c-d211-4a76-8fd5-2e865efc8e13","Type":"ContainerStarted","Data":"70cd838dff4af87e8c8c81bb0be38aa3a30c6a9539511db449a0c18f6eded73d"}
Apr 24 14:48:12.459794 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:12.459627 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz"
Apr 24 14:48:12.460888 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:12.460859 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz" podUID="b8bace2c-d211-4a76-8fd5-2e865efc8e13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 14:48:12.475542 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:12.475480 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz" podStartSLOduration=6.47546174 podStartE2EDuration="6.47546174s" podCreationTimestamp="2026-04-24 14:48:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:48:12.473689807 +0000 UTC m=+1445.267746749" watchObservedRunningTime="2026-04-24 14:48:12.47546174 +0000 UTC m=+1445.269518642"
Apr 24 14:48:13.462471 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:13.462427 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz" podUID="b8bace2c-d211-4a76-8fd5-2e865efc8e13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 14:48:23.463237 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:23.463191 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz" podUID="b8bace2c-d211-4a76-8fd5-2e865efc8e13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 14:48:33.462811 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:33.462763 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz" podUID="b8bace2c-d211-4a76-8fd5-2e865efc8e13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 14:48:43.463049 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:43.463005 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz" podUID="b8bace2c-d211-4a76-8fd5-2e865efc8e13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 14:48:53.463155 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:48:53.463101 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz" podUID="b8bace2c-d211-4a76-8fd5-2e865efc8e13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 14:49:03.464120 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:03.464088 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz"
Apr 24 14:49:07.759310 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:07.759280 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log"
Apr 24 14:49:07.764856 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:07.764831 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log"
Apr 24 14:49:08.136835 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:08.136746 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz"]
Apr 24 14:49:08.137520 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:08.137466 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz" podUID="b8bace2c-d211-4a76-8fd5-2e865efc8e13" containerName="kserve-container" containerID="cri-o://70cd838dff4af87e8c8c81bb0be38aa3a30c6a9539511db449a0c18f6eded73d" gracePeriod=30
Apr 24 14:49:08.177973 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:08.177934 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66"]
Apr 24 14:49:08.178321 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:08.178308 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83e23000-c9f7-4c06-8b5c-36b47d4cac6d" containerName="kserve-container"
Apr 24 14:49:08.178366 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:08.178323 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e23000-c9f7-4c06-8b5c-36b47d4cac6d" containerName="kserve-container"
Apr 24 14:49:08.178366 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:08.178345 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83e23000-c9f7-4c06-8b5c-36b47d4cac6d" containerName="storage-initializer"
Apr 24 14:49:08.178366 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:08.178354 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e23000-c9f7-4c06-8b5c-36b47d4cac6d" containerName="storage-initializer"
Apr 24 14:49:08.178460 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:08.178404 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="83e23000-c9f7-4c06-8b5c-36b47d4cac6d" containerName="kserve-container"
Apr 24 14:49:08.181462 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:08.181441 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66"
Apr 24 14:49:08.187842 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:08.187813 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66"]
Apr 24 14:49:08.215164 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:08.215129 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/137a48e4-6b82-4dc0-b0e6-d500ea7f2705-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-lpm66\" (UID: \"137a48e4-6b82-4dc0-b0e6-d500ea7f2705\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66"
Apr 24 14:49:08.316128 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:08.316088 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/137a48e4-6b82-4dc0-b0e6-d500ea7f2705-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-lpm66\" (UID: \"137a48e4-6b82-4dc0-b0e6-d500ea7f2705\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66"
Apr 24 14:49:08.316526 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:08.316506 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/137a48e4-6b82-4dc0-b0e6-d500ea7f2705-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-lpm66\" (UID: \"137a48e4-6b82-4dc0-b0e6-d500ea7f2705\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66"
Apr 24 14:49:08.493006 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:08.492966 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66"
Apr 24 14:49:08.623169 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:08.623139 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66"]
Apr 24 14:49:08.625866 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:49:08.625822 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod137a48e4_6b82_4dc0_b0e6_d500ea7f2705.slice/crio-c6e2231802d0d1045e45fe3329b583b165464ef2f2d2b2ec68730acb0956a5c1 WatchSource:0}: Error finding container c6e2231802d0d1045e45fe3329b583b165464ef2f2d2b2ec68730acb0956a5c1: Status 404 returned error can't find the container with id c6e2231802d0d1045e45fe3329b583b165464ef2f2d2b2ec68730acb0956a5c1
Apr 24 14:49:08.627714 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:08.627694 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 14:49:08.635056 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:08.635026 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" event={"ID":"137a48e4-6b82-4dc0-b0e6-d500ea7f2705","Type":"ContainerStarted","Data":"c6e2231802d0d1045e45fe3329b583b165464ef2f2d2b2ec68730acb0956a5c1"}
Apr 24 14:49:09.639748 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:09.639708 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" event={"ID":"137a48e4-6b82-4dc0-b0e6-d500ea7f2705","Type":"ContainerStarted","Data":"335e218979ebd835249d7394f443fc998915968da2c25d11f28fa49587625262"}
Apr 24 14:49:11.178009 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:11.177982 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz"
Apr 24 14:49:11.243331 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:11.243238 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8bace2c-d211-4a76-8fd5-2e865efc8e13-kserve-provision-location\") pod \"b8bace2c-d211-4a76-8fd5-2e865efc8e13\" (UID: \"b8bace2c-d211-4a76-8fd5-2e865efc8e13\") "
Apr 24 14:49:11.254141 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:11.254103 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8bace2c-d211-4a76-8fd5-2e865efc8e13-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b8bace2c-d211-4a76-8fd5-2e865efc8e13" (UID: "b8bace2c-d211-4a76-8fd5-2e865efc8e13"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:49:11.344178 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:11.344144 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b8bace2c-d211-4a76-8fd5-2e865efc8e13-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 14:49:11.647097 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:11.647014 2569 generic.go:358] "Generic (PLEG): container finished" podID="b8bace2c-d211-4a76-8fd5-2e865efc8e13" containerID="70cd838dff4af87e8c8c81bb0be38aa3a30c6a9539511db449a0c18f6eded73d" exitCode=0
Apr 24 14:49:11.647097 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:11.647083 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz"
Apr 24 14:49:11.647097 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:11.647088 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz" event={"ID":"b8bace2c-d211-4a76-8fd5-2e865efc8e13","Type":"ContainerDied","Data":"70cd838dff4af87e8c8c81bb0be38aa3a30c6a9539511db449a0c18f6eded73d"}
Apr 24 14:49:11.647338 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:11.647131 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz" event={"ID":"b8bace2c-d211-4a76-8fd5-2e865efc8e13","Type":"ContainerDied","Data":"2862c3ba839239ee9f42af3412e5b43120d2ea2c74c9b39744d4ebb2d0bd164b"}
Apr 24 14:49:11.647338 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:11.647149 2569 scope.go:117] "RemoveContainer" containerID="70cd838dff4af87e8c8c81bb0be38aa3a30c6a9539511db449a0c18f6eded73d"
Apr 24 14:49:11.655975 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:11.655954 2569 scope.go:117] "RemoveContainer" containerID="b2e54b5fea7453a70aab1dbfc4a3d9a5a0c59a7820388fa255932de85bb5c39c"
Apr 24 14:49:11.663447 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:11.663429 2569 scope.go:117] "RemoveContainer" containerID="70cd838dff4af87e8c8c81bb0be38aa3a30c6a9539511db449a0c18f6eded73d"
Apr 24 14:49:11.663756 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:49:11.663738 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70cd838dff4af87e8c8c81bb0be38aa3a30c6a9539511db449a0c18f6eded73d\": container with ID starting with 70cd838dff4af87e8c8c81bb0be38aa3a30c6a9539511db449a0c18f6eded73d not found: ID does not exist" containerID="70cd838dff4af87e8c8c81bb0be38aa3a30c6a9539511db449a0c18f6eded73d"
Apr 24 14:49:11.663796 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:11.663766 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70cd838dff4af87e8c8c81bb0be38aa3a30c6a9539511db449a0c18f6eded73d"} err="failed to get container status \"70cd838dff4af87e8c8c81bb0be38aa3a30c6a9539511db449a0c18f6eded73d\": rpc error: code = NotFound desc = could not find container \"70cd838dff4af87e8c8c81bb0be38aa3a30c6a9539511db449a0c18f6eded73d\": container with ID starting with 70cd838dff4af87e8c8c81bb0be38aa3a30c6a9539511db449a0c18f6eded73d not found: ID does not exist"
Apr 24 14:49:11.663796 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:11.663787 2569 scope.go:117] "RemoveContainer" containerID="b2e54b5fea7453a70aab1dbfc4a3d9a5a0c59a7820388fa255932de85bb5c39c"
Apr 24 14:49:11.664034 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:49:11.664019 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2e54b5fea7453a70aab1dbfc4a3d9a5a0c59a7820388fa255932de85bb5c39c\": container with ID starting with b2e54b5fea7453a70aab1dbfc4a3d9a5a0c59a7820388fa255932de85bb5c39c not found: ID does not exist" containerID="b2e54b5fea7453a70aab1dbfc4a3d9a5a0c59a7820388fa255932de85bb5c39c"
Apr 24 14:49:11.664089 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:11.664040 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2e54b5fea7453a70aab1dbfc4a3d9a5a0c59a7820388fa255932de85bb5c39c"} err="failed to get container status \"b2e54b5fea7453a70aab1dbfc4a3d9a5a0c59a7820388fa255932de85bb5c39c\": rpc error: code = NotFound desc = could not find container \"b2e54b5fea7453a70aab1dbfc4a3d9a5a0c59a7820388fa255932de85bb5c39c\": container with ID starting with b2e54b5fea7453a70aab1dbfc4a3d9a5a0c59a7820388fa255932de85bb5c39c not found: ID does not exist"
Apr 24 14:49:11.667153 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:11.667121 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz"]
Apr 24 14:49:11.672848 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:11.672824 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-lh7qz"]
Apr 24 14:49:11.828171 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:11.828137 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8bace2c-d211-4a76-8fd5-2e865efc8e13" path="/var/lib/kubelet/pods/b8bace2c-d211-4a76-8fd5-2e865efc8e13/volumes"
Apr 24 14:49:12.653005 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:12.652971 2569 generic.go:358] "Generic (PLEG): container finished" podID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerID="335e218979ebd835249d7394f443fc998915968da2c25d11f28fa49587625262" exitCode=0
Apr 24 14:49:12.653373 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:12.653044 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" event={"ID":"137a48e4-6b82-4dc0-b0e6-d500ea7f2705","Type":"ContainerDied","Data":"335e218979ebd835249d7394f443fc998915968da2c25d11f28fa49587625262"}
Apr 24 14:49:19.683234 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:19.683194 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" event={"ID":"137a48e4-6b82-4dc0-b0e6-d500ea7f2705","Type":"ContainerStarted","Data":"897a8e05dfb411700ebc223be3cd1664b226ece42038807e912870e6b52e68d7"}
Apr 24 14:49:19.683585 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:19.683492 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66"
Apr 24 14:49:19.684949 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:19.684923 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" podUID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 24 14:49:19.701424 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:19.701105 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66"
podStartSLOduration=4.775845229 podStartE2EDuration="11.70108803s" podCreationTimestamp="2026-04-24 14:49:08 +0000 UTC" firstStartedPulling="2026-04-24 14:49:12.654238197 +0000 UTC m=+1505.448295077" lastFinishedPulling="2026-04-24 14:49:19.579480996 +0000 UTC m=+1512.373537878" observedRunningTime="2026-04-24 14:49:19.699514364 +0000 UTC m=+1512.493571266" watchObservedRunningTime="2026-04-24 14:49:19.70108803 +0000 UTC m=+1512.495144932" Apr 24 14:49:20.686734 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:20.686696 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" podUID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 14:49:30.687493 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:30.687396 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" podUID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 14:49:40.687453 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:40.687409 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" podUID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 14:49:50.687180 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:49:50.687123 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" podUID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 14:50:00.687332 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:00.687288 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" podUID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 14:50:10.687528 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:10.687473 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" podUID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 14:50:20.687070 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:20.687013 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" podUID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 14:50:30.687415 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:30.687361 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" podUID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 14:50:35.824584 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:35.824543 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" podUID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 14:50:45.828466 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:45.828434 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" Apr 24 14:50:49.250633 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:49.250598 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66"] Apr 24 14:50:49.251148 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:49.250933 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" podUID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerName="kserve-container" containerID="cri-o://897a8e05dfb411700ebc223be3cd1664b226ece42038807e912870e6b52e68d7" gracePeriod=30 Apr 24 14:50:49.303321 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:49.303284 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c"] Apr 24 14:50:49.303723 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:49.303704 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8bace2c-d211-4a76-8fd5-2e865efc8e13" containerName="kserve-container" Apr 24 14:50:49.303821 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:49.303724 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bace2c-d211-4a76-8fd5-2e865efc8e13" containerName="kserve-container" Apr 24 14:50:49.303821 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:49.303750 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8bace2c-d211-4a76-8fd5-2e865efc8e13" containerName="storage-initializer" Apr 24 14:50:49.303821 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:49.303758 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bace2c-d211-4a76-8fd5-2e865efc8e13" containerName="storage-initializer" Apr 24 14:50:49.304022 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:49.303833 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8bace2c-d211-4a76-8fd5-2e865efc8e13" containerName="kserve-container" Apr 24 14:50:49.307113 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:49.307092 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" Apr 24 14:50:49.314964 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:49.314672 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c"] Apr 24 14:50:49.359926 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:49.355496 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b0d3341-2952-4cc8-8b2a-1b96e1294546-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-l5n4c\" (UID: \"4b0d3341-2952-4cc8-8b2a-1b96e1294546\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" Apr 24 14:50:49.456923 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:49.456854 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b0d3341-2952-4cc8-8b2a-1b96e1294546-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-l5n4c\" (UID: \"4b0d3341-2952-4cc8-8b2a-1b96e1294546\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" Apr 24 14:50:49.457257 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:49.457236 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b0d3341-2952-4cc8-8b2a-1b96e1294546-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-l5n4c\" (UID: \"4b0d3341-2952-4cc8-8b2a-1b96e1294546\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" Apr 24 14:50:49.619050 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:49.618950 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" Apr 24 14:50:49.744663 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:49.744621 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c"] Apr 24 14:50:49.747768 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:50:49.747733 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b0d3341_2952_4cc8_8b2a_1b96e1294546.slice/crio-50ee4aedac63d9fb9a384fdba1d53a9cc06061289911dcb04e79a2b37a44aa3c WatchSource:0}: Error finding container 50ee4aedac63d9fb9a384fdba1d53a9cc06061289911dcb04e79a2b37a44aa3c: Status 404 returned error can't find the container with id 50ee4aedac63d9fb9a384fdba1d53a9cc06061289911dcb04e79a2b37a44aa3c Apr 24 14:50:49.941784 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:49.941748 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" event={"ID":"4b0d3341-2952-4cc8-8b2a-1b96e1294546","Type":"ContainerStarted","Data":"ebd1eed731decc5323946240a1cf687342f343edb7086b37faba107c8cfad395"} Apr 24 14:50:49.941784 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:49.941785 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" event={"ID":"4b0d3341-2952-4cc8-8b2a-1b96e1294546","Type":"ContainerStarted","Data":"50ee4aedac63d9fb9a384fdba1d53a9cc06061289911dcb04e79a2b37a44aa3c"} Apr 24 14:50:53.091381 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.091357 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" Apr 24 14:50:53.194260 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.194224 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/137a48e4-6b82-4dc0-b0e6-d500ea7f2705-kserve-provision-location\") pod \"137a48e4-6b82-4dc0-b0e6-d500ea7f2705\" (UID: \"137a48e4-6b82-4dc0-b0e6-d500ea7f2705\") " Apr 24 14:50:53.194567 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.194544 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/137a48e4-6b82-4dc0-b0e6-d500ea7f2705-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "137a48e4-6b82-4dc0-b0e6-d500ea7f2705" (UID: "137a48e4-6b82-4dc0-b0e6-d500ea7f2705"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:50:53.295525 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.295485 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/137a48e4-6b82-4dc0-b0e6-d500ea7f2705-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:50:53.956761 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.956726 2569 generic.go:358] "Generic (PLEG): container finished" podID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerID="897a8e05dfb411700ebc223be3cd1664b226ece42038807e912870e6b52e68d7" exitCode=0 Apr 24 14:50:53.956962 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.956810 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" Apr 24 14:50:53.956962 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.956813 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" event={"ID":"137a48e4-6b82-4dc0-b0e6-d500ea7f2705","Type":"ContainerDied","Data":"897a8e05dfb411700ebc223be3cd1664b226ece42038807e912870e6b52e68d7"} Apr 24 14:50:53.956962 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.956941 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66" event={"ID":"137a48e4-6b82-4dc0-b0e6-d500ea7f2705","Type":"ContainerDied","Data":"c6e2231802d0d1045e45fe3329b583b165464ef2f2d2b2ec68730acb0956a5c1"} Apr 24 14:50:53.956962 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.956964 2569 scope.go:117] "RemoveContainer" containerID="897a8e05dfb411700ebc223be3cd1664b226ece42038807e912870e6b52e68d7" Apr 24 14:50:53.958313 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.958291 2569 generic.go:358] "Generic (PLEG): container finished" podID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerID="ebd1eed731decc5323946240a1cf687342f343edb7086b37faba107c8cfad395" exitCode=0 Apr 24 14:50:53.958407 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.958331 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" event={"ID":"4b0d3341-2952-4cc8-8b2a-1b96e1294546","Type":"ContainerDied","Data":"ebd1eed731decc5323946240a1cf687342f343edb7086b37faba107c8cfad395"} Apr 24 14:50:53.965257 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.965194 2569 scope.go:117] "RemoveContainer" containerID="335e218979ebd835249d7394f443fc998915968da2c25d11f28fa49587625262" Apr 24 14:50:53.973015 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.972815 2569 scope.go:117] "RemoveContainer" containerID="897a8e05dfb411700ebc223be3cd1664b226ece42038807e912870e6b52e68d7" Apr 24 14:50:53.973015 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.972945 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66"] Apr 24 14:50:53.973175 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:50:53.973152 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"897a8e05dfb411700ebc223be3cd1664b226ece42038807e912870e6b52e68d7\": container with ID starting with 897a8e05dfb411700ebc223be3cd1664b226ece42038807e912870e6b52e68d7 not found: ID does not exist" containerID="897a8e05dfb411700ebc223be3cd1664b226ece42038807e912870e6b52e68d7" Apr 24 14:50:53.973216 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.973189 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"897a8e05dfb411700ebc223be3cd1664b226ece42038807e912870e6b52e68d7"} err="failed to get container status \"897a8e05dfb411700ebc223be3cd1664b226ece42038807e912870e6b52e68d7\": rpc error: code = NotFound desc = could not find container \"897a8e05dfb411700ebc223be3cd1664b226ece42038807e912870e6b52e68d7\": container with ID starting with 897a8e05dfb411700ebc223be3cd1664b226ece42038807e912870e6b52e68d7 not found: ID does not exist" Apr 24 14:50:53.973261 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.973218 2569 scope.go:117] "RemoveContainer" containerID="335e218979ebd835249d7394f443fc998915968da2c25d11f28fa49587625262" Apr 24 14:50:53.973497 ip-10-0-129-231 
kubenswrapper[2569]: E0424 14:50:53.973479 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"335e218979ebd835249d7394f443fc998915968da2c25d11f28fa49587625262\": container with ID starting with 335e218979ebd835249d7394f443fc998915968da2c25d11f28fa49587625262 not found: ID does not exist" containerID="335e218979ebd835249d7394f443fc998915968da2c25d11f28fa49587625262" Apr 24 14:50:53.973552 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.973502 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"335e218979ebd835249d7394f443fc998915968da2c25d11f28fa49587625262"} err="failed to get container status \"335e218979ebd835249d7394f443fc998915968da2c25d11f28fa49587625262\": rpc error: code = NotFound desc = could not find container \"335e218979ebd835249d7394f443fc998915968da2c25d11f28fa49587625262\": container with ID starting with 335e218979ebd835249d7394f443fc998915968da2c25d11f28fa49587625262 not found: ID does not exist" Apr 24 14:50:53.978795 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:53.978769 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-lpm66"] Apr 24 14:50:54.964715 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:54.964677 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" event={"ID":"4b0d3341-2952-4cc8-8b2a-1b96e1294546","Type":"ContainerStarted","Data":"9e4510dded63ab690917537240b825176e7ed6495fe45cc9c4dd3f6aa043935e"} Apr 24 14:50:54.965194 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:54.965022 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" Apr 24 14:50:54.966529 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:54.966505 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" podUID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 14:50:54.981223 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:54.981171 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" podStartSLOduration=5.981155501 podStartE2EDuration="5.981155501s" podCreationTimestamp="2026-04-24 14:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:50:54.97909318 +0000 UTC m=+1607.773150080" watchObservedRunningTime="2026-04-24 14:50:54.981155501 +0000 UTC m=+1607.775212492" Apr 24 14:50:55.827745 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:55.827714 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" path="/var/lib/kubelet/pods/137a48e4-6b82-4dc0-b0e6-d500ea7f2705/volumes" Apr 24 14:50:55.968116 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:50:55.968074 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" podUID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 14:51:05.968493 ip-10-0-129-231 kubenswrapper[2569]: I0424 
14:51:05.968389 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" podUID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 14:51:15.968479 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:51:15.968427 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" podUID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 14:51:25.969075 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:51:25.969019 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" podUID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 14:51:35.968314 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:51:35.968267 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" podUID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 14:51:45.968111 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:51:45.968063 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" podUID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 14:51:55.968421 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:51:55.968370 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" podUID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 14:52:05.968759 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:05.968714 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" podUID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 14:52:07.825962 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:07.825923 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" podUID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 14:52:17.828956 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:17.828928 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" Apr 24 14:52:20.431699 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:20.431663 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c"] Apr 24 14:52:20.432114 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:20.431929 2569 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" podUID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerName="kserve-container" containerID="cri-o://9e4510dded63ab690917537240b825176e7ed6495fe45cc9c4dd3f6aa043935e" gracePeriod=30 Apr 24 14:52:20.505846 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:20.505807 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w"] Apr 24 14:52:20.506154 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:20.506140 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerName="kserve-container" Apr 24 14:52:20.506204 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:20.506156 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerName="kserve-container" Apr 24 14:52:20.506204 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:20.506175 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerName="storage-initializer" Apr 24 14:52:20.506204 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:20.506181 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerName="storage-initializer" Apr 24 14:52:20.506294 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:20.506229 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="137a48e4-6b82-4dc0-b0e6-d500ea7f2705" containerName="kserve-container" Apr 24 14:52:20.509351 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:20.509333 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" Apr 24 14:52:20.515531 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:20.515502 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w"] Apr 24 14:52:20.550495 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:20.550465 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87a899bb-3ffd-469d-a451-df62f955e994-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w\" (UID: \"87a899bb-3ffd-469d-a451-df62f955e994\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" Apr 24 14:52:20.651784 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:20.651736 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87a899bb-3ffd-469d-a451-df62f955e994-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w\" (UID: \"87a899bb-3ffd-469d-a451-df62f955e994\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" Apr 24 14:52:20.652156 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:20.652133 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87a899bb-3ffd-469d-a451-df62f955e994-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w\" (UID: \"87a899bb-3ffd-469d-a451-df62f955e994\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" Apr 24 14:52:20.819938 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:20.819812 2569 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" Apr 24 14:52:20.939455 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:20.939428 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w"] Apr 24 14:52:20.941804 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:52:20.941773 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87a899bb_3ffd_469d_a451_df62f955e994.slice/crio-cae63f9924b7e36d1ae96dc9b50029697ff88c8a57ac10f05e6fed975c1ce164 WatchSource:0}: Error finding container cae63f9924b7e36d1ae96dc9b50029697ff88c8a57ac10f05e6fed975c1ce164: Status 404 returned error can't find the container with id cae63f9924b7e36d1ae96dc9b50029697ff88c8a57ac10f05e6fed975c1ce164 Apr 24 14:52:21.220444 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:21.220410 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" event={"ID":"87a899bb-3ffd-469d-a451-df62f955e994","Type":"ContainerStarted","Data":"dbb8db47f94cdfe9aa2b83cd8d91785099c6a8b5e6aa40f2f701b35b0cd6d8f8"} Apr 24 14:52:21.220444 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:21.220449 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" event={"ID":"87a899bb-3ffd-469d-a451-df62f955e994","Type":"ContainerStarted","Data":"cae63f9924b7e36d1ae96dc9b50029697ff88c8a57ac10f05e6fed975c1ce164"} Apr 24 14:52:24.230359 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:24.230326 2569 generic.go:358] "Generic (PLEG): container finished" podID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerID="9e4510dded63ab690917537240b825176e7ed6495fe45cc9c4dd3f6aa043935e" exitCode=0 Apr 24 14:52:24.230735 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:24.230395 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" event={"ID":"4b0d3341-2952-4cc8-8b2a-1b96e1294546","Type":"ContainerDied","Data":"9e4510dded63ab690917537240b825176e7ed6495fe45cc9c4dd3f6aa043935e"} Apr 24 14:52:24.273309 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:24.273287 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" Apr 24 14:52:24.384469 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:24.384379 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b0d3341-2952-4cc8-8b2a-1b96e1294546-kserve-provision-location\") pod \"4b0d3341-2952-4cc8-8b2a-1b96e1294546\" (UID: \"4b0d3341-2952-4cc8-8b2a-1b96e1294546\") " Apr 24 14:52:24.384703 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:24.384678 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b0d3341-2952-4cc8-8b2a-1b96e1294546-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4b0d3341-2952-4cc8-8b2a-1b96e1294546" (UID: "4b0d3341-2952-4cc8-8b2a-1b96e1294546"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:52:24.485774 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:24.485737 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b0d3341-2952-4cc8-8b2a-1b96e1294546-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:52:25.234985 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:25.234948 2569 generic.go:358] "Generic (PLEG): container finished" podID="87a899bb-3ffd-469d-a451-df62f955e994" containerID="dbb8db47f94cdfe9aa2b83cd8d91785099c6a8b5e6aa40f2f701b35b0cd6d8f8" exitCode=0 Apr 24 14:52:25.235382 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:25.235022 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" event={"ID":"87a899bb-3ffd-469d-a451-df62f955e994","Type":"ContainerDied","Data":"dbb8db47f94cdfe9aa2b83cd8d91785099c6a8b5e6aa40f2f701b35b0cd6d8f8"} Apr 24 14:52:25.236587 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:25.236567 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" event={"ID":"4b0d3341-2952-4cc8-8b2a-1b96e1294546","Type":"ContainerDied","Data":"50ee4aedac63d9fb9a384fdba1d53a9cc06061289911dcb04e79a2b37a44aa3c"} Apr 24 14:52:25.236587 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:25.236580 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c" Apr 24 14:52:25.236703 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:25.236604 2569 scope.go:117] "RemoveContainer" containerID="9e4510dded63ab690917537240b825176e7ed6495fe45cc9c4dd3f6aa043935e" Apr 24 14:52:25.246193 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:25.246025 2569 scope.go:117] "RemoveContainer" containerID="ebd1eed731decc5323946240a1cf687342f343edb7086b37faba107c8cfad395" Apr 24 14:52:25.260942 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:25.260886 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c"] Apr 24 14:52:25.264632 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:25.264609 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-l5n4c"] Apr 24 14:52:25.827581 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:25.827547 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" path="/var/lib/kubelet/pods/4b0d3341-2952-4cc8-8b2a-1b96e1294546/volumes" Apr 24 14:52:26.241304 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:26.241271 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" event={"ID":"87a899bb-3ffd-469d-a451-df62f955e994","Type":"ContainerStarted","Data":"350ff836ef6c4766576508be4c2a85d4ef247a45031f9def64a7fe58f2cf6503"} Apr 24 14:52:26.241730 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:26.241600 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" Apr 24 14:52:26.243081 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:26.243052 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" podUID="87a899bb-3ffd-469d-a451-df62f955e994" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 14:52:26.254845 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:26.254692 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" podStartSLOduration=6.254674703 podStartE2EDuration="6.254674703s" podCreationTimestamp="2026-04-24 14:52:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:52:26.25423133 +0000 UTC m=+1699.048288231" watchObservedRunningTime="2026-04-24 14:52:26.254674703 +0000 UTC m=+1699.048731596" Apr 24 14:52:27.245272 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:27.245233 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" podUID="87a899bb-3ffd-469d-a451-df62f955e994" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 14:52:37.245404 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:37.245354 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" podUID="87a899bb-3ffd-469d-a451-df62f955e994" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 14:52:47.246009 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:47.245959 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" podUID="87a899bb-3ffd-469d-a451-df62f955e994" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 14:52:57.245260 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:52:57.245219 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" podUID="87a899bb-3ffd-469d-a451-df62f955e994" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 14:53:07.246159 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:07.246109 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" podUID="87a899bb-3ffd-469d-a451-df62f955e994" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 14:53:17.246104 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:17.246055 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" podUID="87a899bb-3ffd-469d-a451-df62f955e994" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 14:53:27.245816 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:27.245773 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" podUID="87a899bb-3ffd-469d-a451-df62f955e994" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 14:53:37.246158 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:37.246105 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" podUID="87a899bb-3ffd-469d-a451-df62f955e994" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 14:53:47.246109 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:47.246069 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" Apr 24 14:53:51.725412 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:51.725378 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w"] Apr 24 14:53:51.725872 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:51.725736 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" podUID="87a899bb-3ffd-469d-a451-df62f955e994" containerName="kserve-container" containerID="cri-o://350ff836ef6c4766576508be4c2a85d4ef247a45031f9def64a7fe58f2cf6503" gracePeriod=30 Apr 24 14:53:51.793723 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:51.793689 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx"] Apr 24 14:53:51.794067 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:51.794054 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerName="kserve-container" Apr 24 14:53:51.794116 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:51.794068 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerName="kserve-container" Apr 24 14:53:51.794116 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:51.794091 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerName="storage-initializer" Apr 24 14:53:51.794116 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:51.794097 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerName="storage-initializer" Apr 24 14:53:51.794257 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:51.794145 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b0d3341-2952-4cc8-8b2a-1b96e1294546" containerName="kserve-container" Apr 24 14:53:51.797270 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:51.797251 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" Apr 24 14:53:51.803680 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:51.803656 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx"] Apr 24 14:53:51.925185 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:51.925141 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a50ae531-3ab8-4d5f-813f-3686a0ac7d92-kserve-provision-location\") pod \"isvc-primary-5dfb41-predictor-5fffc54484-kbnfx\" (UID: \"a50ae531-3ab8-4d5f-813f-3686a0ac7d92\") " pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" Apr 24 14:53:52.025801 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:52.025710 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a50ae531-3ab8-4d5f-813f-3686a0ac7d92-kserve-provision-location\") pod \"isvc-primary-5dfb41-predictor-5fffc54484-kbnfx\" (UID: \"a50ae531-3ab8-4d5f-813f-3686a0ac7d92\") " pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" Apr 24 14:53:52.026125 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:52.026105 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a50ae531-3ab8-4d5f-813f-3686a0ac7d92-kserve-provision-location\") pod \"isvc-primary-5dfb41-predictor-5fffc54484-kbnfx\" (UID: \"a50ae531-3ab8-4d5f-813f-3686a0ac7d92\") " pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" Apr 24 14:53:52.107557 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:52.107520 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" Apr 24 14:53:52.224534 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:52.224505 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx"] Apr 24 14:53:52.227238 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:53:52.227195 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda50ae531_3ab8_4d5f_813f_3686a0ac7d92.slice/crio-02ceb7fc9703842e8a781052d565f2ca7d681d182a5585771917617d716a9d0f WatchSource:0}: Error finding container 02ceb7fc9703842e8a781052d565f2ca7d681d182a5585771917617d716a9d0f: Status 404 returned error can't find the container with id 02ceb7fc9703842e8a781052d565f2ca7d681d182a5585771917617d716a9d0f Apr 24 14:53:52.503150 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:52.503111 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" event={"ID":"a50ae531-3ab8-4d5f-813f-3686a0ac7d92","Type":"ContainerStarted","Data":"7bdd75543c7e2bca7ee0611dfe0db842d3e25e536dc72a9ae440f73ee4f663e7"} Apr 24 14:53:52.503150 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:52.503146 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" event={"ID":"a50ae531-3ab8-4d5f-813f-3686a0ac7d92","Type":"ContainerStarted","Data":"02ceb7fc9703842e8a781052d565f2ca7d681d182a5585771917617d716a9d0f"} Apr 24 14:53:55.461364 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:55.461338 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" Apr 24 14:53:55.513012 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:55.512980 2569 generic.go:358] "Generic (PLEG): container finished" podID="87a899bb-3ffd-469d-a451-df62f955e994" containerID="350ff836ef6c4766576508be4c2a85d4ef247a45031f9def64a7fe58f2cf6503" exitCode=0 Apr 24 14:53:55.513191 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:55.513045 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" Apr 24 14:53:55.513191 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:55.513064 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" event={"ID":"87a899bb-3ffd-469d-a451-df62f955e994","Type":"ContainerDied","Data":"350ff836ef6c4766576508be4c2a85d4ef247a45031f9def64a7fe58f2cf6503"} Apr 24 14:53:55.513191 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:55.513102 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w" event={"ID":"87a899bb-3ffd-469d-a451-df62f955e994","Type":"ContainerDied","Data":"cae63f9924b7e36d1ae96dc9b50029697ff88c8a57ac10f05e6fed975c1ce164"} Apr 24 14:53:55.513191 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:55.513118 2569 scope.go:117] "RemoveContainer" containerID="350ff836ef6c4766576508be4c2a85d4ef247a45031f9def64a7fe58f2cf6503" Apr 24 14:53:55.520964 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:55.520947 2569 scope.go:117] "RemoveContainer" containerID="dbb8db47f94cdfe9aa2b83cd8d91785099c6a8b5e6aa40f2f701b35b0cd6d8f8" Apr 24 14:53:55.528156 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:55.528136 2569 scope.go:117] "RemoveContainer" containerID="350ff836ef6c4766576508be4c2a85d4ef247a45031f9def64a7fe58f2cf6503" Apr 24 14:53:55.528442 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:53:55.528422 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"350ff836ef6c4766576508be4c2a85d4ef247a45031f9def64a7fe58f2cf6503\": container with ID starting with 350ff836ef6c4766576508be4c2a85d4ef247a45031f9def64a7fe58f2cf6503 not found: ID does not exist" containerID="350ff836ef6c4766576508be4c2a85d4ef247a45031f9def64a7fe58f2cf6503" Apr 24 14:53:55.528506 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:55.528452 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"350ff836ef6c4766576508be4c2a85d4ef247a45031f9def64a7fe58f2cf6503"} err="failed to get container status \"350ff836ef6c4766576508be4c2a85d4ef247a45031f9def64a7fe58f2cf6503\": rpc error: code = NotFound desc = could not find container \"350ff836ef6c4766576508be4c2a85d4ef247a45031f9def64a7fe58f2cf6503\": container with ID starting with 350ff836ef6c4766576508be4c2a85d4ef247a45031f9def64a7fe58f2cf6503 not found: ID does not exist" Apr 24 14:53:55.528506 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:55.528474 2569 scope.go:117] "RemoveContainer" containerID="dbb8db47f94cdfe9aa2b83cd8d91785099c6a8b5e6aa40f2f701b35b0cd6d8f8" Apr 24 14:53:55.528719 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:53:55.528700 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbb8db47f94cdfe9aa2b83cd8d91785099c6a8b5e6aa40f2f701b35b0cd6d8f8\": container with ID starting with dbb8db47f94cdfe9aa2b83cd8d91785099c6a8b5e6aa40f2f701b35b0cd6d8f8 not found: ID does not exist" containerID="dbb8db47f94cdfe9aa2b83cd8d91785099c6a8b5e6aa40f2f701b35b0cd6d8f8" Apr 24 14:53:55.528774 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:55.528727 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb8db47f94cdfe9aa2b83cd8d91785099c6a8b5e6aa40f2f701b35b0cd6d8f8"} err="failed to get container status \"dbb8db47f94cdfe9aa2b83cd8d91785099c6a8b5e6aa40f2f701b35b0cd6d8f8\": rpc error: 
code = NotFound desc = could not find container \"dbb8db47f94cdfe9aa2b83cd8d91785099c6a8b5e6aa40f2f701b35b0cd6d8f8\": container with ID starting with dbb8db47f94cdfe9aa2b83cd8d91785099c6a8b5e6aa40f2f701b35b0cd6d8f8 not found: ID does not exist"
Apr 24 14:53:55.556139 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:55.556064 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87a899bb-3ffd-469d-a451-df62f955e994-kserve-provision-location\") pod \"87a899bb-3ffd-469d-a451-df62f955e994\" (UID: \"87a899bb-3ffd-469d-a451-df62f955e994\") "
Apr 24 14:53:55.556395 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:55.556374 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87a899bb-3ffd-469d-a451-df62f955e994-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "87a899bb-3ffd-469d-a451-df62f955e994" (UID: "87a899bb-3ffd-469d-a451-df62f955e994"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:53:55.657035 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:55.657001 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87a899bb-3ffd-469d-a451-df62f955e994-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 14:53:55.833783 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:55.833693 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w"]
Apr 24 14:53:55.835266 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:55.835239 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-8g24w"]
Apr 24 14:53:56.518116 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:56.518084 2569 generic.go:358] "Generic (PLEG): container finished" podID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerID="7bdd75543c7e2bca7ee0611dfe0db842d3e25e536dc72a9ae440f73ee4f663e7" exitCode=0
Apr 24 14:53:56.518486 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:56.518157 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" event={"ID":"a50ae531-3ab8-4d5f-813f-3686a0ac7d92","Type":"ContainerDied","Data":"7bdd75543c7e2bca7ee0611dfe0db842d3e25e536dc72a9ae440f73ee4f663e7"}
Apr 24 14:53:57.522676 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:57.522585 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" event={"ID":"a50ae531-3ab8-4d5f-813f-3686a0ac7d92","Type":"ContainerStarted","Data":"c84238e458dc7ae6bf429258dcafead5dacfbc2ff84d081c3d50c0f8c54c3e42"}
Apr 24 14:53:57.523140 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:57.522946 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx"
Apr 24 14:53:57.524243 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:57.524216 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" podUID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 24 14:53:57.537695 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:57.537632 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" podStartSLOduration=6.537614271 podStartE2EDuration="6.537614271s" podCreationTimestamp="2026-04-24 14:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:53:57.536365045 +0000 UTC m=+1790.330421958" watchObservedRunningTime="2026-04-24 14:53:57.537614271 +0000 UTC m=+1790.331671173"
Apr 24 14:53:57.827140 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:57.827038 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87a899bb-3ffd-469d-a451-df62f955e994" path="/var/lib/kubelet/pods/87a899bb-3ffd-469d-a451-df62f955e994/volumes"
Apr 24 14:53:58.526582 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:53:58.526536 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" podUID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 24 14:54:07.782595 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:54:07.782559 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log"
Apr 24 14:54:07.788370 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:54:07.788348 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log"
Apr 24 14:54:08.527284 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:54:08.527238 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" podUID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 24 14:54:18.526758 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:54:18.526710 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" podUID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 24 14:54:28.527015 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:54:28.526963 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" podUID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 24 14:54:38.527225 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:54:38.527184 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" podUID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 24 14:54:48.527171 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:54:48.527126 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" podUID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 24 14:54:58.527242 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:54:58.527192 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" podUID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 24 14:54:59.824690 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:54:59.824643 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" podUID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 24 14:55:09.827508 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:09.827479 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx"
Apr 24 14:55:11.927629 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:11.927590 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg"]
Apr 24 14:55:11.928110 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:11.927935 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87a899bb-3ffd-469d-a451-df62f955e994" containerName="kserve-container"
Apr 24 14:55:11.928110 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:11.927950 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a899bb-3ffd-469d-a451-df62f955e994" containerName="kserve-container"
Apr 24 14:55:11.928110 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:11.927967 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87a899bb-3ffd-469d-a451-df62f955e994" containerName="storage-initializer"
Apr 24 14:55:11.928110 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:11.927973 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a899bb-3ffd-469d-a451-df62f955e994" containerName="storage-initializer"
Apr 24 14:55:11.928110 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:11.928027 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="87a899bb-3ffd-469d-a451-df62f955e994" containerName="kserve-container"
Apr 24 14:55:11.930840 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:11.930822 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg"
Apr 24 14:55:11.933021 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:11.932998 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 24 14:55:11.933172 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:11.933066 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-5dfb41\""
Apr 24 14:55:11.933619 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:11.933603 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-5dfb41-dockercfg-fwptt\""
Apr 24 14:55:11.938373 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:11.938049 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg"]
Apr 24 14:55:12.006426 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:12.006380 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fdeafebe-9545-4f8d-88f3-98e726f00ecd-cabundle-cert\") pod \"isvc-secondary-5dfb41-predictor-67587b564f-t25pg\" (UID: \"fdeafebe-9545-4f8d-88f3-98e726f00ecd\") " pod="kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg"
Apr 24 14:55:12.006609 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:12.006495 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdeafebe-9545-4f8d-88f3-98e726f00ecd-kserve-provision-location\") pod \"isvc-secondary-5dfb41-predictor-67587b564f-t25pg\" (UID: \"fdeafebe-9545-4f8d-88f3-98e726f00ecd\") " pod="kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg"
Apr 24 14:55:12.107929 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:12.107866 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fdeafebe-9545-4f8d-88f3-98e726f00ecd-cabundle-cert\") pod \"isvc-secondary-5dfb41-predictor-67587b564f-t25pg\" (UID: \"fdeafebe-9545-4f8d-88f3-98e726f00ecd\") " pod="kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg"
Apr 24 14:55:12.108130 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:12.107977 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdeafebe-9545-4f8d-88f3-98e726f00ecd-kserve-provision-location\") pod \"isvc-secondary-5dfb41-predictor-67587b564f-t25pg\" (UID: \"fdeafebe-9545-4f8d-88f3-98e726f00ecd\") " pod="kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg"
Apr 24 14:55:12.108326 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:12.108309 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdeafebe-9545-4f8d-88f3-98e726f00ecd-kserve-provision-location\") pod \"isvc-secondary-5dfb41-predictor-67587b564f-t25pg\" (UID: \"fdeafebe-9545-4f8d-88f3-98e726f00ecd\") " pod="kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg"
Apr 24 14:55:12.108498 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:12.108477 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fdeafebe-9545-4f8d-88f3-98e726f00ecd-cabundle-cert\") pod \"isvc-secondary-5dfb41-predictor-67587b564f-t25pg\" (UID: \"fdeafebe-9545-4f8d-88f3-98e726f00ecd\") " pod="kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg"
Apr 24 14:55:12.241935 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:12.241819 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg"
Apr 24 14:55:12.366617 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:12.366584 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg"]
Apr 24 14:55:12.369188 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:55:12.369158 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdeafebe_9545_4f8d_88f3_98e726f00ecd.slice/crio-3cfec146bb2d068e3a8adb84831d043ddb5f805bdc48c5ed2f8cf28327bf4880 WatchSource:0}: Error finding container 3cfec146bb2d068e3a8adb84831d043ddb5f805bdc48c5ed2f8cf28327bf4880: Status 404 returned error can't find the container with id 3cfec146bb2d068e3a8adb84831d043ddb5f805bdc48c5ed2f8cf28327bf4880
Apr 24 14:55:12.370996 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:12.370977 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 14:55:12.760330 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:12.760292 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg" event={"ID":"fdeafebe-9545-4f8d-88f3-98e726f00ecd","Type":"ContainerStarted","Data":"d2df81d5a7bbb2cf205553178e7d9e1a0228e4901e5792c75ed4dd8815058426"}
Apr 24 14:55:12.760330 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:12.760332 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg" event={"ID":"fdeafebe-9545-4f8d-88f3-98e726f00ecd","Type":"ContainerStarted","Data":"3cfec146bb2d068e3a8adb84831d043ddb5f805bdc48c5ed2f8cf28327bf4880"}
Apr 24 14:55:16.773114 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:16.773084 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-5dfb41-predictor-67587b564f-t25pg_fdeafebe-9545-4f8d-88f3-98e726f00ecd/storage-initializer/0.log"
Apr 24 14:55:16.773531 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:16.773124 2569 generic.go:358] "Generic (PLEG): container finished" podID="fdeafebe-9545-4f8d-88f3-98e726f00ecd" containerID="d2df81d5a7bbb2cf205553178e7d9e1a0228e4901e5792c75ed4dd8815058426" exitCode=1
Apr 24 14:55:16.773531 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:16.773198 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg" event={"ID":"fdeafebe-9545-4f8d-88f3-98e726f00ecd","Type":"ContainerDied","Data":"d2df81d5a7bbb2cf205553178e7d9e1a0228e4901e5792c75ed4dd8815058426"}
Apr 24 14:55:17.778569 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:17.778541 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-5dfb41-predictor-67587b564f-t25pg_fdeafebe-9545-4f8d-88f3-98e726f00ecd/storage-initializer/0.log"
Apr 24 14:55:17.778971 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:17.778655 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg" event={"ID":"fdeafebe-9545-4f8d-88f3-98e726f00ecd","Type":"ContainerStarted","Data":"7239fccdf8f8848325e2ca774511faeb194a97543bfdd865bdcaf1e6468ceccc"}
Apr 24 14:55:22.794804 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:22.794771 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-5dfb41-predictor-67587b564f-t25pg_fdeafebe-9545-4f8d-88f3-98e726f00ecd/storage-initializer/1.log"
Apr 24 14:55:22.795265 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:22.795171 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-5dfb41-predictor-67587b564f-t25pg_fdeafebe-9545-4f8d-88f3-98e726f00ecd/storage-initializer/0.log"
Apr 24 14:55:22.795265 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:22.795204 2569 generic.go:358] "Generic (PLEG): container finished" podID="fdeafebe-9545-4f8d-88f3-98e726f00ecd" containerID="7239fccdf8f8848325e2ca774511faeb194a97543bfdd865bdcaf1e6468ceccc" exitCode=1
Apr 24 14:55:22.795341 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:22.795280 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg" event={"ID":"fdeafebe-9545-4f8d-88f3-98e726f00ecd","Type":"ContainerDied","Data":"7239fccdf8f8848325e2ca774511faeb194a97543bfdd865bdcaf1e6468ceccc"}
Apr 24 14:55:22.795341 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:22.795322 2569 scope.go:117] "RemoveContainer" containerID="d2df81d5a7bbb2cf205553178e7d9e1a0228e4901e5792c75ed4dd8815058426"
Apr 24 14:55:22.795721 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:22.795705 2569 scope.go:117] "RemoveContainer" containerID="d2df81d5a7bbb2cf205553178e7d9e1a0228e4901e5792c75ed4dd8815058426"
Apr 24 14:55:22.805818 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:55:22.805786 2569 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-5dfb41-predictor-67587b564f-t25pg_kserve-ci-e2e-test_fdeafebe-9545-4f8d-88f3-98e726f00ecd_0 in pod sandbox 3cfec146bb2d068e3a8adb84831d043ddb5f805bdc48c5ed2f8cf28327bf4880 from index: no such id: 'd2df81d5a7bbb2cf205553178e7d9e1a0228e4901e5792c75ed4dd8815058426'" containerID="d2df81d5a7bbb2cf205553178e7d9e1a0228e4901e5792c75ed4dd8815058426"
Apr 24 14:55:22.805881 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:55:22.805846 2569 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-5dfb41-predictor-67587b564f-t25pg_kserve-ci-e2e-test_fdeafebe-9545-4f8d-88f3-98e726f00ecd_0 in pod sandbox 3cfec146bb2d068e3a8adb84831d043ddb5f805bdc48c5ed2f8cf28327bf4880 from index: no such id: 'd2df81d5a7bbb2cf205553178e7d9e1a0228e4901e5792c75ed4dd8815058426'; Skipping pod \"isvc-secondary-5dfb41-predictor-67587b564f-t25pg_kserve-ci-e2e-test(fdeafebe-9545-4f8d-88f3-98e726f00ecd)\"" logger="UnhandledError"
Apr 24 14:55:22.807163 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:55:22.807143 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-5dfb41-predictor-67587b564f-t25pg_kserve-ci-e2e-test(fdeafebe-9545-4f8d-88f3-98e726f00ecd)\"" pod="kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg" podUID="fdeafebe-9545-4f8d-88f3-98e726f00ecd"
Apr 24 14:55:23.799679 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:23.799648 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-5dfb41-predictor-67587b564f-t25pg_fdeafebe-9545-4f8d-88f3-98e726f00ecd/storage-initializer/1.log"
Apr 24 14:55:27.946079 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:27.946008 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx"]
Apr 24 14:55:27.946415 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:27.946261 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" podUID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerName="kserve-container" containerID="cri-o://c84238e458dc7ae6bf429258dcafead5dacfbc2ff84d081c3d50c0f8c54c3e42" gracePeriod=30
Apr 24 14:55:28.007685 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.007653 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg"]
Apr 24 14:55:28.077089 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.077061 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn"]
Apr 24 14:55:28.081645 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.081618 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn"
Apr 24 14:55:28.083557 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.083537 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-d5f013\""
Apr 24 14:55:28.083830 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.083645 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-d5f013-dockercfg-nbnhd\""
Apr 24 14:55:28.090057 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.090009 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn"]
Apr 24 14:55:28.148074 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.148049 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-5dfb41-predictor-67587b564f-t25pg_fdeafebe-9545-4f8d-88f3-98e726f00ecd/storage-initializer/1.log"
Apr 24 14:55:28.148218 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.148116 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg"
Apr 24 14:55:28.251111 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.251008 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdeafebe-9545-4f8d-88f3-98e726f00ecd-kserve-provision-location\") pod \"fdeafebe-9545-4f8d-88f3-98e726f00ecd\" (UID: \"fdeafebe-9545-4f8d-88f3-98e726f00ecd\") "
Apr 24 14:55:28.251111 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.251058 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fdeafebe-9545-4f8d-88f3-98e726f00ecd-cabundle-cert\") pod \"fdeafebe-9545-4f8d-88f3-98e726f00ecd\" (UID: \"fdeafebe-9545-4f8d-88f3-98e726f00ecd\") "
Apr 24 14:55:28.251334 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.251155 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/57b520b1-074a-4f04-8cc9-e5a0bb213730-cabundle-cert\") pod \"isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn\" (UID: \"57b520b1-074a-4f04-8cc9-e5a0bb213730\") " pod="kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn"
Apr 24 14:55:28.251334 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.251239 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57b520b1-074a-4f04-8cc9-e5a0bb213730-kserve-provision-location\") pod \"isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn\" (UID: \"57b520b1-074a-4f04-8cc9-e5a0bb213730\") " pod="kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn"
Apr 24 14:55:28.251416 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.251349 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdeafebe-9545-4f8d-88f3-98e726f00ecd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fdeafebe-9545-4f8d-88f3-98e726f00ecd" (UID: "fdeafebe-9545-4f8d-88f3-98e726f00ecd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:55:28.251454 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.251438 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdeafebe-9545-4f8d-88f3-98e726f00ecd-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "fdeafebe-9545-4f8d-88f3-98e726f00ecd" (UID: "fdeafebe-9545-4f8d-88f3-98e726f00ecd"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:55:28.351662 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.351620 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57b520b1-074a-4f04-8cc9-e5a0bb213730-kserve-provision-location\") pod \"isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn\" (UID: \"57b520b1-074a-4f04-8cc9-e5a0bb213730\") " pod="kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn"
Apr 24 14:55:28.351823 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.351672 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/57b520b1-074a-4f04-8cc9-e5a0bb213730-cabundle-cert\") pod \"isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn\" (UID: \"57b520b1-074a-4f04-8cc9-e5a0bb213730\") " pod="kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn"
Apr 24 14:55:28.351823 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.351745 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdeafebe-9545-4f8d-88f3-98e726f00ecd-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 14:55:28.351823 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.351763 2569 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fdeafebe-9545-4f8d-88f3-98e726f00ecd-cabundle-cert\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 14:55:28.352102 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.352084 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57b520b1-074a-4f04-8cc9-e5a0bb213730-kserve-provision-location\") pod \"isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn\" (UID: \"57b520b1-074a-4f04-8cc9-e5a0bb213730\") " pod="kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn"
Apr 24 14:55:28.352348 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.352326 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/57b520b1-074a-4f04-8cc9-e5a0bb213730-cabundle-cert\") pod \"isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn\" (UID: \"57b520b1-074a-4f04-8cc9-e5a0bb213730\") " pod="kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn"
Apr 24 14:55:28.396258 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.396222 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn"
Apr 24 14:55:28.522876 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.522651 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn"]
Apr 24 14:55:28.525729 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:55:28.525688 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b520b1_074a_4f04_8cc9_e5a0bb213730.slice/crio-3940bc19e6855cfed7feb1056fa4e17ca41121f3f1da7137d65826fcdb40713b WatchSource:0}: Error finding container 3940bc19e6855cfed7feb1056fa4e17ca41121f3f1da7137d65826fcdb40713b: Status 404 returned error can't find the container with id 3940bc19e6855cfed7feb1056fa4e17ca41121f3f1da7137d65826fcdb40713b
Apr 24 14:55:28.819223 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.819125 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn" event={"ID":"57b520b1-074a-4f04-8cc9-e5a0bb213730","Type":"ContainerStarted","Data":"fc238a1ce02779d6577fe7d96554cb49beebcf30b0362bbcb48816358d24674c"}
Apr 24 14:55:28.819223 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.819165 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn" event={"ID":"57b520b1-074a-4f04-8cc9-e5a0bb213730","Type":"ContainerStarted","Data":"3940bc19e6855cfed7feb1056fa4e17ca41121f3f1da7137d65826fcdb40713b"}
Apr 24 14:55:28.820289 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.820267 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-5dfb41-predictor-67587b564f-t25pg_fdeafebe-9545-4f8d-88f3-98e726f00ecd/storage-initializer/1.log"
Apr 24 14:55:28.820404 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.820381 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg"
Apr 24 14:55:28.820460 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.820399 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg" event={"ID":"fdeafebe-9545-4f8d-88f3-98e726f00ecd","Type":"ContainerDied","Data":"3cfec146bb2d068e3a8adb84831d043ddb5f805bdc48c5ed2f8cf28327bf4880"}
Apr 24 14:55:28.820498 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.820442 2569 scope.go:117] "RemoveContainer" containerID="7239fccdf8f8848325e2ca774511faeb194a97543bfdd865bdcaf1e6468ceccc"
Apr 24 14:55:28.864083 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.859689 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg"]
Apr 24 14:55:28.866539 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:28.866504 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-5dfb41-predictor-67587b564f-t25pg"]
Apr 24 14:55:29.824665 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:29.824623 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" podUID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 24 14:55:29.827214 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:29.827188 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdeafebe-9545-4f8d-88f3-98e726f00ecd" path="/var/lib/kubelet/pods/fdeafebe-9545-4f8d-88f3-98e726f00ecd/volumes"
Apr 24 14:55:32.792200 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:32.792174 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx"
Apr 24 14:55:32.833513 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:32.833417 2569 generic.go:358] "Generic (PLEG): container finished" podID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerID="c84238e458dc7ae6bf429258dcafead5dacfbc2ff84d081c3d50c0f8c54c3e42" exitCode=0
Apr 24 14:55:32.833513 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:32.833482 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx"
Apr 24 14:55:32.833718 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:32.833506 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" event={"ID":"a50ae531-3ab8-4d5f-813f-3686a0ac7d92","Type":"ContainerDied","Data":"c84238e458dc7ae6bf429258dcafead5dacfbc2ff84d081c3d50c0f8c54c3e42"}
Apr 24 14:55:32.833718 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:32.833547 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx" event={"ID":"a50ae531-3ab8-4d5f-813f-3686a0ac7d92","Type":"ContainerDied","Data":"02ceb7fc9703842e8a781052d565f2ca7d681d182a5585771917617d716a9d0f"}
Apr 24 14:55:32.833718 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:32.833570 2569 scope.go:117] "RemoveContainer" containerID="c84238e458dc7ae6bf429258dcafead5dacfbc2ff84d081c3d50c0f8c54c3e42"
Apr 24 14:55:32.834940 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:32.834917 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn_57b520b1-074a-4f04-8cc9-e5a0bb213730/storage-initializer/0.log"
Apr 24 14:55:32.835042 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:32.834952 2569 generic.go:358] "Generic (PLEG): container finished" podID="57b520b1-074a-4f04-8cc9-e5a0bb213730" containerID="fc238a1ce02779d6577fe7d96554cb49beebcf30b0362bbcb48816358d24674c" exitCode=1
Apr 24 14:55:32.835042 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:32.834983 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn" event={"ID":"57b520b1-074a-4f04-8cc9-e5a0bb213730","Type":"ContainerDied","Data":"fc238a1ce02779d6577fe7d96554cb49beebcf30b0362bbcb48816358d24674c"}
Apr 24 14:55:32.841961 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:32.841935 2569 scope.go:117] "RemoveContainer" containerID="7bdd75543c7e2bca7ee0611dfe0db842d3e25e536dc72a9ae440f73ee4f663e7"
Apr 24 14:55:32.849402 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:32.849387 2569 scope.go:117] "RemoveContainer" containerID="c84238e458dc7ae6bf429258dcafead5dacfbc2ff84d081c3d50c0f8c54c3e42"
Apr 24 14:55:32.849677 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:55:32.849659 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c84238e458dc7ae6bf429258dcafead5dacfbc2ff84d081c3d50c0f8c54c3e42\": container with ID starting with c84238e458dc7ae6bf429258dcafead5dacfbc2ff84d081c3d50c0f8c54c3e42 not found: ID does not exist" containerID="c84238e458dc7ae6bf429258dcafead5dacfbc2ff84d081c3d50c0f8c54c3e42"
Apr 24 14:55:32.849732 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:32.849688 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c84238e458dc7ae6bf429258dcafead5dacfbc2ff84d081c3d50c0f8c54c3e42"} err="failed to get container status \"c84238e458dc7ae6bf429258dcafead5dacfbc2ff84d081c3d50c0f8c54c3e42\": rpc error: code = NotFound desc = could not find container \"c84238e458dc7ae6bf429258dcafead5dacfbc2ff84d081c3d50c0f8c54c3e42\": container with ID starting with c84238e458dc7ae6bf429258dcafead5dacfbc2ff84d081c3d50c0f8c54c3e42 not found: ID does not exist"
Apr 24 14:55:32.849732 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:32.849711 2569 scope.go:117] "RemoveContainer" containerID="7bdd75543c7e2bca7ee0611dfe0db842d3e25e536dc72a9ae440f73ee4f663e7"
Apr 24 14:55:32.850004 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:55:32.849987 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bdd75543c7e2bca7ee0611dfe0db842d3e25e536dc72a9ae440f73ee4f663e7\": container with ID starting with 7bdd75543c7e2bca7ee0611dfe0db842d3e25e536dc72a9ae440f73ee4f663e7 not found: ID does not exist" containerID="7bdd75543c7e2bca7ee0611dfe0db842d3e25e536dc72a9ae440f73ee4f663e7"
Apr 24 14:55:32.850061 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:32.850006 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bdd75543c7e2bca7ee0611dfe0db842d3e25e536dc72a9ae440f73ee4f663e7"} err="failed to get container status \"7bdd75543c7e2bca7ee0611dfe0db842d3e25e536dc72a9ae440f73ee4f663e7\": rpc error: code = NotFound desc = could not find container \"7bdd75543c7e2bca7ee0611dfe0db842d3e25e536dc72a9ae440f73ee4f663e7\": container with ID starting with 7bdd75543c7e2bca7ee0611dfe0db842d3e25e536dc72a9ae440f73ee4f663e7 not found: ID does not exist"
Apr 24 14:55:32.890186 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:32.890159 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a50ae531-3ab8-4d5f-813f-3686a0ac7d92-kserve-provision-location\") pod \"a50ae531-3ab8-4d5f-813f-3686a0ac7d92\" (UID: \"a50ae531-3ab8-4d5f-813f-3686a0ac7d92\") "
Apr 24 14:55:32.890445 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:32.890424 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a50ae531-3ab8-4d5f-813f-3686a0ac7d92-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a50ae531-3ab8-4d5f-813f-3686a0ac7d92" (UID: "a50ae531-3ab8-4d5f-813f-3686a0ac7d92"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:55:32.991346 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:32.991293 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a50ae531-3ab8-4d5f-813f-3686a0ac7d92-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 14:55:33.113776 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.113681 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn"]
Apr 24 14:55:33.160493 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.160452 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx"]
Apr 24 14:55:33.162428 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.162398 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-5dfb41-predictor-5fffc54484-kbnfx"]
Apr 24 14:55:33.195505 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.195463 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv"]
Apr 24 14:55:33.196013 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.195991 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerName="kserve-container"
Apr 24 14:55:33.196072 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.196019 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerName="kserve-container"
Apr 24 14:55:33.196072 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.196029 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdeafebe-9545-4f8d-88f3-98e726f00ecd" containerName="storage-initializer"
Apr 24 14:55:33.196072 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.196039 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdeafebe-9545-4f8d-88f3-98e726f00ecd" containerName="storage-initializer"
Apr 24 14:55:33.196072 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.196051 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerName="storage-initializer"
Apr 24 14:55:33.196072 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.196061 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerName="storage-initializer"
Apr 24 14:55:33.196225 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.196072 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdeafebe-9545-4f8d-88f3-98e726f00ecd" containerName="storage-initializer"
Apr 24 14:55:33.196225 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.196081 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdeafebe-9545-4f8d-88f3-98e726f00ecd" containerName="storage-initializer"
Apr 24 14:55:33.196225 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.196148 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="fdeafebe-9545-4f8d-88f3-98e726f00ecd" containerName="storage-initializer"
Apr 24 14:55:33.196225 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.196163 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" containerName="kserve-container"
Apr 24 14:55:33.196225 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.196174 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="fdeafebe-9545-4f8d-88f3-98e726f00ecd" containerName="storage-initializer"
Apr 24 14:55:33.199604 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.199582 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv"
Apr 24 14:55:33.201697 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.201672 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5qb2v\""
Apr 24 14:55:33.206746 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.206717 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv"]
Apr 24 14:55:33.294262 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.294224 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90975bab-5efd-40eb-8b56-ba2060c78181-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-s6qfv\" (UID: \"90975bab-5efd-40eb-8b56-ba2060c78181\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv"
Apr 24 14:55:33.395392 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.395304 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90975bab-5efd-40eb-8b56-ba2060c78181-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-s6qfv\" (UID: \"90975bab-5efd-40eb-8b56-ba2060c78181\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv"
Apr 24 14:55:33.395648 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.395629 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90975bab-5efd-40eb-8b56-ba2060c78181-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-s6qfv\" (UID: \"90975bab-5efd-40eb-8b56-ba2060c78181\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv"
Apr 24 14:55:33.513704 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.513659 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv"
Apr 24 14:55:33.639565 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.639540 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv"]
Apr 24 14:55:33.641823 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:55:33.641794 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90975bab_5efd_40eb_8b56_ba2060c78181.slice/crio-f81b2e2c33c66f56e89485712c7e238b7caf7a31fb829b2233224e77faa8559c WatchSource:0}: Error finding container f81b2e2c33c66f56e89485712c7e238b7caf7a31fb829b2233224e77faa8559c: Status 404 returned error can't find the container with id f81b2e2c33c66f56e89485712c7e238b7caf7a31fb829b2233224e77faa8559c
Apr 24 14:55:33.827473 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.827437 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a50ae531-3ab8-4d5f-813f-3686a0ac7d92" path="/var/lib/kubelet/pods/a50ae531-3ab8-4d5f-813f-3686a0ac7d92/volumes"
Apr 24 14:55:33.839776 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.839740 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv" event={"ID":"90975bab-5efd-40eb-8b56-ba2060c78181","Type":"ContainerStarted","Data":"11069b714aa5857064017b65fc1dc3945a998cb7eb9d9d3eaf6330cbf54ee5df"}
Apr 24 14:55:33.839973 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.839784 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv" event={"ID":"90975bab-5efd-40eb-8b56-ba2060c78181","Type":"ContainerStarted","Data":"f81b2e2c33c66f56e89485712c7e238b7caf7a31fb829b2233224e77faa8559c"}
Apr 24 14:55:33.842112 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.842088 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn_57b520b1-074a-4f04-8cc9-e5a0bb213730/storage-initializer/0.log"
Apr 24 14:55:33.842221 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.842132 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn" event={"ID":"57b520b1-074a-4f04-8cc9-e5a0bb213730","Type":"ContainerStarted","Data":"4094e1b3c8b01be54a1da3cdfddce3fd2e2daa28fec97f27633e6840dfde2046"}
Apr 24 14:55:33.842289 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:33.842268 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn" podUID="57b520b1-074a-4f04-8cc9-e5a0bb213730" containerName="storage-initializer" containerID="cri-o://4094e1b3c8b01be54a1da3cdfddce3fd2e2daa28fec97f27633e6840dfde2046" gracePeriod=30
Apr 24 14:55:37.487192 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.487166 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn_57b520b1-074a-4f04-8cc9-e5a0bb213730/storage-initializer/1.log"
Apr 24 14:55:37.487553 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.487537 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn_57b520b1-074a-4f04-8cc9-e5a0bb213730/storage-initializer/0.log"
Apr 24 14:55:37.487608 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.487603 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn"
Apr 24 14:55:37.629659 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.629625 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57b520b1-074a-4f04-8cc9-e5a0bb213730-kserve-provision-location\") pod \"57b520b1-074a-4f04-8cc9-e5a0bb213730\" (UID: \"57b520b1-074a-4f04-8cc9-e5a0bb213730\") "
Apr 24 14:55:37.629865 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.629709 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/57b520b1-074a-4f04-8cc9-e5a0bb213730-cabundle-cert\") pod \"57b520b1-074a-4f04-8cc9-e5a0bb213730\" (UID: \"57b520b1-074a-4f04-8cc9-e5a0bb213730\") "
Apr 24 14:55:37.630026 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.629999 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57b520b1-074a-4f04-8cc9-e5a0bb213730-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "57b520b1-074a-4f04-8cc9-e5a0bb213730" (UID: "57b520b1-074a-4f04-8cc9-e5a0bb213730"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:55:37.630094 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.630055 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57b520b1-074a-4f04-8cc9-e5a0bb213730-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "57b520b1-074a-4f04-8cc9-e5a0bb213730" (UID: "57b520b1-074a-4f04-8cc9-e5a0bb213730"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:55:37.730441 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.730391 2569 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/57b520b1-074a-4f04-8cc9-e5a0bb213730-cabundle-cert\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 14:55:37.730441 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.730440 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/57b520b1-074a-4f04-8cc9-e5a0bb213730-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 14:55:37.854083 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.854055 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn_57b520b1-074a-4f04-8cc9-e5a0bb213730/storage-initializer/1.log"
Apr 24 14:55:37.854469 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.854453 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn_57b520b1-074a-4f04-8cc9-e5a0bb213730/storage-initializer/0.log"
Apr 24 14:55:37.854537 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.854487 2569 generic.go:358] "Generic (PLEG): container finished" podID="57b520b1-074a-4f04-8cc9-e5a0bb213730" containerID="4094e1b3c8b01be54a1da3cdfddce3fd2e2daa28fec97f27633e6840dfde2046" exitCode=1
Apr 24 14:55:37.854580 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.854555 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn" event={"ID":"57b520b1-074a-4f04-8cc9-e5a0bb213730","Type":"ContainerDied","Data":"4094e1b3c8b01be54a1da3cdfddce3fd2e2daa28fec97f27633e6840dfde2046"}
Apr 24 14:55:37.854580 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.854561 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn"
Apr 24 14:55:37.854667 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.854581 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn" event={"ID":"57b520b1-074a-4f04-8cc9-e5a0bb213730","Type":"ContainerDied","Data":"3940bc19e6855cfed7feb1056fa4e17ca41121f3f1da7137d65826fcdb40713b"}
Apr 24 14:55:37.854667 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.854601 2569 scope.go:117] "RemoveContainer" containerID="4094e1b3c8b01be54a1da3cdfddce3fd2e2daa28fec97f27633e6840dfde2046"
Apr 24 14:55:37.856121 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.856097 2569 generic.go:358] "Generic (PLEG): container finished" podID="90975bab-5efd-40eb-8b56-ba2060c78181" containerID="11069b714aa5857064017b65fc1dc3945a998cb7eb9d9d3eaf6330cbf54ee5df" exitCode=0
Apr 24 14:55:37.856240 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.856143 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv" event={"ID":"90975bab-5efd-40eb-8b56-ba2060c78181","Type":"ContainerDied","Data":"11069b714aa5857064017b65fc1dc3945a998cb7eb9d9d3eaf6330cbf54ee5df"}
Apr 24 14:55:37.862934 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.862917 2569 scope.go:117] "RemoveContainer" containerID="fc238a1ce02779d6577fe7d96554cb49beebcf30b0362bbcb48816358d24674c"
Apr 24 14:55:37.870138 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.870112 2569 scope.go:117] "RemoveContainer" containerID="4094e1b3c8b01be54a1da3cdfddce3fd2e2daa28fec97f27633e6840dfde2046"
Apr 24 14:55:37.870397 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:55:37.870379 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4094e1b3c8b01be54a1da3cdfddce3fd2e2daa28fec97f27633e6840dfde2046\": container with ID starting with 4094e1b3c8b01be54a1da3cdfddce3fd2e2daa28fec97f27633e6840dfde2046 not found: ID does not exist" containerID="4094e1b3c8b01be54a1da3cdfddce3fd2e2daa28fec97f27633e6840dfde2046"
Apr 24 14:55:37.870446 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.870406 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4094e1b3c8b01be54a1da3cdfddce3fd2e2daa28fec97f27633e6840dfde2046"} err="failed to get container status \"4094e1b3c8b01be54a1da3cdfddce3fd2e2daa28fec97f27633e6840dfde2046\": rpc error: code = NotFound desc = could not find container \"4094e1b3c8b01be54a1da3cdfddce3fd2e2daa28fec97f27633e6840dfde2046\": container with ID starting with 4094e1b3c8b01be54a1da3cdfddce3fd2e2daa28fec97f27633e6840dfde2046 not found: ID does not exist"
Apr 24 14:55:37.870446 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.870424 2569 scope.go:117] "RemoveContainer" containerID="fc238a1ce02779d6577fe7d96554cb49beebcf30b0362bbcb48816358d24674c"
Apr 24 14:55:37.870654 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:55:37.870634 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc238a1ce02779d6577fe7d96554cb49beebcf30b0362bbcb48816358d24674c\": container with ID starting with fc238a1ce02779d6577fe7d96554cb49beebcf30b0362bbcb48816358d24674c not found: ID does not exist" containerID="fc238a1ce02779d6577fe7d96554cb49beebcf30b0362bbcb48816358d24674c"
Apr 24 14:55:37.870702 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.870664 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc238a1ce02779d6577fe7d96554cb49beebcf30b0362bbcb48816358d24674c"} err="failed to get container status \"fc238a1ce02779d6577fe7d96554cb49beebcf30b0362bbcb48816358d24674c\": rpc error: code = NotFound desc = could not find container \"fc238a1ce02779d6577fe7d96554cb49beebcf30b0362bbcb48816358d24674c\": container with ID starting with fc238a1ce02779d6577fe7d96554cb49beebcf30b0362bbcb48816358d24674c not found: ID does not exist"
Apr 24 14:55:37.881249 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.881226 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn"]
Apr 24 14:55:37.884297 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:37.884265 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-d5f013-predictor-6b9db4f877-pgrkn"]
Apr 24 14:55:39.827883 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:55:39.827849 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57b520b1-074a-4f04-8cc9-e5a0bb213730" path="/var/lib/kubelet/pods/57b520b1-074a-4f04-8cc9-e5a0bb213730/volumes"
Apr 24 14:56:01.938935 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:56:01.938877 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv" event={"ID":"90975bab-5efd-40eb-8b56-ba2060c78181","Type":"ContainerStarted","Data":"710ca4001eb4edc3add3acabc7860658e768abf2d8e63ee794f53bcb7579e9d6"}
Apr 24 14:56:01.939388 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:56:01.939208 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv"
Apr 24 14:56:01.940437 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:56:01.940409 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv" podUID="90975bab-5efd-40eb-8b56-ba2060c78181" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 14:56:01.953231 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:56:01.953140 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv" podStartSLOduration=5.371806213 podStartE2EDuration="28.953124489s" podCreationTimestamp="2026-04-24 14:55:33 +0000 UTC" firstStartedPulling="2026-04-24 14:55:37.857341763 +0000 UTC m=+1890.651398642" lastFinishedPulling="2026-04-24 14:56:01.438660029 +0000 UTC m=+1914.232716918" observedRunningTime="2026-04-24 14:56:01.95241342 +0000 UTC m=+1914.746470356" watchObservedRunningTime="2026-04-24 14:56:01.953124489 +0000 UTC m=+1914.747181391"
Apr 24 14:56:02.942306 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:56:02.942266 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv" podUID="90975bab-5efd-40eb-8b56-ba2060c78181" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 14:56:12.943006 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:56:12.942958 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv" podUID="90975bab-5efd-40eb-8b56-ba2060c78181" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 14:56:22.942967 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:56:22.942924 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv" podUID="90975bab-5efd-40eb-8b56-ba2060c78181" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 14:56:32.942670 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:56:32.942623 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv" podUID="90975bab-5efd-40eb-8b56-ba2060c78181" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 14:56:42.942275 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:56:42.942230 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv" podUID="90975bab-5efd-40eb-8b56-ba2060c78181" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 14:56:52.942433 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:56:52.942385 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv" podUID="90975bab-5efd-40eb-8b56-ba2060c78181" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 14:57:02.943088 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:02.942997 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv" podUID="90975bab-5efd-40eb-8b56-ba2060c78181" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 14:57:09.824039 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:09.823992 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv" podUID="90975bab-5efd-40eb-8b56-ba2060c78181" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 14:57:19.827381 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:19.827353 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv"
Apr 24 14:57:23.359625 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:23.359589 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv"]
Apr 24 14:57:23.360089 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:23.359860 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv" podUID="90975bab-5efd-40eb-8b56-ba2060c78181" containerName="kserve-container" containerID="cri-o://710ca4001eb4edc3add3acabc7860658e768abf2d8e63ee794f53bcb7579e9d6" gracePeriod=30
Apr 24 14:57:23.419242 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:23.419183 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k"]
Apr 24 14:57:23.419580 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:23.419565 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57b520b1-074a-4f04-8cc9-e5a0bb213730" containerName="storage-initializer"
Apr 24 14:57:23.419625 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:23.419584 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b520b1-074a-4f04-8cc9-e5a0bb213730" containerName="storage-initializer"
Apr 24 14:57:23.419625 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:23.419594 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57b520b1-074a-4f04-8cc9-e5a0bb213730" containerName="storage-initializer"
Apr 24 14:57:23.419625 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:23.419600 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b520b1-074a-4f04-8cc9-e5a0bb213730" containerName="storage-initializer"
Apr 24 14:57:23.419724 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:23.419665 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="57b520b1-074a-4f04-8cc9-e5a0bb213730" containerName="storage-initializer"
Apr 24 14:57:23.419768 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:23.419759 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="57b520b1-074a-4f04-8cc9-e5a0bb213730" containerName="storage-initializer"
Apr 24 14:57:23.422659 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:23.422641 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k"
Apr 24 14:57:23.430375 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:23.430349 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b19fa1f5-ee00-4c5e-acb9-2340fa2c660a-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k\" (UID: \"b19fa1f5-ee00-4c5e-acb9-2340fa2c660a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k"
Apr 24 14:57:23.431399 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:23.431374 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k"]
Apr 24 14:57:23.531146 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:23.531099 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b19fa1f5-ee00-4c5e-acb9-2340fa2c660a-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k\" (UID: \"b19fa1f5-ee00-4c5e-acb9-2340fa2c660a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k"
Apr 24 14:57:23.531474 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:23.531454 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b19fa1f5-ee00-4c5e-acb9-2340fa2c660a-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k\" (UID: \"b19fa1f5-ee00-4c5e-acb9-2340fa2c660a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k"
Apr 24 14:57:23.734609 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:23.734570 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k"
Apr 24 14:57:23.859616 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:23.859590 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k"]
Apr 24 14:57:23.862332 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:57:23.862302 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb19fa1f5_ee00_4c5e_acb9_2340fa2c660a.slice/crio-6eacd9a5bd41a9dc1990c127722be5c1d7c6dee478e17361b83ce39ec2bfb3e9 WatchSource:0}: Error finding container 6eacd9a5bd41a9dc1990c127722be5c1d7c6dee478e17361b83ce39ec2bfb3e9: Status 404 returned error can't find the container with id 6eacd9a5bd41a9dc1990c127722be5c1d7c6dee478e17361b83ce39ec2bfb3e9
Apr 24 14:57:24.199773 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:24.199736 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" event={"ID":"b19fa1f5-ee00-4c5e-acb9-2340fa2c660a","Type":"ContainerStarted","Data":"5cca54911d4d895cb5e009fd4b1cb4957d96721e2aa59cbf2357940534e563c1"}
Apr 24 14:57:24.199773 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:24.199777 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" event={"ID":"b19fa1f5-ee00-4c5e-acb9-2340fa2c660a","Type":"ContainerStarted","Data":"6eacd9a5bd41a9dc1990c127722be5c1d7c6dee478e17361b83ce39ec2bfb3e9"}
Apr 24 14:57:28.213887 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:28.213854 2569 generic.go:358] "Generic (PLEG): container finished" podID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" containerID="5cca54911d4d895cb5e009fd4b1cb4957d96721e2aa59cbf2357940534e563c1" exitCode=0
Apr 24 14:57:28.214265 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:28.213933 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" event={"ID":"b19fa1f5-ee00-4c5e-acb9-2340fa2c660a","Type":"ContainerDied","Data":"5cca54911d4d895cb5e009fd4b1cb4957d96721e2aa59cbf2357940534e563c1"}
Apr 24 14:57:28.598393 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:28.598369 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv"
Apr 24 14:57:28.667872 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:28.667831 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90975bab-5efd-40eb-8b56-ba2060c78181-kserve-provision-location\") pod \"90975bab-5efd-40eb-8b56-ba2060c78181\" (UID: \"90975bab-5efd-40eb-8b56-ba2060c78181\") "
Apr 24 14:57:28.668238 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:28.668214 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90975bab-5efd-40eb-8b56-ba2060c78181-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "90975bab-5efd-40eb-8b56-ba2060c78181" (UID: "90975bab-5efd-40eb-8b56-ba2060c78181"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:57:28.769071 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:28.768979 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90975bab-5efd-40eb-8b56-ba2060c78181-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 14:57:29.222100 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:29.222061 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" event={"ID":"b19fa1f5-ee00-4c5e-acb9-2340fa2c660a","Type":"ContainerStarted","Data":"698157e600d502c1dd359edc02ddb2e5c14365f64587b2ec2dd8869aa08f520d"}
Apr 24 14:57:29.222558 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:29.222375 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k"
Apr 24 14:57:29.223476 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:29.223452 2569 generic.go:358] "Generic (PLEG): container finished" podID="90975bab-5efd-40eb-8b56-ba2060c78181" containerID="710ca4001eb4edc3add3acabc7860658e768abf2d8e63ee794f53bcb7579e9d6" exitCode=0
Apr 24 14:57:29.223617 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:29.223524 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv"
Apr 24 14:57:29.223617 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:29.223532 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv" event={"ID":"90975bab-5efd-40eb-8b56-ba2060c78181","Type":"ContainerDied","Data":"710ca4001eb4edc3add3acabc7860658e768abf2d8e63ee794f53bcb7579e9d6"}
Apr 24 14:57:29.223617 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:29.223562 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv" event={"ID":"90975bab-5efd-40eb-8b56-ba2060c78181","Type":"ContainerDied","Data":"f81b2e2c33c66f56e89485712c7e238b7caf7a31fb829b2233224e77faa8559c"}
Apr 24 14:57:29.223617 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:29.223579 2569 scope.go:117] "RemoveContainer" containerID="710ca4001eb4edc3add3acabc7860658e768abf2d8e63ee794f53bcb7579e9d6"
Apr 24 14:57:29.223877 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:29.223857 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" podUID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 24 14:57:29.232000 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:29.231972 2569 scope.go:117] "RemoveContainer" containerID="11069b714aa5857064017b65fc1dc3945a998cb7eb9d9d3eaf6330cbf54ee5df"
Apr 24 14:57:29.236778 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:29.236733 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" podStartSLOduration=6.236719177 podStartE2EDuration="6.236719177s" podCreationTimestamp="2026-04-24 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:57:29.235616986 +0000 UTC m=+2002.029673889"
watchObservedRunningTime="2026-04-24 14:57:29.236719177 +0000 UTC m=+2002.030776079" Apr 24 14:57:29.240395 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:29.240361 2569 scope.go:117] "RemoveContainer" containerID="710ca4001eb4edc3add3acabc7860658e768abf2d8e63ee794f53bcb7579e9d6" Apr 24 14:57:29.240725 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:57:29.240703 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"710ca4001eb4edc3add3acabc7860658e768abf2d8e63ee794f53bcb7579e9d6\": container with ID starting with 710ca4001eb4edc3add3acabc7860658e768abf2d8e63ee794f53bcb7579e9d6 not found: ID does not exist" containerID="710ca4001eb4edc3add3acabc7860658e768abf2d8e63ee794f53bcb7579e9d6" Apr 24 14:57:29.240781 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:29.240735 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"710ca4001eb4edc3add3acabc7860658e768abf2d8e63ee794f53bcb7579e9d6"} err="failed to get container status \"710ca4001eb4edc3add3acabc7860658e768abf2d8e63ee794f53bcb7579e9d6\": rpc error: code = NotFound desc = could not find container \"710ca4001eb4edc3add3acabc7860658e768abf2d8e63ee794f53bcb7579e9d6\": container with ID starting with 710ca4001eb4edc3add3acabc7860658e768abf2d8e63ee794f53bcb7579e9d6 not found: ID does not exist" Apr 24 14:57:29.240781 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:29.240753 2569 scope.go:117] "RemoveContainer" containerID="11069b714aa5857064017b65fc1dc3945a998cb7eb9d9d3eaf6330cbf54ee5df" Apr 24 14:57:29.241024 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:57:29.241009 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11069b714aa5857064017b65fc1dc3945a998cb7eb9d9d3eaf6330cbf54ee5df\": container with ID starting with 11069b714aa5857064017b65fc1dc3945a998cb7eb9d9d3eaf6330cbf54ee5df not found: ID does not exist" containerID="11069b714aa5857064017b65fc1dc3945a998cb7eb9d9d3eaf6330cbf54ee5df" Apr 24 14:57:29.241062 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:29.241027 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11069b714aa5857064017b65fc1dc3945a998cb7eb9d9d3eaf6330cbf54ee5df"} err="failed to get container status \"11069b714aa5857064017b65fc1dc3945a998cb7eb9d9d3eaf6330cbf54ee5df\": rpc error: code = NotFound desc = could not find container \"11069b714aa5857064017b65fc1dc3945a998cb7eb9d9d3eaf6330cbf54ee5df\": container with ID starting with 11069b714aa5857064017b65fc1dc3945a998cb7eb9d9d3eaf6330cbf54ee5df not found: ID does not exist" Apr 24 14:57:29.247773 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:29.247748 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv"] Apr 24 14:57:29.251877 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:29.251854 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-s6qfv"] Apr 24 14:57:29.827725 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:29.827694 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90975bab-5efd-40eb-8b56-ba2060c78181" path="/var/lib/kubelet/pods/90975bab-5efd-40eb-8b56-ba2060c78181/volumes" Apr 24 14:57:30.227458 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:30.227423 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" podUID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 14:57:40.227499 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:40.227453 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" podUID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 14:57:50.228389 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:57:50.228335 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" podUID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 14:58:00.228250 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:00.228201 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" podUID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 14:58:10.228392 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:10.228347 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" podUID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 14:58:20.228486 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:20.228437 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" podUID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 14:58:30.228282 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:30.228187 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" podUID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 14:58:40.228315 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:40.228268 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" podUID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 14:58:50.229151 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:50.229119 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" Apr 24 14:58:53.554466 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:53.554432 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k"] Apr 24 14:58:53.554945 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:53.554783 2569 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" podUID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" containerName="kserve-container" containerID="cri-o://698157e600d502c1dd359edc02ddb2e5c14365f64587b2ec2dd8869aa08f520d" gracePeriod=30 Apr 24 14:58:53.612401 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:53.612361 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp"] Apr 24 14:58:53.612720 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:53.612694 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90975bab-5efd-40eb-8b56-ba2060c78181" containerName="storage-initializer" Apr 24 14:58:53.612720 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:53.612717 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="90975bab-5efd-40eb-8b56-ba2060c78181" containerName="storage-initializer" Apr 24 14:58:53.612883 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:53.612732 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90975bab-5efd-40eb-8b56-ba2060c78181" containerName="kserve-container" Apr 24 14:58:53.612883 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:53.612738 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="90975bab-5efd-40eb-8b56-ba2060c78181" containerName="kserve-container" Apr 24 14:58:53.612883 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:53.612807 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="90975bab-5efd-40eb-8b56-ba2060c78181" containerName="kserve-container" Apr 24 14:58:53.615865 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:53.615848 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" Apr 24 14:58:53.625598 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:53.625574 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp"] Apr 24 14:58:53.807887 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:53.807791 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5ae9766-de66-4547-8f60-db38d71d6bf9-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-46pvp\" (UID: \"b5ae9766-de66-4547-8f60-db38d71d6bf9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" Apr 24 14:58:53.908763 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:53.908721 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5ae9766-de66-4547-8f60-db38d71d6bf9-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-46pvp\" (UID: \"b5ae9766-de66-4547-8f60-db38d71d6bf9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" Apr 24 14:58:53.909130 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:53.909111 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5ae9766-de66-4547-8f60-db38d71d6bf9-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-46pvp\" (UID: \"b5ae9766-de66-4547-8f60-db38d71d6bf9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" Apr 24 14:58:53.927035 ip-10-0-129-231 
kubenswrapper[2569]: I0424 14:58:53.927002 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" Apr 24 14:58:54.050121 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:54.049923 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp"] Apr 24 14:58:54.052793 ip-10-0-129-231 kubenswrapper[2569]: W0424 14:58:54.052751 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5ae9766_de66_4547_8f60_db38d71d6bf9.slice/crio-619804be6effe8177ea69e74512f46eccb25c8b0609a73bda0b4f6814872925c WatchSource:0}: Error finding container 619804be6effe8177ea69e74512f46eccb25c8b0609a73bda0b4f6814872925c: Status 404 returned error can't find the container with id 619804be6effe8177ea69e74512f46eccb25c8b0609a73bda0b4f6814872925c Apr 24 14:58:54.489031 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:54.488996 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" event={"ID":"b5ae9766-de66-4547-8f60-db38d71d6bf9","Type":"ContainerStarted","Data":"9a8ac0b6e95a7c42841ac090b440c697f59b75c6bdc7bedf3cd00be7b58e7096"} Apr 24 14:58:54.489031 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:54.489036 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" event={"ID":"b5ae9766-de66-4547-8f60-db38d71d6bf9","Type":"ContainerStarted","Data":"619804be6effe8177ea69e74512f46eccb25c8b0609a73bda0b4f6814872925c"} Apr 24 14:58:58.503906 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:58.503866 2569 generic.go:358] "Generic (PLEG): container finished" podID="b5ae9766-de66-4547-8f60-db38d71d6bf9" containerID="9a8ac0b6e95a7c42841ac090b440c697f59b75c6bdc7bedf3cd00be7b58e7096" exitCode=0 Apr 24 14:58:58.504234 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:58.503940 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" event={"ID":"b5ae9766-de66-4547-8f60-db38d71d6bf9","Type":"ContainerDied","Data":"9a8ac0b6e95a7c42841ac090b440c697f59b75c6bdc7bedf3cd00be7b58e7096"} Apr 24 14:58:58.690866 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:58.690844 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" Apr 24 14:58:58.749709 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:58.749625 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b19fa1f5-ee00-4c5e-acb9-2340fa2c660a-kserve-provision-location\") pod \"b19fa1f5-ee00-4c5e-acb9-2340fa2c660a\" (UID: \"b19fa1f5-ee00-4c5e-acb9-2340fa2c660a\") " Apr 24 14:58:58.749976 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:58.749952 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b19fa1f5-ee00-4c5e-acb9-2340fa2c660a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" (UID: "b19fa1f5-ee00-4c5e-acb9-2340fa2c660a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:58:58.850202 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:58.850164 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b19fa1f5-ee00-4c5e-acb9-2340fa2c660a-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 14:58:59.508634 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:59.508597 2569 generic.go:358] "Generic (PLEG): container finished" podID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" containerID="698157e600d502c1dd359edc02ddb2e5c14365f64587b2ec2dd8869aa08f520d" exitCode=0 Apr 24 14:58:59.509096 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:59.508633 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" event={"ID":"b19fa1f5-ee00-4c5e-acb9-2340fa2c660a","Type":"ContainerDied","Data":"698157e600d502c1dd359edc02ddb2e5c14365f64587b2ec2dd8869aa08f520d"} Apr 24 14:58:59.509096 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:59.508664 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" Apr 24 14:58:59.509096 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:59.508673 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k" event={"ID":"b19fa1f5-ee00-4c5e-acb9-2340fa2c660a","Type":"ContainerDied","Data":"6eacd9a5bd41a9dc1990c127722be5c1d7c6dee478e17361b83ce39ec2bfb3e9"} Apr 24 14:58:59.509096 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:59.508694 2569 scope.go:117] "RemoveContainer" containerID="698157e600d502c1dd359edc02ddb2e5c14365f64587b2ec2dd8869aa08f520d" Apr 24 14:58:59.510559 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:59.510537 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" event={"ID":"b5ae9766-de66-4547-8f60-db38d71d6bf9","Type":"ContainerStarted","Data":"a22229793bbd22eb1e05f323a50269c8e231d3fbca569cdace12a9a9a8d7e592"} Apr 24 14:58:59.510857 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:59.510840 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" Apr 24 14:58:59.512149 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:59.512124 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" podUID="b5ae9766-de66-4547-8f60-db38d71d6bf9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 14:58:59.517506 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:59.517489 2569 scope.go:117] "RemoveContainer" containerID="5cca54911d4d895cb5e009fd4b1cb4957d96721e2aa59cbf2357940534e563c1" Apr 24 14:58:59.525544 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:59.525446 2569 scope.go:117] "RemoveContainer" containerID="698157e600d502c1dd359edc02ddb2e5c14365f64587b2ec2dd8869aa08f520d" Apr 24 14:58:59.525783 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:58:59.525761 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"698157e600d502c1dd359edc02ddb2e5c14365f64587b2ec2dd8869aa08f520d\": container with ID starting with 
698157e600d502c1dd359edc02ddb2e5c14365f64587b2ec2dd8869aa08f520d not found: ID does not exist" containerID="698157e600d502c1dd359edc02ddb2e5c14365f64587b2ec2dd8869aa08f520d" Apr 24 14:58:59.525907 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:59.525808 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"698157e600d502c1dd359edc02ddb2e5c14365f64587b2ec2dd8869aa08f520d"} err="failed to get container status \"698157e600d502c1dd359edc02ddb2e5c14365f64587b2ec2dd8869aa08f520d\": rpc error: code = NotFound desc = could not find container \"698157e600d502c1dd359edc02ddb2e5c14365f64587b2ec2dd8869aa08f520d\": container with ID starting with 698157e600d502c1dd359edc02ddb2e5c14365f64587b2ec2dd8869aa08f520d not found: ID does not exist" Apr 24 14:58:59.525907 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:59.525836 2569 scope.go:117] "RemoveContainer" containerID="5cca54911d4d895cb5e009fd4b1cb4957d96721e2aa59cbf2357940534e563c1" Apr 24 14:58:59.526251 ip-10-0-129-231 kubenswrapper[2569]: E0424 14:58:59.526233 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cca54911d4d895cb5e009fd4b1cb4957d96721e2aa59cbf2357940534e563c1\": container with ID starting with 5cca54911d4d895cb5e009fd4b1cb4957d96721e2aa59cbf2357940534e563c1 not found: ID does not exist" containerID="5cca54911d4d895cb5e009fd4b1cb4957d96721e2aa59cbf2357940534e563c1" Apr 24 14:58:59.526333 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:59.526261 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cca54911d4d895cb5e009fd4b1cb4957d96721e2aa59cbf2357940534e563c1"} err="failed to get container status \"5cca54911d4d895cb5e009fd4b1cb4957d96721e2aa59cbf2357940534e563c1\": rpc error: code = NotFound desc = could not find container \"5cca54911d4d895cb5e009fd4b1cb4957d96721e2aa59cbf2357940534e563c1\": container with ID starting with 5cca54911d4d895cb5e009fd4b1cb4957d96721e2aa59cbf2357940534e563c1 not found: ID does not exist" Apr 24 14:58:59.527151 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:59.527108 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" podStartSLOduration=6.527098576 podStartE2EDuration="6.527098576s" podCreationTimestamp="2026-04-24 14:58:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:58:59.525152644 +0000 UTC m=+2092.319209555" watchObservedRunningTime="2026-04-24 14:58:59.527098576 +0000 UTC m=+2092.321155478" Apr 24 14:58:59.536626 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:59.536599 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k"] Apr 24 14:58:59.539937 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:59.539913 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-c2k6k"] Apr 24 14:58:59.832033 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:58:59.831953 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" path="/var/lib/kubelet/pods/b19fa1f5-ee00-4c5e-acb9-2340fa2c660a/volumes" Apr 24 14:59:00.514598 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:59:00.514508 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" podUID="b5ae9766-de66-4547-8f60-db38d71d6bf9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 14:59:07.810541 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:59:07.810505 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 14:59:07.816239 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:59:07.816219 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 14:59:10.515178 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:59:10.515125 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" podUID="b5ae9766-de66-4547-8f60-db38d71d6bf9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 14:59:20.514806 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:59:20.514754 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" podUID="b5ae9766-de66-4547-8f60-db38d71d6bf9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 14:59:30.515436 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:59:30.515387 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" podUID="b5ae9766-de66-4547-8f60-db38d71d6bf9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 14:59:40.514755 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:59:40.514710 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" podUID="b5ae9766-de66-4547-8f60-db38d71d6bf9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 14:59:50.514712 ip-10-0-129-231 kubenswrapper[2569]: I0424 14:59:50.514667 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" podUID="b5ae9766-de66-4547-8f60-db38d71d6bf9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 15:00:00.515328 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:00.515228 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" podUID="b5ae9766-de66-4547-8f60-db38d71d6bf9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 15:00:10.514499 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:10.514451 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" podUID="b5ae9766-de66-4547-8f60-db38d71d6bf9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 15:00:20.516088 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:20.516049 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" Apr 24 15:00:23.748737 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:23.748697 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp"] Apr 24 15:00:23.749138 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:23.748985 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" podUID="b5ae9766-de66-4547-8f60-db38d71d6bf9" containerName="kserve-container" containerID="cri-o://a22229793bbd22eb1e05f323a50269c8e231d3fbca569cdace12a9a9a8d7e592" gracePeriod=30 Apr 24 15:00:23.813685 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:23.813617 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s"] Apr 24 15:00:23.814073 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:23.814054 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" containerName="storage-initializer" Apr 24 15:00:23.814175 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:23.814074 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" containerName="storage-initializer" Apr 24 15:00:23.814175 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:23.814115 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" containerName="kserve-container" Apr 24 15:00:23.814175 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:23.814125 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" containerName="kserve-container" Apr 24 15:00:23.814339 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:23.814193 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b19fa1f5-ee00-4c5e-acb9-2340fa2c660a" containerName="kserve-container" Apr 24 15:00:23.816245 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:23.816226 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" Apr 24 15:00:23.828681 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:23.828651 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s"] Apr 24 15:00:23.848227 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:23.848197 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08fa7d38-1a69-4562-865d-d6f26bd24fcd-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s\" (UID: \"08fa7d38-1a69-4562-865d-d6f26bd24fcd\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" Apr 24 15:00:23.948933 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:23.948878 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08fa7d38-1a69-4562-865d-d6f26bd24fcd-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s\" (UID: \"08fa7d38-1a69-4562-865d-d6f26bd24fcd\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" Apr 24 15:00:23.949281 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:23.949260 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08fa7d38-1a69-4562-865d-d6f26bd24fcd-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s\" (UID: \"08fa7d38-1a69-4562-865d-d6f26bd24fcd\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" Apr 24 15:00:24.130530 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:24.130432 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" Apr 24 15:00:24.249615 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:24.249490 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s"] Apr 24 15:00:24.252477 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:00:24.252444 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08fa7d38_1a69_4562_865d_d6f26bd24fcd.slice/crio-50e1c6bb18e07ef78737de1c673cedd452d193c6ea81453ec27029eeaa593c20 WatchSource:0}: Error finding container 50e1c6bb18e07ef78737de1c673cedd452d193c6ea81453ec27029eeaa593c20: Status 404 returned error can't find the container with id 50e1c6bb18e07ef78737de1c673cedd452d193c6ea81453ec27029eeaa593c20 Apr 24 15:00:24.254301 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:24.254281 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 15:00:24.774971 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:24.774934 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" event={"ID":"08fa7d38-1a69-4562-865d-d6f26bd24fcd","Type":"ContainerStarted","Data":"59126a13427a4c3b83d7329799b7dfb36ebba3e5d20d77bcc74490eb24eb806f"} Apr 24 15:00:24.774971 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:24.774973 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" event={"ID":"08fa7d38-1a69-4562-865d-d6f26bd24fcd","Type":"ContainerStarted","Data":"50e1c6bb18e07ef78737de1c673cedd452d193c6ea81453ec27029eeaa593c20"} Apr 24 15:00:28.789752 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:28.789717 2569 generic.go:358] "Generic (PLEG): container finished" podID="08fa7d38-1a69-4562-865d-d6f26bd24fcd" containerID="59126a13427a4c3b83d7329799b7dfb36ebba3e5d20d77bcc74490eb24eb806f" exitCode=0 Apr 24 15:00:28.790192 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:28.789790 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" event={"ID":"08fa7d38-1a69-4562-865d-d6f26bd24fcd","Type":"ContainerDied","Data":"59126a13427a4c3b83d7329799b7dfb36ebba3e5d20d77bcc74490eb24eb806f"} Apr 24 15:00:29.486000 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.485969 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" Apr 24 15:00:29.595431 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.595394 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5ae9766-de66-4547-8f60-db38d71d6bf9-kserve-provision-location\") pod \"b5ae9766-de66-4547-8f60-db38d71d6bf9\" (UID: \"b5ae9766-de66-4547-8f60-db38d71d6bf9\") " Apr 24 15:00:29.595714 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.595689 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5ae9766-de66-4547-8f60-db38d71d6bf9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b5ae9766-de66-4547-8f60-db38d71d6bf9" (UID: "b5ae9766-de66-4547-8f60-db38d71d6bf9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:00:29.696494 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.696459 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5ae9766-de66-4547-8f60-db38d71d6bf9-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:00:29.794164 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.794121 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" event={"ID":"08fa7d38-1a69-4562-865d-d6f26bd24fcd","Type":"ContainerStarted","Data":"45b88052fc69198b07c50769a246b116fb64b69c557fe0c1ce5a5a5627f09cf3"} Apr 24 15:00:29.794613 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.794397 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" Apr 24 15:00:29.795486 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.795461 2569 generic.go:358] "Generic (PLEG): container finished" podID="b5ae9766-de66-4547-8f60-db38d71d6bf9" containerID="a22229793bbd22eb1e05f323a50269c8e231d3fbca569cdace12a9a9a8d7e592" exitCode=0 Apr 24 15:00:29.795600 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.795519 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" Apr 24 15:00:29.795600 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.795523 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" event={"ID":"b5ae9766-de66-4547-8f60-db38d71d6bf9","Type":"ContainerDied","Data":"a22229793bbd22eb1e05f323a50269c8e231d3fbca569cdace12a9a9a8d7e592"} Apr 24 15:00:29.795600 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.795549 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp" event={"ID":"b5ae9766-de66-4547-8f60-db38d71d6bf9","Type":"ContainerDied","Data":"619804be6effe8177ea69e74512f46eccb25c8b0609a73bda0b4f6814872925c"} Apr 24 15:00:29.795600 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.795568 2569 scope.go:117] "RemoveContainer" containerID="a22229793bbd22eb1e05f323a50269c8e231d3fbca569cdace12a9a9a8d7e592" Apr 24 15:00:29.803299 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.803280 2569 scope.go:117] "RemoveContainer" containerID="9a8ac0b6e95a7c42841ac090b440c697f59b75c6bdc7bedf3cd00be7b58e7096" Apr 24 15:00:29.808948 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.808879 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" podStartSLOduration=6.808862468 podStartE2EDuration="6.808862468s" podCreationTimestamp="2026-04-24 15:00:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:00:29.808619673 +0000 UTC m=+2182.602676597" watchObservedRunningTime="2026-04-24 15:00:29.808862468 +0000 UTC m=+2182.602919370" Apr 24 15:00:29.810812 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.810795 2569 scope.go:117] "RemoveContainer" containerID="a22229793bbd22eb1e05f323a50269c8e231d3fbca569cdace12a9a9a8d7e592" Apr 24 15:00:29.811121 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:00:29.811095 2569 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a22229793bbd22eb1e05f323a50269c8e231d3fbca569cdace12a9a9a8d7e592\": container with ID starting with a22229793bbd22eb1e05f323a50269c8e231d3fbca569cdace12a9a9a8d7e592 not found: ID does not exist" containerID="a22229793bbd22eb1e05f323a50269c8e231d3fbca569cdace12a9a9a8d7e592" Apr 24 15:00:29.811203 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.811127 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a22229793bbd22eb1e05f323a50269c8e231d3fbca569cdace12a9a9a8d7e592"} err="failed to get container status \"a22229793bbd22eb1e05f323a50269c8e231d3fbca569cdace12a9a9a8d7e592\": rpc error: code = NotFound desc = could not find container \"a22229793bbd22eb1e05f323a50269c8e231d3fbca569cdace12a9a9a8d7e592\": container with ID starting with a22229793bbd22eb1e05f323a50269c8e231d3fbca569cdace12a9a9a8d7e592 not found: ID does not exist" Apr 24 15:00:29.811203 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.811154 2569 scope.go:117] "RemoveContainer" containerID="9a8ac0b6e95a7c42841ac090b440c697f59b75c6bdc7bedf3cd00be7b58e7096" Apr 24 15:00:29.811436 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:00:29.811415 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a8ac0b6e95a7c42841ac090b440c697f59b75c6bdc7bedf3cd00be7b58e7096\": container with ID starting with 9a8ac0b6e95a7c42841ac090b440c697f59b75c6bdc7bedf3cd00be7b58e7096 not found: ID does not exist" containerID="9a8ac0b6e95a7c42841ac090b440c697f59b75c6bdc7bedf3cd00be7b58e7096" Apr 24 15:00:29.811499 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.811442 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a8ac0b6e95a7c42841ac090b440c697f59b75c6bdc7bedf3cd00be7b58e7096"} err="failed to get container status \"9a8ac0b6e95a7c42841ac090b440c697f59b75c6bdc7bedf3cd00be7b58e7096\": rpc error: code = NotFound desc = could not find container \"9a8ac0b6e95a7c42841ac090b440c697f59b75c6bdc7bedf3cd00be7b58e7096\": container with ID starting with 9a8ac0b6e95a7c42841ac090b440c697f59b75c6bdc7bedf3cd00be7b58e7096 not found: ID does not exist" Apr 24 15:00:29.822198 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.822175 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp"] Apr 24 15:00:29.828308 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:29.828289 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-46pvp"] Apr 24 15:00:31.826909 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:00:31.826866 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5ae9766-de66-4547-8f60-db38d71d6bf9" path="/var/lib/kubelet/pods/b5ae9766-de66-4547-8f60-db38d71d6bf9/volumes" Apr 24 15:01:00.801247 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:00.801194 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" podUID="08fa7d38-1a69-4562-865d-d6f26bd24fcd" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.44:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 15:01:10.800424 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:10.800378 2569 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" podUID="08fa7d38-1a69-4562-865d-d6f26bd24fcd" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.44:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 15:01:20.800540 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:20.800494 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" podUID="08fa7d38-1a69-4562-865d-d6f26bd24fcd" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.44:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 15:01:30.800712 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:30.800604 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" podUID="08fa7d38-1a69-4562-865d-d6f26bd24fcd" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.44:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 15:01:40.800416 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:40.800373 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" podUID="08fa7d38-1a69-4562-865d-d6f26bd24fcd" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.44:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 15:01:50.803496 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:50.803464 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" Apr 24 15:01:53.954309 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:53.954252 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s"] Apr 24 15:01:53.954775 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:53.954582 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" podUID="08fa7d38-1a69-4562-865d-d6f26bd24fcd" containerName="kserve-container" containerID="cri-o://45b88052fc69198b07c50769a246b116fb64b69c557fe0c1ce5a5a5627f09cf3" gracePeriod=30 Apr 24 15:01:54.042011 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:54.041975 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4"] Apr 24 15:01:54.042360 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:54.042346 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5ae9766-de66-4547-8f60-db38d71d6bf9" containerName="kserve-container" Apr 24 15:01:54.042460 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:54.042362 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ae9766-de66-4547-8f60-db38d71d6bf9" containerName="kserve-container" Apr 24 15:01:54.042460 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:54.042379 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5ae9766-de66-4547-8f60-db38d71d6bf9" containerName="storage-initializer" Apr 24 15:01:54.042460 ip-10-0-129-231 
kubenswrapper[2569]: I0424 15:01:54.042387 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ae9766-de66-4547-8f60-db38d71d6bf9" containerName="storage-initializer" Apr 24 15:01:54.042460 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:54.042440 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5ae9766-de66-4547-8f60-db38d71d6bf9" containerName="kserve-container" Apr 24 15:01:54.045344 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:54.045323 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" Apr 24 15:01:54.054159 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:54.054132 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4"] Apr 24 15:01:54.118593 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:54.118555 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b5b3720-83c9-429d-b8fd-528cd923bcde-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4\" (UID: \"9b5b3720-83c9-429d-b8fd-528cd923bcde\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" Apr 24 15:01:54.219529 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:54.219433 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b5b3720-83c9-429d-b8fd-528cd923bcde-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4\" (UID: \"9b5b3720-83c9-429d-b8fd-528cd923bcde\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" Apr 24 15:01:54.219826 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:54.219806 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b5b3720-83c9-429d-b8fd-528cd923bcde-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4\" (UID: \"9b5b3720-83c9-429d-b8fd-528cd923bcde\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" Apr 24 15:01:54.357388 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:54.357355 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" Apr 24 15:01:54.482206 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:54.482004 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4"] Apr 24 15:01:54.484925 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:01:54.484870 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b5b3720_83c9_429d_b8fd_528cd923bcde.slice/crio-ceca074fecf1894af685cde550f7bf4dba81f777a1138fd0b3b90b55b84100ab WatchSource:0}: Error finding container ceca074fecf1894af685cde550f7bf4dba81f777a1138fd0b3b90b55b84100ab: Status 404 returned error can't find the container with id ceca074fecf1894af685cde550f7bf4dba81f777a1138fd0b3b90b55b84100ab Apr 24 15:01:55.049716 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:55.049679 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" event={"ID":"9b5b3720-83c9-429d-b8fd-528cd923bcde","Type":"ContainerStarted","Data":"b19d4064ee351e6025e03d9efe9a444239a998b16e5bd8a52ed1c1983f64fd1e"} Apr 24 15:01:55.049716 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:55.049715 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" event={"ID":"9b5b3720-83c9-429d-b8fd-528cd923bcde","Type":"ContainerStarted","Data":"ceca074fecf1894af685cde550f7bf4dba81f777a1138fd0b3b90b55b84100ab"} Apr 24 15:01:59.062233 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:59.062196 2569 generic.go:358] "Generic (PLEG): container finished" podID="08fa7d38-1a69-4562-865d-d6f26bd24fcd" containerID="45b88052fc69198b07c50769a246b116fb64b69c557fe0c1ce5a5a5627f09cf3" exitCode=0 Apr 24 15:01:59.062652 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:59.062245 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" event={"ID":"08fa7d38-1a69-4562-865d-d6f26bd24fcd","Type":"ContainerDied","Data":"45b88052fc69198b07c50769a246b116fb64b69c557fe0c1ce5a5a5627f09cf3"} Apr 24 15:01:59.063749 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:59.063729 2569 generic.go:358] "Generic (PLEG): container finished" podID="9b5b3720-83c9-429d-b8fd-528cd923bcde" containerID="b19d4064ee351e6025e03d9efe9a444239a998b16e5bd8a52ed1c1983f64fd1e" exitCode=0 Apr 24 15:01:59.063873 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:59.063807 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" event={"ID":"9b5b3720-83c9-429d-b8fd-528cd923bcde","Type":"ContainerDied","Data":"b19d4064ee351e6025e03d9efe9a444239a998b16e5bd8a52ed1c1983f64fd1e"} Apr 24 15:01:59.122879 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:59.122854 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" Apr 24 15:01:59.159674 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:59.159651 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08fa7d38-1a69-4562-865d-d6f26bd24fcd-kserve-provision-location\") pod \"08fa7d38-1a69-4562-865d-d6f26bd24fcd\" (UID: \"08fa7d38-1a69-4562-865d-d6f26bd24fcd\") " Apr 24 15:01:59.159973 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:59.159952 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08fa7d38-1a69-4562-865d-d6f26bd24fcd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "08fa7d38-1a69-4562-865d-d6f26bd24fcd" (UID: "08fa7d38-1a69-4562-865d-d6f26bd24fcd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:01:59.260826 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:01:59.260789 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08fa7d38-1a69-4562-865d-d6f26bd24fcd-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:02:00.068366 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:02:00.068336 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" Apr 24 15:02:00.068798 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:02:00.068333 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s" event={"ID":"08fa7d38-1a69-4562-865d-d6f26bd24fcd","Type":"ContainerDied","Data":"50e1c6bb18e07ef78737de1c673cedd452d193c6ea81453ec27029eeaa593c20"} Apr 24 15:02:00.068798 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:02:00.068464 2569 scope.go:117] "RemoveContainer" containerID="45b88052fc69198b07c50769a246b116fb64b69c557fe0c1ce5a5a5627f09cf3" Apr 24 15:02:00.070134 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:02:00.070114 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" event={"ID":"9b5b3720-83c9-429d-b8fd-528cd923bcde","Type":"ContainerStarted","Data":"e57868fd47f93d753fc06c12b10698b254e6e4e3d03fc37ff6bf89dcb750dd55"} Apr 24 15:02:00.070385 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:02:00.070351 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" Apr 24 15:02:00.076285 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:02:00.076269 2569 scope.go:117] "RemoveContainer" containerID="59126a13427a4c3b83d7329799b7dfb36ebba3e5d20d77bcc74490eb24eb806f" Apr 24 15:02:00.082473 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:02:00.082452 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s"] Apr 24 15:02:00.087548 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:02:00.087501 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-8c94s"] Apr 24 15:02:00.102631 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:02:00.102584 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" podStartSLOduration=6.102569643 podStartE2EDuration="6.102569643s" podCreationTimestamp="2026-04-24 15:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:02:00.100748069 +0000 UTC m=+2272.894804994" watchObservedRunningTime="2026-04-24 15:02:00.102569643 +0000 UTC m=+2272.896626544" Apr 24 15:02:01.827703 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:02:01.827667 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08fa7d38-1a69-4562-865d-d6f26bd24fcd" path="/var/lib/kubelet/pods/08fa7d38-1a69-4562-865d-d6f26bd24fcd/volumes" Apr 24 15:02:31.077948 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:02:31.077877 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" podUID="9b5b3720-83c9-429d-b8fd-528cd923bcde" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.45:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 15:02:41.077574 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:02:41.077529 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" podUID="9b5b3720-83c9-429d-b8fd-528cd923bcde" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.45:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 15:02:51.076844 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:02:51.076787 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" podUID="9b5b3720-83c9-429d-b8fd-528cd923bcde" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.45:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 15:03:01.077327 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:01.077217 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" podUID="9b5b3720-83c9-429d-b8fd-528cd923bcde" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.45:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 15:03:04.823697 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:04.823652 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" podUID="9b5b3720-83c9-429d-b8fd-528cd923bcde" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.45:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 15:03:14.827444 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:14.827403 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" Apr 24 15:03:24.163563 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:24.163529 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4"] Apr 24 15:03:24.163954 ip-10-0-129-231 kubenswrapper[2569]: I0424 
Apr 24 15:03:24.243490 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:24.243456 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7"]
Apr 24 15:03:24.243776 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:24.243764 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08fa7d38-1a69-4562-865d-d6f26bd24fcd" containerName="kserve-container"
Apr 24 15:03:24.243833 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:24.243777 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="08fa7d38-1a69-4562-865d-d6f26bd24fcd" containerName="kserve-container"
Apr 24 15:03:24.243833 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:24.243787 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08fa7d38-1a69-4562-865d-d6f26bd24fcd" containerName="storage-initializer"
Apr 24 15:03:24.243833 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:24.243793 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="08fa7d38-1a69-4562-865d-d6f26bd24fcd" containerName="storage-initializer"
Apr 24 15:03:24.243966 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:24.243851 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="08fa7d38-1a69-4562-865d-d6f26bd24fcd" containerName="kserve-container"
Apr 24 15:03:24.246686 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:24.246669 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7"
Apr 24 15:03:24.255835 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:24.255808 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7"]
Apr 24 15:03:24.389635 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:24.389594 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7\" (UID: \"2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7"
Apr 24 15:03:24.490842 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:24.490731 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7\" (UID: \"2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7"
Apr 24 15:03:24.491148 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:24.491120 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7\" (UID: \"2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7"
Apr 24 15:03:24.557881 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:24.557845 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7"
Apr 24 15:03:24.679480 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:24.679289 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7"]
Apr 24 15:03:24.682180 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:03:24.682152 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b7b6af0_1fcf_4b93_a009_18ab7f3aefc9.slice/crio-9a69cc2855b9c5b0e1ff797fc9b5611871711af858d71c5baa2198588070c592 WatchSource:0}: Error finding container 9a69cc2855b9c5b0e1ff797fc9b5611871711af858d71c5baa2198588070c592: Status 404 returned error can't find the container with id 9a69cc2855b9c5b0e1ff797fc9b5611871711af858d71c5baa2198588070c592
Apr 24 15:03:24.824517 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:24.824412 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" podUID="9b5b3720-83c9-429d-b8fd-528cd923bcde" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.45:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.45:8080: connect: connection refused"
Apr 24 15:03:25.324717 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:25.324676 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7" event={"ID":"2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9","Type":"ContainerStarted","Data":"6090efffe6cd4ec57caaa07beaeacf979e4f8e6bd16b56ec807d2c36b6ca2458"}
Apr 24 15:03:25.324717 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:25.324720 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7" event={"ID":"2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9","Type":"ContainerStarted","Data":"9a69cc2855b9c5b0e1ff797fc9b5611871711af858d71c5baa2198588070c592"}
Apr 24 15:03:29.303194 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.303170 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4"
Apr 24 15:03:29.337190 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.337092 2569 generic.go:358] "Generic (PLEG): container finished" podID="9b5b3720-83c9-429d-b8fd-528cd923bcde" containerID="e57868fd47f93d753fc06c12b10698b254e6e4e3d03fc37ff6bf89dcb750dd55" exitCode=0
Apr 24 15:03:29.337190 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.337156 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" event={"ID":"9b5b3720-83c9-429d-b8fd-528cd923bcde","Type":"ContainerDied","Data":"e57868fd47f93d753fc06c12b10698b254e6e4e3d03fc37ff6bf89dcb750dd55"}
Apr 24 15:03:29.337190 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.337172 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4"
Apr 24 15:03:29.337463 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.337193 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4" event={"ID":"9b5b3720-83c9-429d-b8fd-528cd923bcde","Type":"ContainerDied","Data":"ceca074fecf1894af685cde550f7bf4dba81f777a1138fd0b3b90b55b84100ab"}
Apr 24 15:03:29.337463 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.337210 2569 scope.go:117] "RemoveContainer" containerID="e57868fd47f93d753fc06c12b10698b254e6e4e3d03fc37ff6bf89dcb750dd55"
Apr 24 15:03:29.338637 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.338617 2569 generic.go:358] "Generic (PLEG): container finished" podID="2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9" containerID="6090efffe6cd4ec57caaa07beaeacf979e4f8e6bd16b56ec807d2c36b6ca2458" exitCode=0
Apr 24 15:03:29.338770 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.338691 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7" event={"ID":"2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9","Type":"ContainerDied","Data":"6090efffe6cd4ec57caaa07beaeacf979e4f8e6bd16b56ec807d2c36b6ca2458"}
Apr 24 15:03:29.345575 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.345557 2569 scope.go:117] "RemoveContainer" containerID="b19d4064ee351e6025e03d9efe9a444239a998b16e5bd8a52ed1c1983f64fd1e"
Apr 24 15:03:29.353029 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.353000 2569 scope.go:117] "RemoveContainer" containerID="e57868fd47f93d753fc06c12b10698b254e6e4e3d03fc37ff6bf89dcb750dd55"
Apr 24 15:03:29.353321 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:03:29.353300 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e57868fd47f93d753fc06c12b10698b254e6e4e3d03fc37ff6bf89dcb750dd55\": container with ID starting with e57868fd47f93d753fc06c12b10698b254e6e4e3d03fc37ff6bf89dcb750dd55 not found: ID does not exist" containerID="e57868fd47f93d753fc06c12b10698b254e6e4e3d03fc37ff6bf89dcb750dd55"
Apr 24 15:03:29.353394 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.353329 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e57868fd47f93d753fc06c12b10698b254e6e4e3d03fc37ff6bf89dcb750dd55"} err="failed to get container status \"e57868fd47f93d753fc06c12b10698b254e6e4e3d03fc37ff6bf89dcb750dd55\": rpc error: code = NotFound desc = could not find container \"e57868fd47f93d753fc06c12b10698b254e6e4e3d03fc37ff6bf89dcb750dd55\": container with ID starting with e57868fd47f93d753fc06c12b10698b254e6e4e3d03fc37ff6bf89dcb750dd55 not found: ID does not exist"
Apr 24 15:03:29.353394 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.353349 2569 scope.go:117] "RemoveContainer" containerID="b19d4064ee351e6025e03d9efe9a444239a998b16e5bd8a52ed1c1983f64fd1e"
Apr 24 15:03:29.353561 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:03:29.353542 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b19d4064ee351e6025e03d9efe9a444239a998b16e5bd8a52ed1c1983f64fd1e\": container with ID starting with b19d4064ee351e6025e03d9efe9a444239a998b16e5bd8a52ed1c1983f64fd1e not found: ID does not exist" containerID="b19d4064ee351e6025e03d9efe9a444239a998b16e5bd8a52ed1c1983f64fd1e"
Apr 24 15:03:29.353603 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.353565 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b19d4064ee351e6025e03d9efe9a444239a998b16e5bd8a52ed1c1983f64fd1e"} err="failed to get container status \"b19d4064ee351e6025e03d9efe9a444239a998b16e5bd8a52ed1c1983f64fd1e\": rpc error: code = NotFound desc = could not find container \"b19d4064ee351e6025e03d9efe9a444239a998b16e5bd8a52ed1c1983f64fd1e\": container with ID starting with b19d4064ee351e6025e03d9efe9a444239a998b16e5bd8a52ed1c1983f64fd1e not found: ID does not exist"
Apr 24 15:03:29.435525 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.435490 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b5b3720-83c9-429d-b8fd-528cd923bcde-kserve-provision-location\") pod \"9b5b3720-83c9-429d-b8fd-528cd923bcde\" (UID: \"9b5b3720-83c9-429d-b8fd-528cd923bcde\") "
Apr 24 15:03:29.435856 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.435831 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b5b3720-83c9-429d-b8fd-528cd923bcde-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9b5b3720-83c9-429d-b8fd-528cd923bcde" (UID: "9b5b3720-83c9-429d-b8fd-528cd923bcde"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 15:03:29.536692 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.536640 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b5b3720-83c9-429d-b8fd-528cd923bcde-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 15:03:29.657387 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.657354 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4"]
Apr 24 15:03:29.662709 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.662686 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-45hw4"]
Apr 24 15:03:29.827872 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:29.827839 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b5b3720-83c9-429d-b8fd-528cd923bcde" path="/var/lib/kubelet/pods/9b5b3720-83c9-429d-b8fd-528cd923bcde/volumes"
Apr 24 15:03:30.344392 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:30.344355 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7" event={"ID":"2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9","Type":"ContainerStarted","Data":"d02ed6c89e12219544de4e44cd313ec8cfb839540f84ca9a9a8fefff1ed6f651"}
Apr 24 15:03:30.344809 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:30.344571 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7"
Apr 24 15:03:30.361235 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:03:30.361188 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7" podStartSLOduration=6.361174882 podStartE2EDuration="6.361174882s" podCreationTimestamp="2026-04-24 15:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:03:30.359958426 +0000 UTC m=+2363.154015349" watchObservedRunningTime="2026-04-24 15:03:30.361174882 +0000 UTC m=+2363.155231786"
Apr 24 15:04:01.348944 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:01.348877 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7" podUID="2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.46:8080: connect: connection refused"
Apr 24 15:04:07.832850 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:07.832819 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log"
Apr 24 15:04:07.840058 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:07.840033 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log"
Apr 24 15:04:11.348495 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:11.348447 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7" podUID="2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.46:8080: connect: connection refused"
Apr 24 15:04:21.348380 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:21.348337 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7" podUID="2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.46:8080: connect: connection refused"
Apr 24 15:04:31.348715 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:31.348619 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7" podUID="2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.46:8080: connect: connection refused"
Apr 24 15:04:38.823851 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:38.823800 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7" podUID="2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.46:8080: connect: connection refused"
Apr 24 15:04:48.828058 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:48.828025 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7"
Apr 24 15:04:54.576021 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:54.575982 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7"]
Apr 24 15:04:54.576566 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:54.576280 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7" podUID="2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9" containerName="kserve-container" containerID="cri-o://d02ed6c89e12219544de4e44cd313ec8cfb839540f84ca9a9a8fefff1ed6f651" gracePeriod=30
Apr 24 15:04:56.703200 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:56.703162 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6"]
Apr 24 15:04:56.703596 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:56.703532 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b5b3720-83c9-429d-b8fd-528cd923bcde" containerName="storage-initializer"
Apr 24 15:04:56.703596 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:56.703543 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5b3720-83c9-429d-b8fd-528cd923bcde" containerName="storage-initializer"
Apr 24 15:04:56.703596 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:56.703558 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b5b3720-83c9-429d-b8fd-528cd923bcde" containerName="kserve-container"
Apr 24 15:04:56.703596 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:56.703564 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5b3720-83c9-429d-b8fd-528cd923bcde" containerName="kserve-container"
Apr 24 15:04:56.703756 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:56.703612 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b5b3720-83c9-429d-b8fd-528cd923bcde" containerName="kserve-container"
Apr 24 15:04:56.706720 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:56.706704 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6"
Apr 24 15:04:56.715062 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:56.715036 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6"]
Apr 24 15:04:56.776165 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:56.776126 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8417d5ff-d785-4cf8-8229-9295fba802e2-kserve-provision-location\") pod \"isvc-sklearn-predictor-7c68dc9d59-fbsl6\" (UID: \"8417d5ff-d785-4cf8-8229-9295fba802e2\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6"
Apr 24 15:04:56.877203 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:56.877168 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8417d5ff-d785-4cf8-8229-9295fba802e2-kserve-provision-location\") pod \"isvc-sklearn-predictor-7c68dc9d59-fbsl6\" (UID: \"8417d5ff-d785-4cf8-8229-9295fba802e2\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6"
Apr 24 15:04:56.877560 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:56.877541 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8417d5ff-d785-4cf8-8229-9295fba802e2-kserve-provision-location\") pod \"isvc-sklearn-predictor-7c68dc9d59-fbsl6\" (UID: \"8417d5ff-d785-4cf8-8229-9295fba802e2\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6"
Apr 24 15:04:57.018110 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:57.017992 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6"
Apr 24 15:04:57.138736 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:57.138704 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6"]
Apr 24 15:04:57.141951 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:04:57.141925 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8417d5ff_d785_4cf8_8229_9295fba802e2.slice/crio-2a85e50c184342890179e4247b11e0443ca5935a250c031b4047f62c1cf6f3a0 WatchSource:0}: Error finding container 2a85e50c184342890179e4247b11e0443ca5935a250c031b4047f62c1cf6f3a0: Status 404 returned error can't find the container with id 2a85e50c184342890179e4247b11e0443ca5935a250c031b4047f62c1cf6f3a0
Apr 24 15:04:57.616201 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:57.616164 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" event={"ID":"8417d5ff-d785-4cf8-8229-9295fba802e2","Type":"ContainerStarted","Data":"4b768342f7d259b046c00a6410f973a1f15c4d1b45932b0dd37d2fef494bb0af"}
Apr 24 15:04:57.616201 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:57.616203 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" event={"ID":"8417d5ff-d785-4cf8-8229-9295fba802e2","Type":"ContainerStarted","Data":"2a85e50c184342890179e4247b11e0443ca5935a250c031b4047f62c1cf6f3a0"}
Apr 24 15:04:58.823825 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:58.823777 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7" podUID="2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.46:8080: connect: connection refused"
Apr 24 15:04:59.623489 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:59.623460 2569 generic.go:358] "Generic (PLEG): container finished" podID="2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9" containerID="d02ed6c89e12219544de4e44cd313ec8cfb839540f84ca9a9a8fefff1ed6f651" exitCode=0
Apr 24 15:04:59.623649 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:04:59.623537 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7" event={"ID":"2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9","Type":"ContainerDied","Data":"d02ed6c89e12219544de4e44cd313ec8cfb839540f84ca9a9a8fefff1ed6f651"}
Apr 24 15:05:00.019109 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:00.019070 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7"
Apr 24 15:05:00.105882 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:00.105848 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9-kserve-provision-location\") pod \"2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9\" (UID: \"2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9\") "
Apr 24 15:05:00.106223 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:00.106196 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9" (UID: "2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 15:05:00.212226 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:00.207745 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 15:05:00.627801 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:00.627708 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7" event={"ID":"2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9","Type":"ContainerDied","Data":"9a69cc2855b9c5b0e1ff797fc9b5611871711af858d71c5baa2198588070c592"}
Apr 24 15:05:00.627801 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:00.627747 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7"
Apr 24 15:05:00.627801 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:00.627765 2569 scope.go:117] "RemoveContainer" containerID="d02ed6c89e12219544de4e44cd313ec8cfb839540f84ca9a9a8fefff1ed6f651"
Apr 24 15:05:00.635822 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:00.635799 2569 scope.go:117] "RemoveContainer" containerID="6090efffe6cd4ec57caaa07beaeacf979e4f8e6bd16b56ec807d2c36b6ca2458"
Apr 24 15:05:00.647158 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:00.647126 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7"]
Apr 24 15:05:00.651238 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:00.651209 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-xknt7"]
Apr 24 15:05:01.632416 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:01.632385 2569 generic.go:358] "Generic (PLEG): container finished" podID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerID="4b768342f7d259b046c00a6410f973a1f15c4d1b45932b0dd37d2fef494bb0af" exitCode=0
Apr 24 15:05:01.632886 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:01.632460 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" event={"ID":"8417d5ff-d785-4cf8-8229-9295fba802e2","Type":"ContainerDied","Data":"4b768342f7d259b046c00a6410f973a1f15c4d1b45932b0dd37d2fef494bb0af"}
Apr 24 15:05:01.828541 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:01.828505 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9" path="/var/lib/kubelet/pods/2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9/volumes"
Apr 24 15:05:02.638119 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:02.638087 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" event={"ID":"8417d5ff-d785-4cf8-8229-9295fba802e2","Type":"ContainerStarted","Data":"88f12101387c0cf13c878109fd51ebae3a009dcb4fca8df5da29d401473f4740"}
Apr 24 15:05:02.638568 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:02.638374 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6"
Apr 24 15:05:02.639634 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:02.639610 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" podUID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 24 15:05:02.652367 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:02.652305 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" podStartSLOduration=6.652288566 podStartE2EDuration="6.652288566s" podCreationTimestamp="2026-04-24 15:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:05:02.651300353 +0000 UTC m=+2455.445357255" watchObservedRunningTime="2026-04-24 15:05:02.652288566 +0000 UTC m=+2455.446345468"
Apr 24 15:05:03.644339 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:03.644303 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" podUID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" podUID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 24 15:05:13.644348 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:13.644299 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" podUID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 24 15:05:23.644531 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:23.644480 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" podUID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 24 15:05:33.644683 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:33.644635 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" podUID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 24 15:05:43.645181 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:43.645134 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" podUID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 24 15:05:53.645191 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:05:53.645146 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" podUID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 24 15:06:03.644932 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:03.644824 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" podUID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 24 15:06:10.824882 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:10.824852 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" Apr 24 15:06:16.822475 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:16.822439 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6"] Apr 24 15:06:16.822878 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:16.822829 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" podUID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerName="kserve-container" containerID="cri-o://88f12101387c0cf13c878109fd51ebae3a009dcb4fca8df5da29d401473f4740" gracePeriod=30 Apr 24 15:06:16.868608 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:16.868573 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n"] Apr 24 15:06:16.868930 ip-10-0-129-231 kubenswrapper[2569]: I0424 
Apr 24 15:06:16.868985 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:16.868931 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9" containerName="kserve-container"
Apr 24 15:06:16.868985 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:16.868947 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9" containerName="storage-initializer"
Apr 24 15:06:16.868985 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:16.868953 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9" containerName="storage-initializer"
Apr 24 15:06:16.869081 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:16.869017 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b7b6af0-1fcf-4b93-a009-18ab7f3aefc9" containerName="kserve-container"
Apr 24 15:06:16.872084 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:16.872064 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n"
Apr 24 15:06:16.881583 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:16.881556 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n"]
Apr 24 15:06:16.975698 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:16.975657 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe8ef603-6bf6-48aa-9f04-b368f8330c00-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-v459n\" (UID: \"fe8ef603-6bf6-48aa-9f04-b368f8330c00\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n"
Apr 24 15:06:17.077225 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:17.077118 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe8ef603-6bf6-48aa-9f04-b368f8330c00-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-v459n\" (UID: \"fe8ef603-6bf6-48aa-9f04-b368f8330c00\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n"
Apr 24 15:06:17.077569 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:17.077543 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe8ef603-6bf6-48aa-9f04-b368f8330c00-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-v459n\" (UID: \"fe8ef603-6bf6-48aa-9f04-b368f8330c00\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n"
Apr 24 15:06:17.182926 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:17.182874 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n"
Apr 24 15:06:17.310698 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:17.310669 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n"]
Apr 24 15:06:17.313534 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:06:17.313506 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe8ef603_6bf6_48aa_9f04_b368f8330c00.slice/crio-d0c7afd8c28e99f29d5b6c865d4888e967fe9b8c8a14fd85350ddb08a47bdf3f WatchSource:0}: Error finding container d0c7afd8c28e99f29d5b6c865d4888e967fe9b8c8a14fd85350ddb08a47bdf3f: Status 404 returned error can't find the container with id d0c7afd8c28e99f29d5b6c865d4888e967fe9b8c8a14fd85350ddb08a47bdf3f
Apr 24 15:06:17.315412 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:17.315396 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 15:06:17.868721 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:17.868682 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n" event={"ID":"fe8ef603-6bf6-48aa-9f04-b368f8330c00","Type":"ContainerStarted","Data":"d066153cf2a69fd4dd047b00a8c19603731b817c34fcd7835f85c3b340cd9a33"}
Apr 24 15:06:17.868721 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:17.868717 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n" event={"ID":"fe8ef603-6bf6-48aa-9f04-b368f8330c00","Type":"ContainerStarted","Data":"d0c7afd8c28e99f29d5b6c865d4888e967fe9b8c8a14fd85350ddb08a47bdf3f"}
Apr 24 15:06:20.824181 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:20.824134 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" podUID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 24 15:06:21.461360 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.461335 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6"
Apr 24 15:06:21.516639 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.516599 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8417d5ff-d785-4cf8-8229-9295fba802e2-kserve-provision-location\") pod \"8417d5ff-d785-4cf8-8229-9295fba802e2\" (UID: \"8417d5ff-d785-4cf8-8229-9295fba802e2\") "
Apr 24 15:06:21.516971 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.516946 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8417d5ff-d785-4cf8-8229-9295fba802e2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8417d5ff-d785-4cf8-8229-9295fba802e2" (UID: "8417d5ff-d785-4cf8-8229-9295fba802e2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:06:21.617560 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.617465 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8417d5ff-d785-4cf8-8229-9295fba802e2-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:06:21.881777 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.881685 2569 generic.go:358] "Generic (PLEG): container finished" podID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerID="88f12101387c0cf13c878109fd51ebae3a009dcb4fca8df5da29d401473f4740" exitCode=0 Apr 24 15:06:21.881777 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.881756 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" Apr 24 15:06:21.882340 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.881756 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" event={"ID":"8417d5ff-d785-4cf8-8229-9295fba802e2","Type":"ContainerDied","Data":"88f12101387c0cf13c878109fd51ebae3a009dcb4fca8df5da29d401473f4740"} Apr 24 15:06:21.882340 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.881851 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6" event={"ID":"8417d5ff-d785-4cf8-8229-9295fba802e2","Type":"ContainerDied","Data":"2a85e50c184342890179e4247b11e0443ca5935a250c031b4047f62c1cf6f3a0"} Apr 24 15:06:21.882340 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.881879 2569 scope.go:117] "RemoveContainer" containerID="88f12101387c0cf13c878109fd51ebae3a009dcb4fca8df5da29d401473f4740" Apr 24 15:06:21.883389 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.883361 2569 generic.go:358] "Generic (PLEG): container finished" podID="fe8ef603-6bf6-48aa-9f04-b368f8330c00" containerID="d066153cf2a69fd4dd047b00a8c19603731b817c34fcd7835f85c3b340cd9a33" exitCode=0 Apr 24 15:06:21.883502 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.883407 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n" event={"ID":"fe8ef603-6bf6-48aa-9f04-b368f8330c00","Type":"ContainerDied","Data":"d066153cf2a69fd4dd047b00a8c19603731b817c34fcd7835f85c3b340cd9a33"} Apr 24 15:06:21.890343 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.890324 2569 scope.go:117] "RemoveContainer" containerID="4b768342f7d259b046c00a6410f973a1f15c4d1b45932b0dd37d2fef494bb0af" Apr 24 15:06:21.896447 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.896416 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6"] Apr 24 15:06:21.899866 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.899840 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7c68dc9d59-fbsl6"] Apr 24 15:06:21.901581 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.901563 2569 scope.go:117] "RemoveContainer" containerID="88f12101387c0cf13c878109fd51ebae3a009dcb4fca8df5da29d401473f4740" Apr 24 15:06:21.902089 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:06:21.902067 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88f12101387c0cf13c878109fd51ebae3a009dcb4fca8df5da29d401473f4740\": container with ID starting with 
Apr 24 15:06:21.902190 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.902098 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f12101387c0cf13c878109fd51ebae3a009dcb4fca8df5da29d401473f4740"} err="failed to get container status \"88f12101387c0cf13c878109fd51ebae3a009dcb4fca8df5da29d401473f4740\": rpc error: code = NotFound desc = could not find container \"88f12101387c0cf13c878109fd51ebae3a009dcb4fca8df5da29d401473f4740\": container with ID starting with 88f12101387c0cf13c878109fd51ebae3a009dcb4fca8df5da29d401473f4740 not found: ID does not exist"
Apr 24 15:06:21.902190 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.902125 2569 scope.go:117] "RemoveContainer" containerID="4b768342f7d259b046c00a6410f973a1f15c4d1b45932b0dd37d2fef494bb0af"
Apr 24 15:06:21.902414 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:06:21.902385 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b768342f7d259b046c00a6410f973a1f15c4d1b45932b0dd37d2fef494bb0af\": container with ID starting with 4b768342f7d259b046c00a6410f973a1f15c4d1b45932b0dd37d2fef494bb0af not found: ID does not exist" containerID="4b768342f7d259b046c00a6410f973a1f15c4d1b45932b0dd37d2fef494bb0af"
Apr 24 15:06:21.902504 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:21.902418 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b768342f7d259b046c00a6410f973a1f15c4d1b45932b0dd37d2fef494bb0af"} err="failed to get container status \"4b768342f7d259b046c00a6410f973a1f15c4d1b45932b0dd37d2fef494bb0af\": rpc error: code = NotFound desc = could not find container \"4b768342f7d259b046c00a6410f973a1f15c4d1b45932b0dd37d2fef494bb0af\": container with ID starting with 4b768342f7d259b046c00a6410f973a1f15c4d1b45932b0dd37d2fef494bb0af not found: ID does not exist"
Apr 24 15:06:22.888277 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:22.888244 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n" event={"ID":"fe8ef603-6bf6-48aa-9f04-b368f8330c00","Type":"ContainerStarted","Data":"d84882a4e02726053c282d668e413b7159202384600765fbb08728d126f30946"}
Apr 24 15:06:22.888648 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:22.888469 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n"
Apr 24 15:06:22.904959 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:22.904884 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n" podStartSLOduration=6.904869301 podStartE2EDuration="6.904869301s" podCreationTimestamp="2026-04-24 15:06:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:06:22.90298812 +0000 UTC m=+2535.697045057" watchObservedRunningTime="2026-04-24 15:06:22.904869301 +0000 UTC m=+2535.698926202"
Apr 24 15:06:23.827838 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:23.827802 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8417d5ff-d785-4cf8-8229-9295fba802e2" path="/var/lib/kubelet/pods/8417d5ff-d785-4cf8-8229-9295fba802e2/volumes"
path="/var/lib/kubelet/pods/8417d5ff-d785-4cf8-8229-9295fba802e2/volumes" Apr 24 15:06:53.975002 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:06:53.974955 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n" podUID="fe8ef603-6bf6-48aa-9f04-b368f8330c00" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 24 15:07:03.893077 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:03.893029 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n" podUID="fe8ef603-6bf6-48aa-9f04-b368f8330c00" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 24 15:07:13.893132 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:13.893098 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n" Apr 24 15:07:17.052676 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:17.052642 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j"] Apr 24 15:07:17.053074 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:17.052990 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerName="storage-initializer" Apr 24 15:07:17.053074 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:17.053002 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerName="storage-initializer" Apr 24 15:07:17.053074 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:17.053011 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerName="kserve-container" Apr 24 15:07:17.053074 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:17.053017 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerName="kserve-container" Apr 24 15:07:17.053074 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:17.053072 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8417d5ff-d785-4cf8-8229-9295fba802e2" containerName="kserve-container" Apr 24 15:07:17.056002 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:17.055982 2569 util.go:30] "No sandbox for pod can be found. 
Apr 24 15:07:17.065427 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:17.065400 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j"]
Apr 24 15:07:17.074099 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:17.074063 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n"]
Apr 24 15:07:17.074463 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:17.074436 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n" podUID="fe8ef603-6bf6-48aa-9f04-b368f8330c00" containerName="kserve-container" containerID="cri-o://d84882a4e02726053c282d668e413b7159202384600765fbb08728d126f30946" gracePeriod=30
Apr 24 15:07:17.205229 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:17.205191 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f4d9eb7-ace4-4441-af63-b3e87ca02e87-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j\" (UID: \"3f4d9eb7-ace4-4441-af63-b3e87ca02e87\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j"
Apr 24 15:07:17.306732 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:17.306617 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f4d9eb7-ace4-4441-af63-b3e87ca02e87-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j\" (UID: \"3f4d9eb7-ace4-4441-af63-b3e87ca02e87\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j"
Apr 24 15:07:17.307043 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:17.307022 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f4d9eb7-ace4-4441-af63-b3e87ca02e87-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j\" (UID: \"3f4d9eb7-ace4-4441-af63-b3e87ca02e87\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j"
Apr 24 15:07:17.367853 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:17.367811 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j"
Apr 24 15:07:17.491336 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:17.491293 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j"]
Apr 24 15:07:17.494104 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:07:17.494060 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f4d9eb7_ace4_4441_af63_b3e87ca02e87.slice/crio-ffd3aff92c24bb854fbd0564ad3155f229959f5e763ed699ece12b9cc1910467 WatchSource:0}: Error finding container ffd3aff92c24bb854fbd0564ad3155f229959f5e763ed699ece12b9cc1910467: Status 404 returned error can't find the container with id ffd3aff92c24bb854fbd0564ad3155f229959f5e763ed699ece12b9cc1910467
Apr 24 15:07:18.056190 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:18.056141 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j" event={"ID":"3f4d9eb7-ace4-4441-af63-b3e87ca02e87","Type":"ContainerStarted","Data":"ee66e97c2223e7fbed8f6f588aeabed3d3c241e80aa891af39e05ec6efc21d80"}
Apr 24 15:07:18.056580 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:18.056199 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j" event={"ID":"3f4d9eb7-ace4-4441-af63-b3e87ca02e87","Type":"ContainerStarted","Data":"ffd3aff92c24bb854fbd0564ad3155f229959f5e763ed699ece12b9cc1910467"}
Apr 24 15:07:23.892050 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:23.892003 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n" podUID="fe8ef603-6bf6-48aa-9f04-b368f8330c00" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.48:8080/v2/models/sklearn-v2-mlserver/ready\": dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 24 15:07:24.076136 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:24.076103 2569 generic.go:358] "Generic (PLEG): container finished" podID="3f4d9eb7-ace4-4441-af63-b3e87ca02e87" containerID="ee66e97c2223e7fbed8f6f588aeabed3d3c241e80aa891af39e05ec6efc21d80" exitCode=0
Apr 24 15:07:24.076343 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:24.076184 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j" event={"ID":"3f4d9eb7-ace4-4441-af63-b3e87ca02e87","Type":"ContainerDied","Data":"ee66e97c2223e7fbed8f6f588aeabed3d3c241e80aa891af39e05ec6efc21d80"}
Apr 24 15:07:25.020330 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.020305 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n"
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n" Apr 24 15:07:25.081202 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.081111 2569 generic.go:358] "Generic (PLEG): container finished" podID="fe8ef603-6bf6-48aa-9f04-b368f8330c00" containerID="d84882a4e02726053c282d668e413b7159202384600765fbb08728d126f30946" exitCode=0 Apr 24 15:07:25.081202 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.081190 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n" event={"ID":"fe8ef603-6bf6-48aa-9f04-b368f8330c00","Type":"ContainerDied","Data":"d84882a4e02726053c282d668e413b7159202384600765fbb08728d126f30946"} Apr 24 15:07:25.081421 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.081203 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n" Apr 24 15:07:25.081421 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.081230 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n" event={"ID":"fe8ef603-6bf6-48aa-9f04-b368f8330c00","Type":"ContainerDied","Data":"d0c7afd8c28e99f29d5b6c865d4888e967fe9b8c8a14fd85350ddb08a47bdf3f"} Apr 24 15:07:25.081421 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.081250 2569 scope.go:117] "RemoveContainer" containerID="d84882a4e02726053c282d668e413b7159202384600765fbb08728d126f30946" Apr 24 15:07:25.086298 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.086272 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j" event={"ID":"3f4d9eb7-ace4-4441-af63-b3e87ca02e87","Type":"ContainerStarted","Data":"6545ac51a575fbd7f5204443c8fd939f3ccc1dd2d0e267f0958bca25472bdb1e"} Apr 24 15:07:25.086652 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.086626 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j" Apr 24 15:07:25.087605 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.087576 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j" podUID="3f4d9eb7-ace4-4441-af63-b3e87ca02e87" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 15:07:25.091974 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.091960 2569 scope.go:117] "RemoveContainer" containerID="d066153cf2a69fd4dd047b00a8c19603731b817c34fcd7835f85c3b340cd9a33" Apr 24 15:07:25.100422 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.100368 2569 scope.go:117] "RemoveContainer" containerID="d84882a4e02726053c282d668e413b7159202384600765fbb08728d126f30946" Apr 24 15:07:25.100818 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:07:25.100783 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d84882a4e02726053c282d668e413b7159202384600765fbb08728d126f30946\": container with ID starting with d84882a4e02726053c282d668e413b7159202384600765fbb08728d126f30946 not found: ID does not exist" containerID="d84882a4e02726053c282d668e413b7159202384600765fbb08728d126f30946" Apr 24 15:07:25.100940 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.100826 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d84882a4e02726053c282d668e413b7159202384600765fbb08728d126f30946"} err="failed to get container status \"d84882a4e02726053c282d668e413b7159202384600765fbb08728d126f30946\": rpc error: code = NotFound desc = could not find container \"d84882a4e02726053c282d668e413b7159202384600765fbb08728d126f30946\": container with ID starting with d84882a4e02726053c282d668e413b7159202384600765fbb08728d126f30946 not found: ID does not exist" Apr 24 15:07:25.100940 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.100852 2569 scope.go:117] "RemoveContainer" containerID="d066153cf2a69fd4dd047b00a8c19603731b817c34fcd7835f85c3b340cd9a33" Apr 24 15:07:25.101230 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:07:25.101206 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d066153cf2a69fd4dd047b00a8c19603731b817c34fcd7835f85c3b340cd9a33\": container with ID starting with d066153cf2a69fd4dd047b00a8c19603731b817c34fcd7835f85c3b340cd9a33 not found: ID does not exist" containerID="d066153cf2a69fd4dd047b00a8c19603731b817c34fcd7835f85c3b340cd9a33" Apr 24 15:07:25.101310 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.101240 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d066153cf2a69fd4dd047b00a8c19603731b817c34fcd7835f85c3b340cd9a33"} err="failed to get container status \"d066153cf2a69fd4dd047b00a8c19603731b817c34fcd7835f85c3b340cd9a33\": rpc error: code = NotFound desc = could not find container \"d066153cf2a69fd4dd047b00a8c19603731b817c34fcd7835f85c3b340cd9a33\": container with ID starting with d066153cf2a69fd4dd047b00a8c19603731b817c34fcd7835f85c3b340cd9a33 not found: ID does not exist" Apr 24 15:07:25.102075 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.102034 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j" podStartSLOduration=8.102021278 podStartE2EDuration="8.102021278s" podCreationTimestamp="2026-04-24 15:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:07:25.09921167 +0000 UTC m=+2597.893268571" watchObservedRunningTime="2026-04-24 15:07:25.102021278 +0000 UTC m=+2597.896078176" Apr 24 15:07:25.175864 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.175822 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe8ef603-6bf6-48aa-9f04-b368f8330c00-kserve-provision-location\") pod \"fe8ef603-6bf6-48aa-9f04-b368f8330c00\" (UID: \"fe8ef603-6bf6-48aa-9f04-b368f8330c00\") " Apr 24 15:07:25.176156 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.176131 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe8ef603-6bf6-48aa-9f04-b368f8330c00-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fe8ef603-6bf6-48aa-9f04-b368f8330c00" (UID: "fe8ef603-6bf6-48aa-9f04-b368f8330c00"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:07:25.277233 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.277194 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe8ef603-6bf6-48aa-9f04-b368f8330c00-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:07:25.401118 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.401079 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n"] Apr 24 15:07:25.404949 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.404922 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v459n"] Apr 24 15:07:25.827762 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:25.827731 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe8ef603-6bf6-48aa-9f04-b368f8330c00" path="/var/lib/kubelet/pods/fe8ef603-6bf6-48aa-9f04-b368f8330c00/volumes" Apr 24 15:07:26.090568 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:26.090480 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j" podUID="3f4d9eb7-ace4-4441-af63-b3e87ca02e87" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 15:07:36.091452 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:36.091392 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j" podUID="3f4d9eb7-ace4-4441-af63-b3e87ca02e87" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 24 15:07:46.091860 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:46.091826 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j" Apr 24 15:07:53.970945 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:53.970916 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j_3f4d9eb7-ace4-4441-af63-b3e87ca02e87/kserve-container/0.log" Apr 24 15:07:54.093764 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:54.093727 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j"] Apr 24 15:07:54.094093 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:54.094066 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j" podUID="3f4d9eb7-ace4-4441-af63-b3e87ca02e87" containerName="kserve-container" containerID="cri-o://6545ac51a575fbd7f5204443c8fd939f3ccc1dd2d0e267f0958bca25472bdb1e" gracePeriod=30 Apr 24 15:07:54.137832 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:54.137798 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49"] Apr 24 15:07:54.138166 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:54.138153 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe8ef603-6bf6-48aa-9f04-b368f8330c00" containerName="storage-initializer" Apr 24 15:07:54.138220 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:54.138167 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fe8ef603-6bf6-48aa-9f04-b368f8330c00" containerName="storage-initializer" Apr 24 15:07:54.138220 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:54.138179 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe8ef603-6bf6-48aa-9f04-b368f8330c00" containerName="kserve-container" Apr 24 15:07:54.138220 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:54.138185 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8ef603-6bf6-48aa-9f04-b368f8330c00" containerName="kserve-container" Apr 24 15:07:54.138326 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:54.138231 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe8ef603-6bf6-48aa-9f04-b368f8330c00" containerName="kserve-container" Apr 24 15:07:54.140255 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:54.140228 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" Apr 24 15:07:54.149065 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:54.149035 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49"] Apr 24 15:07:54.219946 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:54.219914 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7276100c-d4d7-482d-8f37-cfc545d81d13-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49\" (UID: \"7276100c-d4d7-482d-8f37-cfc545d81d13\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" Apr 24 15:07:54.321172 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:54.321065 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7276100c-d4d7-482d-8f37-cfc545d81d13-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49\" (UID: \"7276100c-d4d7-482d-8f37-cfc545d81d13\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" Apr 24 15:07:54.321429 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:54.321410 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7276100c-d4d7-482d-8f37-cfc545d81d13-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49\" (UID: \"7276100c-d4d7-482d-8f37-cfc545d81d13\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" Apr 24 15:07:54.451466 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:54.451429 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" Apr 24 15:07:54.570808 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:54.570770 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49"] Apr 24 15:07:54.574283 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:07:54.574253 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7276100c_d4d7_482d_8f37_cfc545d81d13.slice/crio-7a576780225a0112148d22d2282f63b3b5b976e66637001cebdb2d6a37a785d6 WatchSource:0}: Error finding container 7a576780225a0112148d22d2282f63b3b5b976e66637001cebdb2d6a37a785d6: Status 404 returned error can't find the container with id 7a576780225a0112148d22d2282f63b3b5b976e66637001cebdb2d6a37a785d6 Apr 24 15:07:55.187034 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:55.186994 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" event={"ID":"7276100c-d4d7-482d-8f37-cfc545d81d13","Type":"ContainerStarted","Data":"5111a24dcc159c389733b908d474cc495cceec55dc8be34349ef8e1a9c689f76"} Apr 24 15:07:55.187034 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:55.187037 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" event={"ID":"7276100c-d4d7-482d-8f37-cfc545d81d13","Type":"ContainerStarted","Data":"7a576780225a0112148d22d2282f63b3b5b976e66637001cebdb2d6a37a785d6"} Apr 24 15:07:55.314944 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:55.314921 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j" Apr 24 15:07:55.428866 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:55.428834 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f4d9eb7-ace4-4441-af63-b3e87ca02e87-kserve-provision-location\") pod \"3f4d9eb7-ace4-4441-af63-b3e87ca02e87\" (UID: \"3f4d9eb7-ace4-4441-af63-b3e87ca02e87\") " Apr 24 15:07:55.453797 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:55.453752 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f4d9eb7-ace4-4441-af63-b3e87ca02e87-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3f4d9eb7-ace4-4441-af63-b3e87ca02e87" (UID: "3f4d9eb7-ace4-4441-af63-b3e87ca02e87"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:07:55.530280 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:55.530238 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f4d9eb7-ace4-4441-af63-b3e87ca02e87-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:07:56.191240 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:56.191205 2569 generic.go:358] "Generic (PLEG): container finished" podID="3f4d9eb7-ace4-4441-af63-b3e87ca02e87" containerID="6545ac51a575fbd7f5204443c8fd939f3ccc1dd2d0e267f0958bca25472bdb1e" exitCode=0 Apr 24 15:07:56.191697 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:56.191275 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j" Apr 24 15:07:56.191697 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:56.191292 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j" event={"ID":"3f4d9eb7-ace4-4441-af63-b3e87ca02e87","Type":"ContainerDied","Data":"6545ac51a575fbd7f5204443c8fd939f3ccc1dd2d0e267f0958bca25472bdb1e"} Apr 24 15:07:56.191697 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:56.191332 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j" event={"ID":"3f4d9eb7-ace4-4441-af63-b3e87ca02e87","Type":"ContainerDied","Data":"ffd3aff92c24bb854fbd0564ad3155f229959f5e763ed699ece12b9cc1910467"} Apr 24 15:07:56.191697 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:56.191348 2569 scope.go:117] "RemoveContainer" containerID="6545ac51a575fbd7f5204443c8fd939f3ccc1dd2d0e267f0958bca25472bdb1e" Apr 24 15:07:56.199112 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:56.199080 2569 scope.go:117] "RemoveContainer" containerID="ee66e97c2223e7fbed8f6f588aeabed3d3c241e80aa891af39e05ec6efc21d80" Apr 24 15:07:56.207421 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:56.207391 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j"] Apr 24 15:07:56.208204 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:56.208071 2569 scope.go:117] "RemoveContainer" containerID="6545ac51a575fbd7f5204443c8fd939f3ccc1dd2d0e267f0958bca25472bdb1e" Apr 24 15:07:56.208432 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:07:56.208407 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6545ac51a575fbd7f5204443c8fd939f3ccc1dd2d0e267f0958bca25472bdb1e\": container with ID starting with 6545ac51a575fbd7f5204443c8fd939f3ccc1dd2d0e267f0958bca25472bdb1e not found: ID does not exist" containerID="6545ac51a575fbd7f5204443c8fd939f3ccc1dd2d0e267f0958bca25472bdb1e" Apr 24 15:07:56.208488 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:56.208444 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6545ac51a575fbd7f5204443c8fd939f3ccc1dd2d0e267f0958bca25472bdb1e"} err="failed to get container status \"6545ac51a575fbd7f5204443c8fd939f3ccc1dd2d0e267f0958bca25472bdb1e\": rpc error: code = NotFound desc = could not find container \"6545ac51a575fbd7f5204443c8fd939f3ccc1dd2d0e267f0958bca25472bdb1e\": container with ID starting with 6545ac51a575fbd7f5204443c8fd939f3ccc1dd2d0e267f0958bca25472bdb1e not found: ID does not exist" Apr 24 15:07:56.208488 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:56.208472 2569 scope.go:117] "RemoveContainer" containerID="ee66e97c2223e7fbed8f6f588aeabed3d3c241e80aa891af39e05ec6efc21d80" Apr 24 15:07:56.208731 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:07:56.208710 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee66e97c2223e7fbed8f6f588aeabed3d3c241e80aa891af39e05ec6efc21d80\": container with ID starting with ee66e97c2223e7fbed8f6f588aeabed3d3c241e80aa891af39e05ec6efc21d80 not found: ID does not exist" containerID="ee66e97c2223e7fbed8f6f588aeabed3d3c241e80aa891af39e05ec6efc21d80" Apr 24 15:07:56.208826 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:56.208735 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ee66e97c2223e7fbed8f6f588aeabed3d3c241e80aa891af39e05ec6efc21d80"} err="failed to get container status \"ee66e97c2223e7fbed8f6f588aeabed3d3c241e80aa891af39e05ec6efc21d80\": rpc error: code = NotFound desc = could not find container \"ee66e97c2223e7fbed8f6f588aeabed3d3c241e80aa891af39e05ec6efc21d80\": container with ID starting with ee66e97c2223e7fbed8f6f588aeabed3d3c241e80aa891af39e05ec6efc21d80 not found: ID does not exist" Apr 24 15:07:56.208981 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:56.208965 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6c49b9bd4f-kcz7j"] Apr 24 15:07:57.831911 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:57.831859 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f4d9eb7-ace4-4441-af63-b3e87ca02e87" path="/var/lib/kubelet/pods/3f4d9eb7-ace4-4441-af63-b3e87ca02e87/volumes" Apr 24 15:07:59.201270 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:59.201241 2569 generic.go:358] "Generic (PLEG): container finished" podID="7276100c-d4d7-482d-8f37-cfc545d81d13" containerID="5111a24dcc159c389733b908d474cc495cceec55dc8be34349ef8e1a9c689f76" exitCode=0 Apr 24 15:07:59.201653 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:07:59.201289 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" event={"ID":"7276100c-d4d7-482d-8f37-cfc545d81d13","Type":"ContainerDied","Data":"5111a24dcc159c389733b908d474cc495cceec55dc8be34349ef8e1a9c689f76"} Apr 24 15:08:00.209176 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:00.209139 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" event={"ID":"7276100c-d4d7-482d-8f37-cfc545d81d13","Type":"ContainerStarted","Data":"ce169630ebe4cdb88376686306e3d44f94d11fe12008250b5ff9b2210873c143"} Apr 24 15:08:00.209605 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:00.209369 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" Apr 24 15:08:00.226182 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:00.226132 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" podStartSLOduration=6.226118115 podStartE2EDuration="6.226118115s" podCreationTimestamp="2026-04-24 15:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:08:00.223486047 +0000 UTC m=+2633.017542947" watchObservedRunningTime="2026-04-24 15:08:00.226118115 +0000 UTC m=+2633.020175018" Apr 24 15:08:31.275078 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:31.275033 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" podUID="7276100c-d4d7-482d-8f37-cfc545d81d13" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 24 15:08:41.214929 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:41.214874 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" Apr 24 15:08:44.241685 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:44.241643 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49"] Apr 24 15:08:44.242093 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:44.242042 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" podUID="7276100c-d4d7-482d-8f37-cfc545d81d13" containerName="kserve-container" containerID="cri-o://ce169630ebe4cdb88376686306e3d44f94d11fe12008250b5ff9b2210873c143" gracePeriod=30 Apr 24 15:08:44.290628 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:44.290579 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx"] Apr 24 15:08:44.290930 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:44.290914 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f4d9eb7-ace4-4441-af63-b3e87ca02e87" containerName="kserve-container" Apr 24 15:08:44.290997 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:44.290933 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4d9eb7-ace4-4441-af63-b3e87ca02e87" containerName="kserve-container" Apr 24 15:08:44.290997 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:44.290952 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f4d9eb7-ace4-4441-af63-b3e87ca02e87" containerName="storage-initializer" Apr 24 15:08:44.290997 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:44.290961 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4d9eb7-ace4-4441-af63-b3e87ca02e87" containerName="storage-initializer" Apr 24 15:08:44.291152 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:44.291053 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f4d9eb7-ace4-4441-af63-b3e87ca02e87" containerName="kserve-container" Apr 24 15:08:44.293411 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:44.293388 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" Apr 24 15:08:44.302355 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:44.302332 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx"] Apr 24 15:08:44.343026 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:44.342990 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e3c6478-4747-4e25-b50e-5d32a2fc32ea-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-9f94dd8ff-64xbx\" (UID: \"5e3c6478-4747-4e25-b50e-5d32a2fc32ea\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" Apr 24 15:08:44.443644 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:44.443589 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e3c6478-4747-4e25-b50e-5d32a2fc32ea-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-9f94dd8ff-64xbx\" (UID: \"5e3c6478-4747-4e25-b50e-5d32a2fc32ea\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" Apr 24 15:08:44.444045 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:44.444024 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e3c6478-4747-4e25-b50e-5d32a2fc32ea-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-9f94dd8ff-64xbx\" (UID: \"5e3c6478-4747-4e25-b50e-5d32a2fc32ea\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" Apr 24 15:08:44.604690 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:44.604596 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" Apr 24 15:08:44.721533 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:44.721368 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx"] Apr 24 15:08:44.724354 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:08:44.724309 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c6478_4747_4e25_b50e_5d32a2fc32ea.slice/crio-125cf368ebcd36ecaf950237d0b68e4367b935ce1870646ae60dd8655ad248bc WatchSource:0}: Error finding container 125cf368ebcd36ecaf950237d0b68e4367b935ce1870646ae60dd8655ad248bc: Status 404 returned error can't find the container with id 125cf368ebcd36ecaf950237d0b68e4367b935ce1870646ae60dd8655ad248bc Apr 24 15:08:45.343769 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:45.343737 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" event={"ID":"5e3c6478-4747-4e25-b50e-5d32a2fc32ea","Type":"ContainerStarted","Data":"21145fcf81fe588d3ab948e4700dd2cf581e04686d3ed5e650342a075675b9d8"} Apr 24 15:08:45.344180 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:45.343775 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" event={"ID":"5e3c6478-4747-4e25-b50e-5d32a2fc32ea","Type":"ContainerStarted","Data":"125cf368ebcd36ecaf950237d0b68e4367b935ce1870646ae60dd8655ad248bc"} Apr 24 15:08:48.354409 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:48.354374 2569 generic.go:358] "Generic (PLEG): container finished" podID="5e3c6478-4747-4e25-b50e-5d32a2fc32ea" containerID="21145fcf81fe588d3ab948e4700dd2cf581e04686d3ed5e650342a075675b9d8" exitCode=0 Apr 24 15:08:48.354764 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:48.354420 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" event={"ID":"5e3c6478-4747-4e25-b50e-5d32a2fc32ea","Type":"ContainerDied","Data":"21145fcf81fe588d3ab948e4700dd2cf581e04686d3ed5e650342a075675b9d8"} Apr 24 15:08:49.358970 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:49.358935 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" event={"ID":"5e3c6478-4747-4e25-b50e-5d32a2fc32ea","Type":"ContainerStarted","Data":"6a67ef78abd67bc71b32bda473bbf42b5bf59b597a265ac9beb1e6ea9c355584"} Apr 24 15:08:49.359445 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:49.359245 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" Apr 24 15:08:49.360680 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:49.360656 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" podUID="5e3c6478-4747-4e25-b50e-5d32a2fc32ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 15:08:49.376887 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:49.376829 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" podStartSLOduration=5.376815291 podStartE2EDuration="5.376815291s" podCreationTimestamp="2026-04-24 15:08:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:08:49.375001669 +0000 UTC m=+2682.169058595" watchObservedRunningTime="2026-04-24 15:08:49.376815291 +0000 UTC m=+2682.170872192" Apr 24 15:08:50.362746 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:50.362706 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" podUID="5e3c6478-4747-4e25-b50e-5d32a2fc32ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 15:08:51.212991 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:51.212943 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" podUID="7276100c-d4d7-482d-8f37-cfc545d81d13" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.50:8080/v2/models/isvc-sklearn-v2-runtime/ready\": dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 15:08:52.081170 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:52.081147 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" Apr 24 15:08:52.215409 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:52.215373 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7276100c-d4d7-482d-8f37-cfc545d81d13-kserve-provision-location\") pod \"7276100c-d4d7-482d-8f37-cfc545d81d13\" (UID: \"7276100c-d4d7-482d-8f37-cfc545d81d13\") " Apr 24 15:08:52.215717 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:52.215691 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7276100c-d4d7-482d-8f37-cfc545d81d13-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7276100c-d4d7-482d-8f37-cfc545d81d13" (UID: "7276100c-d4d7-482d-8f37-cfc545d81d13"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:08:52.316799 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:52.316760 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7276100c-d4d7-482d-8f37-cfc545d81d13-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:08:52.369312 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:52.369279 2569 generic.go:358] "Generic (PLEG): container finished" podID="7276100c-d4d7-482d-8f37-cfc545d81d13" containerID="ce169630ebe4cdb88376686306e3d44f94d11fe12008250b5ff9b2210873c143" exitCode=0 Apr 24 15:08:52.369512 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:52.369333 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" event={"ID":"7276100c-d4d7-482d-8f37-cfc545d81d13","Type":"ContainerDied","Data":"ce169630ebe4cdb88376686306e3d44f94d11fe12008250b5ff9b2210873c143"} Apr 24 15:08:52.369512 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:52.369349 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" Apr 24 15:08:52.369512 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:52.369374 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49" event={"ID":"7276100c-d4d7-482d-8f37-cfc545d81d13","Type":"ContainerDied","Data":"7a576780225a0112148d22d2282f63b3b5b976e66637001cebdb2d6a37a785d6"} Apr 24 15:08:52.369512 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:52.369391 2569 scope.go:117] "RemoveContainer" containerID="ce169630ebe4cdb88376686306e3d44f94d11fe12008250b5ff9b2210873c143" Apr 24 15:08:52.377612 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:52.377592 2569 scope.go:117] "RemoveContainer" containerID="5111a24dcc159c389733b908d474cc495cceec55dc8be34349ef8e1a9c689f76" Apr 24 15:08:52.385662 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:52.385641 2569 scope.go:117] "RemoveContainer" containerID="ce169630ebe4cdb88376686306e3d44f94d11fe12008250b5ff9b2210873c143" Apr 24 15:08:52.385947 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:08:52.385920 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce169630ebe4cdb88376686306e3d44f94d11fe12008250b5ff9b2210873c143\": container with ID starting with ce169630ebe4cdb88376686306e3d44f94d11fe12008250b5ff9b2210873c143 not found: ID does not exist" containerID="ce169630ebe4cdb88376686306e3d44f94d11fe12008250b5ff9b2210873c143" Apr 24 15:08:52.386040 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:52.385957 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce169630ebe4cdb88376686306e3d44f94d11fe12008250b5ff9b2210873c143"} err="failed to get container status \"ce169630ebe4cdb88376686306e3d44f94d11fe12008250b5ff9b2210873c143\": rpc error: code = NotFound desc = could not find container \"ce169630ebe4cdb88376686306e3d44f94d11fe12008250b5ff9b2210873c143\": container with ID starting with ce169630ebe4cdb88376686306e3d44f94d11fe12008250b5ff9b2210873c143 not found: ID does not exist" Apr 24 15:08:52.386040 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:52.385983 2569 scope.go:117] "RemoveContainer" containerID="5111a24dcc159c389733b908d474cc495cceec55dc8be34349ef8e1a9c689f76" Apr 24 15:08:52.386248 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:08:52.386232 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5111a24dcc159c389733b908d474cc495cceec55dc8be34349ef8e1a9c689f76\": container with ID starting with 5111a24dcc159c389733b908d474cc495cceec55dc8be34349ef8e1a9c689f76 not found: ID does not exist" containerID="5111a24dcc159c389733b908d474cc495cceec55dc8be34349ef8e1a9c689f76" Apr 24 15:08:52.386295 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:52.386252 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5111a24dcc159c389733b908d474cc495cceec55dc8be34349ef8e1a9c689f76"} err="failed to get container status \"5111a24dcc159c389733b908d474cc495cceec55dc8be34349ef8e1a9c689f76\": rpc error: code = NotFound desc = could not find container \"5111a24dcc159c389733b908d474cc495cceec55dc8be34349ef8e1a9c689f76\": container with ID starting with 5111a24dcc159c389733b908d474cc495cceec55dc8be34349ef8e1a9c689f76 not found: ID does not exist" Apr 24 15:08:52.389611 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:52.389588 2569 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49"] Apr 24 15:08:52.393166 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:52.393147 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8qm49"] Apr 24 15:08:53.826980 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:08:53.826943 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7276100c-d4d7-482d-8f37-cfc545d81d13" path="/var/lib/kubelet/pods/7276100c-d4d7-482d-8f37-cfc545d81d13/volumes" Apr 24 15:09:00.363051 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:09:00.362964 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" podUID="5e3c6478-4747-4e25-b50e-5d32a2fc32ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 15:09:07.858490 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:09:07.858456 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 15:09:07.866022 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:09:07.865999 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 15:09:10.363624 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:09:10.363582 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" podUID="5e3c6478-4747-4e25-b50e-5d32a2fc32ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 15:09:20.363390 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:09:20.363341 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" podUID="5e3c6478-4747-4e25-b50e-5d32a2fc32ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 15:09:30.363792 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:09:30.363736 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" podUID="5e3c6478-4747-4e25-b50e-5d32a2fc32ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 15:09:40.363410 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:09:40.363363 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" podUID="5e3c6478-4747-4e25-b50e-5d32a2fc32ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 15:09:50.363038 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:09:50.362988 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" podUID="5e3c6478-4747-4e25-b50e-5d32a2fc32ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 24 15:10:00.364092 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:00.364058 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" Apr 24 
15:10:04.478414 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:04.478375 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx"] Apr 24 15:10:04.478830 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:04.478730 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" podUID="5e3c6478-4747-4e25-b50e-5d32a2fc32ea" containerName="kserve-container" containerID="cri-o://6a67ef78abd67bc71b32bda473bbf42b5bf59b597a265ac9beb1e6ea9c355584" gracePeriod=30 Apr 24 15:10:04.536282 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:04.536242 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk"] Apr 24 15:10:04.536616 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:04.536603 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7276100c-d4d7-482d-8f37-cfc545d81d13" containerName="storage-initializer" Apr 24 15:10:04.536665 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:04.536617 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7276100c-d4d7-482d-8f37-cfc545d81d13" containerName="storage-initializer" Apr 24 15:10:04.536665 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:04.536627 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7276100c-d4d7-482d-8f37-cfc545d81d13" containerName="kserve-container" Apr 24 15:10:04.536665 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:04.536632 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7276100c-d4d7-482d-8f37-cfc545d81d13" containerName="kserve-container" Apr 24 15:10:04.536767 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:04.536687 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7276100c-d4d7-482d-8f37-cfc545d81d13" containerName="kserve-container" Apr 24 15:10:04.538691 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:04.538669 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" Apr 24 15:10:04.546247 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:04.546216 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk"] Apr 24 15:10:04.633249 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:04.633215 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5323fef3-be31-4fba-b641-6b86de166464-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk\" (UID: \"5323fef3-be31-4fba-b641-6b86de166464\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" Apr 24 15:10:04.734343 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:04.734249 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5323fef3-be31-4fba-b641-6b86de166464-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk\" (UID: \"5323fef3-be31-4fba-b641-6b86de166464\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" Apr 24 15:10:04.734635 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:04.734613 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5323fef3-be31-4fba-b641-6b86de166464-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk\" (UID: \"5323fef3-be31-4fba-b641-6b86de166464\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" Apr 24 15:10:04.849731 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:04.849697 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" Apr 24 15:10:04.969280 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:04.969249 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk"] Apr 24 15:10:04.973282 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:10:04.973254 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5323fef3_be31_4fba_b641_6b86de166464.slice/crio-9c4a5a1c00f13f398525055b9ed7971390c87a7a31c719925d2b5d104c162083 WatchSource:0}: Error finding container 9c4a5a1c00f13f398525055b9ed7971390c87a7a31c719925d2b5d104c162083: Status 404 returned error can't find the container with id 9c4a5a1c00f13f398525055b9ed7971390c87a7a31c719925d2b5d104c162083 Apr 24 15:10:05.593918 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:05.593859 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" event={"ID":"5323fef3-be31-4fba-b641-6b86de166464","Type":"ContainerStarted","Data":"4cbc28e7ac7f767192e6277f5df5bad5c411d86e2bdd21cd2b17008e3c047130"} Apr 24 15:10:05.594357 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:05.593928 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" event={"ID":"5323fef3-be31-4fba-b641-6b86de166464","Type":"ContainerStarted","Data":"9c4a5a1c00f13f398525055b9ed7971390c87a7a31c719925d2b5d104c162083"} Apr 24 15:10:09.126100 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.126072 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" Apr 24 15:10:09.272042 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.271953 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e3c6478-4747-4e25-b50e-5d32a2fc32ea-kserve-provision-location\") pod \"5e3c6478-4747-4e25-b50e-5d32a2fc32ea\" (UID: \"5e3c6478-4747-4e25-b50e-5d32a2fc32ea\") " Apr 24 15:10:09.272294 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.272267 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e3c6478-4747-4e25-b50e-5d32a2fc32ea-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5e3c6478-4747-4e25-b50e-5d32a2fc32ea" (UID: "5e3c6478-4747-4e25-b50e-5d32a2fc32ea"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:10:09.372604 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.372560 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e3c6478-4747-4e25-b50e-5d32a2fc32ea-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:10:09.607320 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.607228 2569 generic.go:358] "Generic (PLEG): container finished" podID="5e3c6478-4747-4e25-b50e-5d32a2fc32ea" containerID="6a67ef78abd67bc71b32bda473bbf42b5bf59b597a265ac9beb1e6ea9c355584" exitCode=0 Apr 24 15:10:09.607320 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.607291 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" Apr 24 15:10:09.607556 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.607315 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" event={"ID":"5e3c6478-4747-4e25-b50e-5d32a2fc32ea","Type":"ContainerDied","Data":"6a67ef78abd67bc71b32bda473bbf42b5bf59b597a265ac9beb1e6ea9c355584"} Apr 24 15:10:09.607556 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.607366 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx" event={"ID":"5e3c6478-4747-4e25-b50e-5d32a2fc32ea","Type":"ContainerDied","Data":"125cf368ebcd36ecaf950237d0b68e4367b935ce1870646ae60dd8655ad248bc"} Apr 24 15:10:09.607556 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.607387 2569 scope.go:117] "RemoveContainer" containerID="6a67ef78abd67bc71b32bda473bbf42b5bf59b597a265ac9beb1e6ea9c355584" Apr 24 15:10:09.608655 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.608634 2569 generic.go:358] "Generic (PLEG): container finished" podID="5323fef3-be31-4fba-b641-6b86de166464" containerID="4cbc28e7ac7f767192e6277f5df5bad5c411d86e2bdd21cd2b17008e3c047130" exitCode=0 Apr 24 15:10:09.608748 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.608702 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" event={"ID":"5323fef3-be31-4fba-b641-6b86de166464","Type":"ContainerDied","Data":"4cbc28e7ac7f767192e6277f5df5bad5c411d86e2bdd21cd2b17008e3c047130"} Apr 24 15:10:09.617082 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.616913 2569 scope.go:117] "RemoveContainer" containerID="21145fcf81fe588d3ab948e4700dd2cf581e04686d3ed5e650342a075675b9d8" Apr 24 15:10:09.624601 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.624580 2569 scope.go:117] "RemoveContainer" containerID="6a67ef78abd67bc71b32bda473bbf42b5bf59b597a265ac9beb1e6ea9c355584" Apr 24 15:10:09.624974 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:10:09.624954 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a67ef78abd67bc71b32bda473bbf42b5bf59b597a265ac9beb1e6ea9c355584\": container with ID starting with 6a67ef78abd67bc71b32bda473bbf42b5bf59b597a265ac9beb1e6ea9c355584 not found: ID does not exist" containerID="6a67ef78abd67bc71b32bda473bbf42b5bf59b597a265ac9beb1e6ea9c355584" Apr 24 15:10:09.625051 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.624984 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a67ef78abd67bc71b32bda473bbf42b5bf59b597a265ac9beb1e6ea9c355584"} err="failed to get container status \"6a67ef78abd67bc71b32bda473bbf42b5bf59b597a265ac9beb1e6ea9c355584\": rpc error: code = NotFound desc = could not find container \"6a67ef78abd67bc71b32bda473bbf42b5bf59b597a265ac9beb1e6ea9c355584\": container with ID starting with 6a67ef78abd67bc71b32bda473bbf42b5bf59b597a265ac9beb1e6ea9c355584 not found: ID does not exist" Apr 24 15:10:09.625051 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.625010 2569 scope.go:117] "RemoveContainer" containerID="21145fcf81fe588d3ab948e4700dd2cf581e04686d3ed5e650342a075675b9d8" Apr 24 15:10:09.625315 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:10:09.625286 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"21145fcf81fe588d3ab948e4700dd2cf581e04686d3ed5e650342a075675b9d8\": container with ID starting with 21145fcf81fe588d3ab948e4700dd2cf581e04686d3ed5e650342a075675b9d8 not found: ID does not exist" containerID="21145fcf81fe588d3ab948e4700dd2cf581e04686d3ed5e650342a075675b9d8" Apr 24 15:10:09.625404 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.625322 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21145fcf81fe588d3ab948e4700dd2cf581e04686d3ed5e650342a075675b9d8"} err="failed to get container status \"21145fcf81fe588d3ab948e4700dd2cf581e04686d3ed5e650342a075675b9d8\": rpc error: code = NotFound desc = could not find container \"21145fcf81fe588d3ab948e4700dd2cf581e04686d3ed5e650342a075675b9d8\": container with ID starting with 21145fcf81fe588d3ab948e4700dd2cf581e04686d3ed5e650342a075675b9d8 not found: ID does not exist" Apr 24 15:10:09.635720 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.635693 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx"] Apr 24 15:10:09.637407 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.637385 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9f94dd8ff-64xbx"] Apr 24 15:10:09.827384 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:09.827343 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e3c6478-4747-4e25-b50e-5d32a2fc32ea" path="/var/lib/kubelet/pods/5e3c6478-4747-4e25-b50e-5d32a2fc32ea/volumes" Apr 24 15:10:10.614176 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:10.614141 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" event={"ID":"5323fef3-be31-4fba-b641-6b86de166464","Type":"ContainerStarted","Data":"bbcaec0d14e1c093c034e692ce8ad2e5de498ec2822ae655281e30aafcb8708e"} Apr 24 15:10:10.614570 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:10.614423 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" Apr 24 15:10:10.615652 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:10.615625 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" podUID="5323fef3-be31-4fba-b641-6b86de166464" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 24 15:10:10.629366 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:10.629313 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" podStartSLOduration=6.629300345 podStartE2EDuration="6.629300345s" podCreationTimestamp="2026-04-24 15:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:10:10.627792212 +0000 UTC m=+2763.421849113" watchObservedRunningTime="2026-04-24 15:10:10.629300345 +0000 UTC m=+2763.423357245" Apr 24 15:10:11.618470 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:11.618428 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" podUID="5323fef3-be31-4fba-b641-6b86de166464" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 24 
15:10:21.618990 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:21.618934 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" podUID="5323fef3-be31-4fba-b641-6b86de166464" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 24 15:10:31.618543 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:31.618452 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" podUID="5323fef3-be31-4fba-b641-6b86de166464" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 24 15:10:41.618591 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:41.618540 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" podUID="5323fef3-be31-4fba-b641-6b86de166464" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 24 15:10:51.618745 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:10:51.618693 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" podUID="5323fef3-be31-4fba-b641-6b86de166464" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 24 15:11:01.618689 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:01.618643 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" podUID="5323fef3-be31-4fba-b641-6b86de166464" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 24 15:11:11.619258 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:11.619205 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" podUID="5323fef3-be31-4fba-b641-6b86de166464" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 24 15:11:21.619961 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:21.619925 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" Apr 24 15:11:24.674410 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:24.674375 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk"] Apr 24 15:11:24.674916 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:24.674624 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" podUID="5323fef3-be31-4fba-b641-6b86de166464" containerName="kserve-container" containerID="cri-o://bbcaec0d14e1c093c034e692ce8ad2e5de498ec2822ae655281e30aafcb8708e" gracePeriod=30 Apr 24 15:11:24.722560 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:24.722516 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8"] Apr 24 15:11:24.722923 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:24.722878 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e3c6478-4747-4e25-b50e-5d32a2fc32ea" 
containerName="kserve-container" Apr 24 15:11:24.722923 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:24.722911 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e3c6478-4747-4e25-b50e-5d32a2fc32ea" containerName="kserve-container" Apr 24 15:11:24.723021 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:24.722926 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e3c6478-4747-4e25-b50e-5d32a2fc32ea" containerName="storage-initializer" Apr 24 15:11:24.723021 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:24.722932 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e3c6478-4747-4e25-b50e-5d32a2fc32ea" containerName="storage-initializer" Apr 24 15:11:24.723021 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:24.722988 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e3c6478-4747-4e25-b50e-5d32a2fc32ea" containerName="kserve-container" Apr 24 15:11:24.726058 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:24.726040 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" Apr 24 15:11:24.732403 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:24.732304 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8"] Apr 24 15:11:24.771466 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:24.771427 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d3d6538-ed15-497a-a172-d6b084ebdfd2-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-cp9b8\" (UID: \"3d3d6538-ed15-497a-a172-d6b084ebdfd2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" Apr 24 15:11:24.872061 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:24.872019 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d3d6538-ed15-497a-a172-d6b084ebdfd2-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-cp9b8\" (UID: \"3d3d6538-ed15-497a-a172-d6b084ebdfd2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" Apr 24 15:11:24.872414 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:24.872393 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d3d6538-ed15-497a-a172-d6b084ebdfd2-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-cp9b8\" (UID: \"3d3d6538-ed15-497a-a172-d6b084ebdfd2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" Apr 24 15:11:25.037968 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:25.037830 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" Apr 24 15:11:25.158040 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:25.158006 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8"] Apr 24 15:11:25.161420 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:11:25.161382 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d3d6538_ed15_497a_a172_d6b084ebdfd2.slice/crio-63120cfdf50b264c924f973aa3f24f76b1ee37906907c07778b74ed9c6616a11 WatchSource:0}: Error finding container 63120cfdf50b264c924f973aa3f24f76b1ee37906907c07778b74ed9c6616a11: Status 404 returned error can't find the container with id 63120cfdf50b264c924f973aa3f24f76b1ee37906907c07778b74ed9c6616a11 Apr 24 15:11:25.163340 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:25.163322 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 15:11:25.840788 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:25.840746 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" event={"ID":"3d3d6538-ed15-497a-a172-d6b084ebdfd2","Type":"ContainerStarted","Data":"10af17cc31acaa07ad91eda6267b5595806ab733b495519eac306a72e2e9d104"} Apr 24 15:11:25.840788 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:25.840788 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" event={"ID":"3d3d6538-ed15-497a-a172-d6b084ebdfd2","Type":"ContainerStarted","Data":"63120cfdf50b264c924f973aa3f24f76b1ee37906907c07778b74ed9c6616a11"} Apr 24 15:11:29.429677 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:29.429648 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" Apr 24 15:11:29.509288 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:29.509194 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5323fef3-be31-4fba-b641-6b86de166464-kserve-provision-location\") pod \"5323fef3-be31-4fba-b641-6b86de166464\" (UID: \"5323fef3-be31-4fba-b641-6b86de166464\") " Apr 24 15:11:29.509750 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:29.509720 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5323fef3-be31-4fba-b641-6b86de166464-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5323fef3-be31-4fba-b641-6b86de166464" (UID: "5323fef3-be31-4fba-b641-6b86de166464"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:11:29.609803 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:29.609768 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5323fef3-be31-4fba-b641-6b86de166464-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:11:29.854317 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:29.854290 2569 generic.go:358] "Generic (PLEG): container finished" podID="5323fef3-be31-4fba-b641-6b86de166464" containerID="bbcaec0d14e1c093c034e692ce8ad2e5de498ec2822ae655281e30aafcb8708e" exitCode=0 Apr 24 15:11:29.854525 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:29.854352 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" event={"ID":"5323fef3-be31-4fba-b641-6b86de166464","Type":"ContainerDied","Data":"bbcaec0d14e1c093c034e692ce8ad2e5de498ec2822ae655281e30aafcb8708e"} Apr 24 15:11:29.854525 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:29.854357 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" Apr 24 15:11:29.854525 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:29.854379 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk" event={"ID":"5323fef3-be31-4fba-b641-6b86de166464","Type":"ContainerDied","Data":"9c4a5a1c00f13f398525055b9ed7971390c87a7a31c719925d2b5d104c162083"} Apr 24 15:11:29.854525 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:29.854397 2569 scope.go:117] "RemoveContainer" containerID="bbcaec0d14e1c093c034e692ce8ad2e5de498ec2822ae655281e30aafcb8708e" Apr 24 15:11:29.862442 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:29.862414 2569 scope.go:117] "RemoveContainer" containerID="4cbc28e7ac7f767192e6277f5df5bad5c411d86e2bdd21cd2b17008e3c047130" Apr 24 15:11:29.870187 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:29.870169 2569 scope.go:117] "RemoveContainer" containerID="bbcaec0d14e1c093c034e692ce8ad2e5de498ec2822ae655281e30aafcb8708e" Apr 24 15:11:29.870278 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:29.870235 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-69777647bd-vszdk"] Apr 24 15:11:29.870499 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:11:29.870479 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbcaec0d14e1c093c034e692ce8ad2e5de498ec2822ae655281e30aafcb8708e\": container with ID starting with bbcaec0d14e1c093c034e692ce8ad2e5de498ec2822ae655281e30aafcb8708e not found: ID does not exist" containerID="bbcaec0d14e1c093c034e692ce8ad2e5de498ec2822ae655281e30aafcb8708e" Apr 24 15:11:29.870549 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:29.870509 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbcaec0d14e1c093c034e692ce8ad2e5de498ec2822ae655281e30aafcb8708e"} err="failed to get container status \"bbcaec0d14e1c093c034e692ce8ad2e5de498ec2822ae655281e30aafcb8708e\": rpc error: code = NotFound desc = could not find container \"bbcaec0d14e1c093c034e692ce8ad2e5de498ec2822ae655281e30aafcb8708e\": container with ID starting with bbcaec0d14e1c093c034e692ce8ad2e5de498ec2822ae655281e30aafcb8708e not found: ID does not exist" Apr 24 15:11:29.870549 
Apr 24 15:11:30.858459 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:30.858427 2569 generic.go:358] "Generic (PLEG): container finished" podID="3d3d6538-ed15-497a-a172-d6b084ebdfd2" containerID="10af17cc31acaa07ad91eda6267b5595806ab733b495519eac306a72e2e9d104" exitCode=0
Apr 24 15:11:30.858949 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:30.858499 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" event={"ID":"3d3d6538-ed15-497a-a172-d6b084ebdfd2","Type":"ContainerDied","Data":"10af17cc31acaa07ad91eda6267b5595806ab733b495519eac306a72e2e9d104"}
Apr 24 15:11:31.829351 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:31.829301 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5323fef3-be31-4fba-b641-6b86de166464" path="/var/lib/kubelet/pods/5323fef3-be31-4fba-b641-6b86de166464/volumes"
Apr 24 15:11:34.874524 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:34.874485 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" event={"ID":"3d3d6538-ed15-497a-a172-d6b084ebdfd2","Type":"ContainerStarted","Data":"95269eb1b5621687ed35fac36e5f9664a1dfc33b0f3085e973c72562b235ab59"}
Apr 24 15:11:34.874990 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:34.874777 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8"
Apr 24 15:11:34.876342 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:34.876315 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" podUID="3d3d6538-ed15-497a-a172-d6b084ebdfd2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 24 15:11:34.890487 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:34.890432 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" podStartSLOduration=7.10584686 podStartE2EDuration="10.890417244s" podCreationTimestamp="2026-04-24 15:11:24 +0000 UTC" firstStartedPulling="2026-04-24 15:11:30.862076876 +0000 UTC m=+2843.656133755" lastFinishedPulling="2026-04-24 15:11:34.646647258 +0000 UTC m=+2847.440704139" observedRunningTime="2026-04-24 15:11:34.890245703 +0000 UTC m=+2847.684302638" watchObservedRunningTime="2026-04-24 15:11:34.890417244 +0000 UTC m=+2847.684474147"
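The "Observed pod startup duration" record above appears to decompose as: end-to-end duration (pod creation to observed running) minus the image-pull window equals the SLO duration; when no pull was needed, the pull timestamps are the zero sentinel 0001-01-01 00:00:00 +0000 UTC and the two durations coincide, as in the earlier sklearn-v2-mixed record. A worked check against the tensorflow-predictor numbers; timestamps are truncated to microseconds, and the field relationship is inferred from these records rather than from kubelet documentation:

from datetime import datetime

def ts(s: str) -> datetime:
    return datetime.strptime(s, "%Y-%m-%d %H:%M:%S.%f")

created   = ts("2026-04-24 15:11:24.000000")  # podCreationTimestamp
pull_from = ts("2026-04-24 15:11:30.862076")  # firstStartedPulling
pull_to   = ts("2026-04-24 15:11:34.646647")  # lastFinishedPulling
running   = ts("2026-04-24 15:11:34.890417")  # watchObservedRunningTime

e2e  = (running - created).total_seconds()   # 10.890417 ~ podStartE2EDuration
pull = (pull_to - pull_from).total_seconds() # 3.784571s image-pull window
print(f"SLO duration = {e2e - pull:.6f}s")   # 7.105846 ~ podStartSLOduration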
pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" podStartSLOduration=7.10584686 podStartE2EDuration="10.890417244s" podCreationTimestamp="2026-04-24 15:11:24 +0000 UTC" firstStartedPulling="2026-04-24 15:11:30.862076876 +0000 UTC m=+2843.656133755" lastFinishedPulling="2026-04-24 15:11:34.646647258 +0000 UTC m=+2847.440704139" observedRunningTime="2026-04-24 15:11:34.890245703 +0000 UTC m=+2847.684302638" watchObservedRunningTime="2026-04-24 15:11:34.890417244 +0000 UTC m=+2847.684474147" Apr 24 15:11:35.878419 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:35.878372 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" podUID="3d3d6538-ed15-497a-a172-d6b084ebdfd2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 24 15:11:45.878999 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:45.878944 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" podUID="3d3d6538-ed15-497a-a172-d6b084ebdfd2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 24 15:11:55.879360 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:11:55.879331 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" Apr 24 15:12:16.194991 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:16.194952 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8"] Apr 24 15:12:16.195414 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:16.195346 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" podUID="3d3d6538-ed15-497a-a172-d6b084ebdfd2" containerName="kserve-container" containerID="cri-o://95269eb1b5621687ed35fac36e5f9664a1dfc33b0f3085e973c72562b235ab59" gracePeriod=30 Apr 24 15:12:16.262145 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:16.259547 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6"] Apr 24 15:12:16.262145 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:16.260260 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5323fef3-be31-4fba-b641-6b86de166464" containerName="storage-initializer" Apr 24 15:12:16.262145 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:16.260282 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5323fef3-be31-4fba-b641-6b86de166464" containerName="storage-initializer" Apr 24 15:12:16.262145 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:16.260325 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5323fef3-be31-4fba-b641-6b86de166464" containerName="kserve-container" Apr 24 15:12:16.262145 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:16.260334 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5323fef3-be31-4fba-b641-6b86de166464" containerName="kserve-container" Apr 24 15:12:16.262145 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:16.260469 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5323fef3-be31-4fba-b641-6b86de166464" containerName="kserve-container" Apr 24 15:12:16.264453 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:16.264427 2569 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" Apr 24 15:12:16.268005 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:16.267978 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6"] Apr 24 15:12:16.316802 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:16.316764 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/346e0e01-8a4b-47b3-812f-0b5d36a7e508-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6\" (UID: \"346e0e01-8a4b-47b3-812f-0b5d36a7e508\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" Apr 24 15:12:16.418073 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:16.418036 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/346e0e01-8a4b-47b3-812f-0b5d36a7e508-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6\" (UID: \"346e0e01-8a4b-47b3-812f-0b5d36a7e508\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" Apr 24 15:12:16.418469 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:16.418442 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/346e0e01-8a4b-47b3-812f-0b5d36a7e508-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6\" (UID: \"346e0e01-8a4b-47b3-812f-0b5d36a7e508\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" Apr 24 15:12:16.576342 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:16.576249 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" Apr 24 15:12:16.698983 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:16.698957 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6"] Apr 24 15:12:16.701667 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:12:16.701627 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod346e0e01_8a4b_47b3_812f_0b5d36a7e508.slice/crio-94b5629acc5cf20dfebad2738e32c2ac277612cc55d31f87c2670912ea06d01e WatchSource:0}: Error finding container 94b5629acc5cf20dfebad2738e32c2ac277612cc55d31f87c2670912ea06d01e: Status 404 returned error can't find the container with id 94b5629acc5cf20dfebad2738e32c2ac277612cc55d31f87c2670912ea06d01e Apr 24 15:12:17.000949 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:17.000906 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" event={"ID":"346e0e01-8a4b-47b3-812f-0b5d36a7e508","Type":"ContainerStarted","Data":"f6b493aea603d838954f84315d1ec8738b30304f7bd45788126b873423d4508d"} Apr 24 15:12:17.000949 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:17.000954 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" event={"ID":"346e0e01-8a4b-47b3-812f-0b5d36a7e508","Type":"ContainerStarted","Data":"94b5629acc5cf20dfebad2738e32c2ac277612cc55d31f87c2670912ea06d01e"} Apr 24 15:12:21.013153 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:21.013119 2569 generic.go:358] "Generic (PLEG): container finished" podID="346e0e01-8a4b-47b3-812f-0b5d36a7e508" containerID="f6b493aea603d838954f84315d1ec8738b30304f7bd45788126b873423d4508d" exitCode=0 Apr 24 15:12:21.013537 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:21.013191 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" event={"ID":"346e0e01-8a4b-47b3-812f-0b5d36a7e508","Type":"ContainerDied","Data":"f6b493aea603d838954f84315d1ec8738b30304f7bd45788126b873423d4508d"} Apr 24 15:12:22.017750 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:22.017712 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" event={"ID":"346e0e01-8a4b-47b3-812f-0b5d36a7e508","Type":"ContainerStarted","Data":"a7ba5f2394054f3d7cd555b138284fdd465999e259e8de2c61b27218d9b47b0e"} Apr 24 15:12:22.018160 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:22.018077 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" Apr 24 15:12:22.019377 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:22.019346 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" podUID="346e0e01-8a4b-47b3-812f-0b5d36a7e508" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 24 15:12:22.034976 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:22.034927 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" podStartSLOduration=6.034913177 podStartE2EDuration="6.034913177s" podCreationTimestamp="2026-04-24 
15:12:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:12:22.032784546 +0000 UTC m=+2894.826841530" watchObservedRunningTime="2026-04-24 15:12:22.034913177 +0000 UTC m=+2894.828970069" Apr 24 15:12:23.021419 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:23.021373 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" podUID="346e0e01-8a4b-47b3-812f-0b5d36a7e508" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 24 15:12:33.022621 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:33.022585 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" Apr 24 15:12:46.831403 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:46.831371 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" Apr 24 15:12:46.978804 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:46.978688 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d3d6538-ed15-497a-a172-d6b084ebdfd2-kserve-provision-location\") pod \"3d3d6538-ed15-497a-a172-d6b084ebdfd2\" (UID: \"3d3d6538-ed15-497a-a172-d6b084ebdfd2\") " Apr 24 15:12:46.989815 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:46.989778 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d3d6538-ed15-497a-a172-d6b084ebdfd2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3d3d6538-ed15-497a-a172-d6b084ebdfd2" (UID: "3d3d6538-ed15-497a-a172-d6b084ebdfd2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:12:47.079590 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:47.079554 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d3d6538-ed15-497a-a172-d6b084ebdfd2-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:12:47.094282 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:47.094247 2569 generic.go:358] "Generic (PLEG): container finished" podID="3d3d6538-ed15-497a-a172-d6b084ebdfd2" containerID="95269eb1b5621687ed35fac36e5f9664a1dfc33b0f3085e973c72562b235ab59" exitCode=137 Apr 24 15:12:47.094463 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:47.094323 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" Apr 24 15:12:47.094463 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:47.094335 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" event={"ID":"3d3d6538-ed15-497a-a172-d6b084ebdfd2","Type":"ContainerDied","Data":"95269eb1b5621687ed35fac36e5f9664a1dfc33b0f3085e973c72562b235ab59"} Apr 24 15:12:47.094463 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:47.094375 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8" event={"ID":"3d3d6538-ed15-497a-a172-d6b084ebdfd2","Type":"ContainerDied","Data":"63120cfdf50b264c924f973aa3f24f76b1ee37906907c07778b74ed9c6616a11"} Apr 24 15:12:47.094463 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:47.094391 2569 scope.go:117] "RemoveContainer" containerID="95269eb1b5621687ed35fac36e5f9664a1dfc33b0f3085e973c72562b235ab59" Apr 24 15:12:47.103327 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:47.103267 2569 scope.go:117] "RemoveContainer" containerID="10af17cc31acaa07ad91eda6267b5595806ab733b495519eac306a72e2e9d104" Apr 24 15:12:47.110888 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:47.110872 2569 scope.go:117] "RemoveContainer" containerID="95269eb1b5621687ed35fac36e5f9664a1dfc33b0f3085e973c72562b235ab59" Apr 24 15:12:47.111178 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:12:47.111161 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95269eb1b5621687ed35fac36e5f9664a1dfc33b0f3085e973c72562b235ab59\": container with ID starting with 95269eb1b5621687ed35fac36e5f9664a1dfc33b0f3085e973c72562b235ab59 not found: ID does not exist" containerID="95269eb1b5621687ed35fac36e5f9664a1dfc33b0f3085e973c72562b235ab59" Apr 24 15:12:47.111239 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:47.111186 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95269eb1b5621687ed35fac36e5f9664a1dfc33b0f3085e973c72562b235ab59"} err="failed to get container status \"95269eb1b5621687ed35fac36e5f9664a1dfc33b0f3085e973c72562b235ab59\": rpc error: code = NotFound desc = could not find container \"95269eb1b5621687ed35fac36e5f9664a1dfc33b0f3085e973c72562b235ab59\": container with ID starting with 95269eb1b5621687ed35fac36e5f9664a1dfc33b0f3085e973c72562b235ab59 not found: ID does not exist" Apr 24 15:12:47.111239 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:47.111213 2569 scope.go:117] "RemoveContainer" containerID="10af17cc31acaa07ad91eda6267b5595806ab733b495519eac306a72e2e9d104" Apr 24 15:12:47.111449 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:12:47.111436 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10af17cc31acaa07ad91eda6267b5595806ab733b495519eac306a72e2e9d104\": container with ID starting with 10af17cc31acaa07ad91eda6267b5595806ab733b495519eac306a72e2e9d104 not found: ID does not exist" containerID="10af17cc31acaa07ad91eda6267b5595806ab733b495519eac306a72e2e9d104" Apr 24 15:12:47.111491 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:47.111452 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10af17cc31acaa07ad91eda6267b5595806ab733b495519eac306a72e2e9d104"} err="failed to get container status \"10af17cc31acaa07ad91eda6267b5595806ab733b495519eac306a72e2e9d104\": rpc error: code = NotFound 
Apr 24 15:12:47.120035 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:47.120006 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8"]
Apr 24 15:12:47.127006 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:47.126978 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cp9b8"]
Apr 24 15:12:47.827009 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:47.826979 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d3d6538-ed15-497a-a172-d6b084ebdfd2" path="/var/lib/kubelet/pods/3d3d6538-ed15-497a-a172-d6b084ebdfd2/volumes"
Apr 24 15:12:48.006850 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:48.006813 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6"]
Apr 24 15:12:48.007389 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:48.007185 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" podUID="346e0e01-8a4b-47b3-812f-0b5d36a7e508" containerName="kserve-container" containerID="cri-o://a7ba5f2394054f3d7cd555b138284fdd465999e259e8de2c61b27218d9b47b0e" gracePeriod=30
Apr 24 15:12:48.067425 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:48.067390 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz"]
Apr 24 15:12:48.067707 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:48.067694 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d3d6538-ed15-497a-a172-d6b084ebdfd2" containerName="storage-initializer"
Apr 24 15:12:48.067751 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:48.067709 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3d6538-ed15-497a-a172-d6b084ebdfd2" containerName="storage-initializer"
Apr 24 15:12:48.067751 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:48.067729 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d3d6538-ed15-497a-a172-d6b084ebdfd2" containerName="kserve-container"
Apr 24 15:12:48.067751 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:48.067735 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3d6538-ed15-497a-a172-d6b084ebdfd2" containerName="kserve-container"
Apr 24 15:12:48.067841 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:48.067785 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d3d6538-ed15-497a-a172-d6b084ebdfd2" containerName="kserve-container"
Apr 24 15:12:48.070506 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:48.070485 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz"
Apr 24 15:12:48.077799 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:48.077491 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz"]
Apr 24 15:12:48.188937 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:48.188881 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dedb61f-56be-4ca0-afb9-17e91f11d64e-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-56tmz\" (UID: \"8dedb61f-56be-4ca0-afb9-17e91f11d64e\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz"
Apr 24 15:12:48.289574 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:48.289539 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dedb61f-56be-4ca0-afb9-17e91f11d64e-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-56tmz\" (UID: \"8dedb61f-56be-4ca0-afb9-17e91f11d64e\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz"
Apr 24 15:12:48.289951 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:48.289934 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dedb61f-56be-4ca0-afb9-17e91f11d64e-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-56tmz\" (UID: \"8dedb61f-56be-4ca0-afb9-17e91f11d64e\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz"
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz" Apr 24 15:12:48.499998 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:48.499913 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz"] Apr 24 15:12:48.506773 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:12:48.506743 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dedb61f_56be_4ca0_afb9_17e91f11d64e.slice/crio-74b2f1dae2aafc0012ae3b7fd083056f5ef50da9b7b179b5d5ca017e40709474 WatchSource:0}: Error finding container 74b2f1dae2aafc0012ae3b7fd083056f5ef50da9b7b179b5d5ca017e40709474: Status 404 returned error can't find the container with id 74b2f1dae2aafc0012ae3b7fd083056f5ef50da9b7b179b5d5ca017e40709474 Apr 24 15:12:49.102597 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:49.102558 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz" event={"ID":"8dedb61f-56be-4ca0-afb9-17e91f11d64e","Type":"ContainerStarted","Data":"67268467f8695b5c203bac38dd2fdcd66210b31d9146c0269ed7e6e4bc0d04f6"} Apr 24 15:12:49.102597 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:49.102594 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz" event={"ID":"8dedb61f-56be-4ca0-afb9-17e91f11d64e","Type":"ContainerStarted","Data":"74b2f1dae2aafc0012ae3b7fd083056f5ef50da9b7b179b5d5ca017e40709474"} Apr 24 15:12:53.116986 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:53.116953 2569 generic.go:358] "Generic (PLEG): container finished" podID="8dedb61f-56be-4ca0-afb9-17e91f11d64e" containerID="67268467f8695b5c203bac38dd2fdcd66210b31d9146c0269ed7e6e4bc0d04f6" exitCode=0 Apr 24 15:12:53.117407 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:12:53.117029 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz" event={"ID":"8dedb61f-56be-4ca0-afb9-17e91f11d64e","Type":"ContainerDied","Data":"67268467f8695b5c203bac38dd2fdcd66210b31d9146c0269ed7e6e4bc0d04f6"} Apr 24 15:13:18.233252 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:13:18.233203 2569 generic.go:358] "Generic (PLEG): container finished" podID="346e0e01-8a4b-47b3-812f-0b5d36a7e508" containerID="a7ba5f2394054f3d7cd555b138284fdd465999e259e8de2c61b27218d9b47b0e" exitCode=137 Apr 24 15:13:18.233790 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:13:18.233383 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" event={"ID":"346e0e01-8a4b-47b3-812f-0b5d36a7e508","Type":"ContainerDied","Data":"a7ba5f2394054f3d7cd555b138284fdd465999e259e8de2c61b27218d9b47b0e"} Apr 24 15:13:18.803151 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:13:18.802987 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" Apr 24 15:13:18.888674 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:13:18.888627 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/346e0e01-8a4b-47b3-812f-0b5d36a7e508-kserve-provision-location\") pod \"346e0e01-8a4b-47b3-812f-0b5d36a7e508\" (UID: \"346e0e01-8a4b-47b3-812f-0b5d36a7e508\") " Apr 24 15:13:18.893609 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:13:18.893565 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/346e0e01-8a4b-47b3-812f-0b5d36a7e508-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "346e0e01-8a4b-47b3-812f-0b5d36a7e508" (UID: "346e0e01-8a4b-47b3-812f-0b5d36a7e508"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:13:18.989529 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:13:18.989490 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/346e0e01-8a4b-47b3-812f-0b5d36a7e508-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:13:19.238529 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:13:19.238495 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" event={"ID":"346e0e01-8a4b-47b3-812f-0b5d36a7e508","Type":"ContainerDied","Data":"94b5629acc5cf20dfebad2738e32c2ac277612cc55d31f87c2670912ea06d01e"} Apr 24 15:13:19.238529 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:13:19.238520 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6" Apr 24 15:13:19.238529 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:13:19.238545 2569 scope.go:117] "RemoveContainer" containerID="a7ba5f2394054f3d7cd555b138284fdd465999e259e8de2c61b27218d9b47b0e" Apr 24 15:13:19.250442 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:13:19.250416 2569 scope.go:117] "RemoveContainer" containerID="f6b493aea603d838954f84315d1ec8738b30304f7bd45788126b873423d4508d" Apr 24 15:13:19.268525 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:13:19.268496 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6"] Apr 24 15:13:19.276299 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:13:19.276272 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-xsxv6"] Apr 24 15:13:19.829689 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:13:19.829649 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="346e0e01-8a4b-47b3-812f-0b5d36a7e508" path="/var/lib/kubelet/pods/346e0e01-8a4b-47b3-812f-0b5d36a7e508/volumes" Apr 24 15:14:47.287762 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:14:47.287723 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 15:14:47.287762 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:14:47.287724 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 15:14:48.545490 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:14:48.545397 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz" event={"ID":"8dedb61f-56be-4ca0-afb9-17e91f11d64e","Type":"ContainerStarted","Data":"b9e18dc8f5ecc44ab428b78932ca6fb7a7ee67d424b1a4082a48c63781ca798d"} Apr 24 15:14:48.545929 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:14:48.545596 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz" Apr 24 15:14:48.546883 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:14:48.546856 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz" podUID="8dedb61f-56be-4ca0-afb9-17e91f11d64e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 24 15:14:48.562444 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:14:48.562363 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz" podStartSLOduration=5.501628741 podStartE2EDuration="2m0.562334302s" podCreationTimestamp="2026-04-24 15:12:48 +0000 UTC" firstStartedPulling="2026-04-24 15:12:53.118105783 +0000 UTC m=+2925.912162662" lastFinishedPulling="2026-04-24 15:14:48.178811344 +0000 UTC m=+3040.972868223" observedRunningTime="2026-04-24 15:14:48.558787518 +0000 UTC m=+3041.352844419" watchObservedRunningTime="2026-04-24 15:14:48.562334302 +0000 UTC m=+3041.356391206" Apr 24 15:14:49.549206 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:14:49.549155 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz" podUID="8dedb61f-56be-4ca0-afb9-17e91f11d64e" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 24 15:14:59.550015 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:14:59.549944 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz" Apr 24 15:15:09.652294 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:09.652259 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz"] Apr 24 15:15:09.652713 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:09.652534 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz" podUID="8dedb61f-56be-4ca0-afb9-17e91f11d64e" containerName="kserve-container" containerID="cri-o://b9e18dc8f5ecc44ab428b78932ca6fb7a7ee67d424b1a4082a48c63781ca798d" gracePeriod=30 Apr 24 15:15:09.715818 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:09.715780 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f"] Apr 24 15:15:09.716124 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:09.716112 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="346e0e01-8a4b-47b3-812f-0b5d36a7e508" containerName="storage-initializer" Apr 24 15:15:09.716177 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:09.716126 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="346e0e01-8a4b-47b3-812f-0b5d36a7e508" containerName="storage-initializer" Apr 24 15:15:09.716177 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:09.716136 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="346e0e01-8a4b-47b3-812f-0b5d36a7e508" containerName="kserve-container" Apr 24 15:15:09.716177 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:09.716142 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="346e0e01-8a4b-47b3-812f-0b5d36a7e508" containerName="kserve-container" Apr 24 15:15:09.716283 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:09.716201 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="346e0e01-8a4b-47b3-812f-0b5d36a7e508" containerName="kserve-container" Apr 24 15:15:09.721032 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:09.721012 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" Apr 24 15:15:09.727369 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:09.727343 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f"] Apr 24 15:15:09.784418 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:09.784378 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-qw45f\" (UID: \"7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" Apr 24 15:15:09.885162 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:09.885077 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-qw45f\" (UID: \"7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" Apr 24 15:15:09.885464 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:09.885441 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-qw45f\" (UID: \"7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" Apr 24 15:15:10.031319 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:10.031278 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" Apr 24 15:15:10.189586 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:10.189553 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f"] Apr 24 15:15:10.195861 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:15:10.195828 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e1ad16b_9f6c_4e23_a8e3_d214fac8ac43.slice/crio-71c970a8d9da237ede4de58843a152619725cb7d40706b27cd899d5c5ed27814 WatchSource:0}: Error finding container 71c970a8d9da237ede4de58843a152619725cb7d40706b27cd899d5c5ed27814: Status 404 returned error can't find the container with id 71c970a8d9da237ede4de58843a152619725cb7d40706b27cd899d5c5ed27814 Apr 24 15:15:10.611323 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:10.611288 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" event={"ID":"7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43","Type":"ContainerStarted","Data":"97744b4179a82efae0d0193a5d5e00759002167b16670a2a2cadc903c8782479"} Apr 24 15:15:10.611323 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:10.611324 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" event={"ID":"7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43","Type":"ContainerStarted","Data":"71c970a8d9da237ede4de58843a152619725cb7d40706b27cd899d5c5ed27814"} Apr 24 15:15:12.089996 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:12.089970 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz" Apr 24 15:15:12.102991 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:12.102888 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dedb61f-56be-4ca0-afb9-17e91f11d64e-kserve-provision-location\") pod \"8dedb61f-56be-4ca0-afb9-17e91f11d64e\" (UID: \"8dedb61f-56be-4ca0-afb9-17e91f11d64e\") " Apr 24 15:15:12.103296 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:12.103274 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dedb61f-56be-4ca0-afb9-17e91f11d64e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8dedb61f-56be-4ca0-afb9-17e91f11d64e" (UID: "8dedb61f-56be-4ca0-afb9-17e91f11d64e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:15:12.204226 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:12.204191 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dedb61f-56be-4ca0-afb9-17e91f11d64e-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:15:12.619694 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:12.619593 2569 generic.go:358] "Generic (PLEG): container finished" podID="8dedb61f-56be-4ca0-afb9-17e91f11d64e" containerID="b9e18dc8f5ecc44ab428b78932ca6fb7a7ee67d424b1a4082a48c63781ca798d" exitCode=0 Apr 24 15:15:12.619694 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:12.619675 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz" Apr 24 15:15:12.619694 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:12.619678 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz" event={"ID":"8dedb61f-56be-4ca0-afb9-17e91f11d64e","Type":"ContainerDied","Data":"b9e18dc8f5ecc44ab428b78932ca6fb7a7ee67d424b1a4082a48c63781ca798d"} Apr 24 15:15:12.620019 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:12.619712 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz" event={"ID":"8dedb61f-56be-4ca0-afb9-17e91f11d64e","Type":"ContainerDied","Data":"74b2f1dae2aafc0012ae3b7fd083056f5ef50da9b7b179b5d5ca017e40709474"} Apr 24 15:15:12.620019 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:12.619728 2569 scope.go:117] "RemoveContainer" containerID="b9e18dc8f5ecc44ab428b78932ca6fb7a7ee67d424b1a4082a48c63781ca798d" Apr 24 15:15:12.631261 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:12.631241 2569 scope.go:117] "RemoveContainer" containerID="67268467f8695b5c203bac38dd2fdcd66210b31d9146c0269ed7e6e4bc0d04f6" Apr 24 15:15:12.639458 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:12.639438 2569 scope.go:117] "RemoveContainer" containerID="b9e18dc8f5ecc44ab428b78932ca6fb7a7ee67d424b1a4082a48c63781ca798d" Apr 24 15:15:12.639750 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:15:12.639730 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9e18dc8f5ecc44ab428b78932ca6fb7a7ee67d424b1a4082a48c63781ca798d\": container with ID starting with b9e18dc8f5ecc44ab428b78932ca6fb7a7ee67d424b1a4082a48c63781ca798d not found: ID does not exist" 
containerID="b9e18dc8f5ecc44ab428b78932ca6fb7a7ee67d424b1a4082a48c63781ca798d" Apr 24 15:15:12.639807 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:12.639761 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9e18dc8f5ecc44ab428b78932ca6fb7a7ee67d424b1a4082a48c63781ca798d"} err="failed to get container status \"b9e18dc8f5ecc44ab428b78932ca6fb7a7ee67d424b1a4082a48c63781ca798d\": rpc error: code = NotFound desc = could not find container \"b9e18dc8f5ecc44ab428b78932ca6fb7a7ee67d424b1a4082a48c63781ca798d\": container with ID starting with b9e18dc8f5ecc44ab428b78932ca6fb7a7ee67d424b1a4082a48c63781ca798d not found: ID does not exist" Apr 24 15:15:12.639807 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:12.639782 2569 scope.go:117] "RemoveContainer" containerID="67268467f8695b5c203bac38dd2fdcd66210b31d9146c0269ed7e6e4bc0d04f6" Apr 24 15:15:12.640068 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:15:12.640044 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67268467f8695b5c203bac38dd2fdcd66210b31d9146c0269ed7e6e4bc0d04f6\": container with ID starting with 67268467f8695b5c203bac38dd2fdcd66210b31d9146c0269ed7e6e4bc0d04f6 not found: ID does not exist" containerID="67268467f8695b5c203bac38dd2fdcd66210b31d9146c0269ed7e6e4bc0d04f6" Apr 24 15:15:12.640190 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:12.640074 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67268467f8695b5c203bac38dd2fdcd66210b31d9146c0269ed7e6e4bc0d04f6"} err="failed to get container status \"67268467f8695b5c203bac38dd2fdcd66210b31d9146c0269ed7e6e4bc0d04f6\": rpc error: code = NotFound desc = could not find container \"67268467f8695b5c203bac38dd2fdcd66210b31d9146c0269ed7e6e4bc0d04f6\": container with ID starting with 67268467f8695b5c203bac38dd2fdcd66210b31d9146c0269ed7e6e4bc0d04f6 not found: ID does not exist" Apr 24 15:15:12.642294 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:12.642271 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz"] Apr 24 15:15:12.645418 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:12.645397 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-56tmz"] Apr 24 15:15:13.827364 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:13.827324 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dedb61f-56be-4ca0-afb9-17e91f11d64e" path="/var/lib/kubelet/pods/8dedb61f-56be-4ca0-afb9-17e91f11d64e/volumes" Apr 24 15:15:14.627134 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:14.627102 2569 generic.go:358] "Generic (PLEG): container finished" podID="7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43" containerID="97744b4179a82efae0d0193a5d5e00759002167b16670a2a2cadc903c8782479" exitCode=0 Apr 24 15:15:14.627298 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:14.627151 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" event={"ID":"7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43","Type":"ContainerDied","Data":"97744b4179a82efae0d0193a5d5e00759002167b16670a2a2cadc903c8782479"} Apr 24 15:15:38.707720 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:38.707687 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" 
event={"ID":"7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43","Type":"ContainerStarted","Data":"0c1ff90bef223466fa9a9800f788a6f6738a1574f319116f5384f7368f83c32b"} Apr 24 15:15:38.708148 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:38.707966 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" Apr 24 15:15:38.709178 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:38.709154 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" podUID="7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 24 15:15:38.725100 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:38.725048 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" podStartSLOduration=6.522386816 podStartE2EDuration="29.725032804s" podCreationTimestamp="2026-04-24 15:15:09 +0000 UTC" firstStartedPulling="2026-04-24 15:15:14.628377208 +0000 UTC m=+3067.422434088" lastFinishedPulling="2026-04-24 15:15:37.831023187 +0000 UTC m=+3090.625080076" observedRunningTime="2026-04-24 15:15:38.723882064 +0000 UTC m=+3091.517938977" watchObservedRunningTime="2026-04-24 15:15:38.725032804 +0000 UTC m=+3091.519089705" Apr 24 15:15:39.711125 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:39.711083 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" podUID="7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 24 15:15:49.711984 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:49.711935 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" podUID="7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 24 15:15:59.711457 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:15:59.711415 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" podUID="7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 24 15:16:09.711869 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:09.711820 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" podUID="7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 24 15:16:19.711625 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:19.711578 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" podUID="7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 24 15:16:29.711260 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:29.711163 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" podUID="7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 24 15:16:39.712797 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:39.712764 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" Apr 24 15:16:49.834638 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:49.834600 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f"] Apr 24 15:16:49.835136 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:49.834860 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" podUID="7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43" containerName="kserve-container" containerID="cri-o://0c1ff90bef223466fa9a9800f788a6f6738a1574f319116f5384f7368f83c32b" gracePeriod=30 Apr 24 15:16:49.915446 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:49.915415 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c"] Apr 24 15:16:49.915745 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:49.915733 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8dedb61f-56be-4ca0-afb9-17e91f11d64e" containerName="storage-initializer" Apr 24 15:16:49.915790 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:49.915746 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dedb61f-56be-4ca0-afb9-17e91f11d64e" containerName="storage-initializer" Apr 24 15:16:49.915790 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:49.915754 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8dedb61f-56be-4ca0-afb9-17e91f11d64e" containerName="kserve-container" Apr 24 15:16:49.915790 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:49.915760 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dedb61f-56be-4ca0-afb9-17e91f11d64e" containerName="kserve-container" Apr 24 15:16:49.915911 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:49.915814 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8dedb61f-56be-4ca0-afb9-17e91f11d64e" containerName="kserve-container" Apr 24 15:16:49.918880 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:49.918863 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" Apr 24 15:16:49.927444 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:49.927414 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c"] Apr 24 15:16:49.950550 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:49.950523 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5424b380-c619-4f37-9edf-6b4499f4addb-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c\" (UID: \"5424b380-c619-4f37-9edf-6b4499f4addb\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" Apr 24 15:16:50.051807 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:50.051776 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5424b380-c619-4f37-9edf-6b4499f4addb-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c\" (UID: \"5424b380-c619-4f37-9edf-6b4499f4addb\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" Apr 24 15:16:50.052181 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:50.052163 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5424b380-c619-4f37-9edf-6b4499f4addb-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c\" (UID: \"5424b380-c619-4f37-9edf-6b4499f4addb\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" Apr 24 15:16:50.230044 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:50.229995 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" Apr 24 15:16:50.383174 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:50.383124 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c"] Apr 24 15:16:50.387053 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:16:50.387026 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5424b380_c619_4f37_9edf_6b4499f4addb.slice/crio-99605acfbf85d50efbb127e36a5dacd3496ad5793b98f10173577206730ac99e WatchSource:0}: Error finding container 99605acfbf85d50efbb127e36a5dacd3496ad5793b98f10173577206730ac99e: Status 404 returned error can't find the container with id 99605acfbf85d50efbb127e36a5dacd3496ad5793b98f10173577206730ac99e Apr 24 15:16:50.388695 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:50.388680 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 15:16:50.916318 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:50.916279 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" event={"ID":"5424b380-c619-4f37-9edf-6b4499f4addb","Type":"ContainerStarted","Data":"fdd01fd251faf49046f72e43e33d3f10bd1c2b8b0c530456f052c75fb71e3b84"} Apr 24 15:16:50.916318 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:50.916321 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" event={"ID":"5424b380-c619-4f37-9edf-6b4499f4addb","Type":"ContainerStarted","Data":"99605acfbf85d50efbb127e36a5dacd3496ad5793b98f10173577206730ac99e"} Apr 24 15:16:53.669726 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:53.669702 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" Apr 24 15:16:53.677056 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:53.677036 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43-kserve-provision-location\") pod \"7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43\" (UID: \"7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43\") " Apr 24 15:16:53.677363 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:53.677344 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43" (UID: "7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:16:53.778323 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:53.778235 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:16:53.927092 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:53.927054 2569 generic.go:358] "Generic (PLEG): container finished" podID="7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43" containerID="0c1ff90bef223466fa9a9800f788a6f6738a1574f319116f5384f7368f83c32b" exitCode=0 Apr 24 15:16:53.927355 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:53.927136 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" Apr 24 15:16:53.927355 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:53.927139 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" event={"ID":"7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43","Type":"ContainerDied","Data":"0c1ff90bef223466fa9a9800f788a6f6738a1574f319116f5384f7368f83c32b"} Apr 24 15:16:53.927355 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:53.927179 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f" event={"ID":"7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43","Type":"ContainerDied","Data":"71c970a8d9da237ede4de58843a152619725cb7d40706b27cd899d5c5ed27814"} Apr 24 15:16:53.927355 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:53.927196 2569 scope.go:117] "RemoveContainer" containerID="0c1ff90bef223466fa9a9800f788a6f6738a1574f319116f5384f7368f83c32b" Apr 24 15:16:53.934884 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:53.934864 2569 scope.go:117] "RemoveContainer" containerID="97744b4179a82efae0d0193a5d5e00759002167b16670a2a2cadc903c8782479" Apr 24 15:16:53.942063 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:53.942044 2569 scope.go:117] "RemoveContainer" containerID="0c1ff90bef223466fa9a9800f788a6f6738a1574f319116f5384f7368f83c32b" Apr 24 15:16:53.942357 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:16:53.942327 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c1ff90bef223466fa9a9800f788a6f6738a1574f319116f5384f7368f83c32b\": container with ID starting with 0c1ff90bef223466fa9a9800f788a6f6738a1574f319116f5384f7368f83c32b not found: ID does not exist" containerID="0c1ff90bef223466fa9a9800f788a6f6738a1574f319116f5384f7368f83c32b" Apr 24 15:16:53.942428 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:53.942368 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c1ff90bef223466fa9a9800f788a6f6738a1574f319116f5384f7368f83c32b"} err="failed to get container status \"0c1ff90bef223466fa9a9800f788a6f6738a1574f319116f5384f7368f83c32b\": rpc error: code = NotFound desc = could not find container \"0c1ff90bef223466fa9a9800f788a6f6738a1574f319116f5384f7368f83c32b\": container with ID starting with 0c1ff90bef223466fa9a9800f788a6f6738a1574f319116f5384f7368f83c32b not found: ID does not exist" Apr 24 15:16:53.942428 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:53.942392 2569 scope.go:117] "RemoveContainer" containerID="97744b4179a82efae0d0193a5d5e00759002167b16670a2a2cadc903c8782479" Apr 24 15:16:53.942619 ip-10-0-129-231 kubenswrapper[2569]: E0424 
15:16:53.942603 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97744b4179a82efae0d0193a5d5e00759002167b16670a2a2cadc903c8782479\": container with ID starting with 97744b4179a82efae0d0193a5d5e00759002167b16670a2a2cadc903c8782479 not found: ID does not exist" containerID="97744b4179a82efae0d0193a5d5e00759002167b16670a2a2cadc903c8782479" Apr 24 15:16:53.942676 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:53.942625 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97744b4179a82efae0d0193a5d5e00759002167b16670a2a2cadc903c8782479"} err="failed to get container status \"97744b4179a82efae0d0193a5d5e00759002167b16670a2a2cadc903c8782479\": rpc error: code = NotFound desc = could not find container \"97744b4179a82efae0d0193a5d5e00759002167b16670a2a2cadc903c8782479\": container with ID starting with 97744b4179a82efae0d0193a5d5e00759002167b16670a2a2cadc903c8782479 not found: ID does not exist" Apr 24 15:16:53.942870 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:53.942850 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f"] Apr 24 15:16:53.944863 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:53.944841 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-qw45f"] Apr 24 15:16:54.931312 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:54.931273 2569 generic.go:358] "Generic (PLEG): container finished" podID="5424b380-c619-4f37-9edf-6b4499f4addb" containerID="fdd01fd251faf49046f72e43e33d3f10bd1c2b8b0c530456f052c75fb71e3b84" exitCode=0 Apr 24 15:16:54.931786 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:54.931351 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" event={"ID":"5424b380-c619-4f37-9edf-6b4499f4addb","Type":"ContainerDied","Data":"fdd01fd251faf49046f72e43e33d3f10bd1c2b8b0c530456f052c75fb71e3b84"} Apr 24 15:16:55.831384 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:55.831344 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43" path="/var/lib/kubelet/pods/7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43/volumes" Apr 24 15:16:55.936968 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:55.936933 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" event={"ID":"5424b380-c619-4f37-9edf-6b4499f4addb","Type":"ContainerStarted","Data":"26f37d846c496b951c7a0992f69314596f5441514da0a7b09a2fe99c49616500"} Apr 24 15:16:55.937453 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:55.937183 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" Apr 24 15:16:55.953477 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:16:55.953415 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" podStartSLOduration=6.953396733 podStartE2EDuration="6.953396733s" podCreationTimestamp="2026-04-24 15:16:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:16:55.951586205 +0000 UTC m=+3168.745643106" watchObservedRunningTime="2026-04-24 15:16:55.953396733 +0000 UTC 
m=+3168.747453637" Apr 24 15:17:26.941852 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:26.941805 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" podUID="5424b380-c619-4f37-9edf-6b4499f4addb" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.57:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.134.0.57:8080: connect: connection refused" Apr 24 15:17:36.944101 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:36.944070 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" Apr 24 15:17:39.999739 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:39.999704 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c"] Apr 24 15:17:40.000226 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:40.000055 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" podUID="5424b380-c619-4f37-9edf-6b4499f4addb" containerName="kserve-container" containerID="cri-o://26f37d846c496b951c7a0992f69314596f5441514da0a7b09a2fe99c49616500" gracePeriod=30 Apr 24 15:17:40.039826 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:40.039788 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75"] Apr 24 15:17:40.040133 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:40.040121 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43" containerName="kserve-container" Apr 24 15:17:40.040181 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:40.040135 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43" containerName="kserve-container" Apr 24 15:17:40.040181 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:40.040156 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43" containerName="storage-initializer" Apr 24 15:17:40.040181 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:40.040163 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43" containerName="storage-initializer" Apr 24 15:17:40.040274 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:40.040210 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e1ad16b-9f6c-4e23-a8e3-d214fac8ac43" containerName="kserve-container" Apr 24 15:17:40.043410 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:40.043390 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" Apr 24 15:17:40.052783 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:40.052756 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75"] Apr 24 15:17:40.163109 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:40.163060 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-fxj75\" (UID: \"f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" Apr 24 15:17:40.263703 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:40.263595 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-fxj75\" (UID: \"f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" Apr 24 15:17:40.264025 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:40.264003 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-fxj75\" (UID: \"f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" Apr 24 15:17:40.354491 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:40.354441 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" Apr 24 15:17:40.474477 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:40.474446 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75"] Apr 24 15:17:40.476875 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:17:40.476845 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8fc4d60_8de5_4937_a4ee_5e5ad1df01f3.slice/crio-0b1699d4639367257d4f2bd1bac14c17620ccf7e918ec60e5369fb88a0e90c53 WatchSource:0}: Error finding container 0b1699d4639367257d4f2bd1bac14c17620ccf7e918ec60e5369fb88a0e90c53: Status 404 returned error can't find the container with id 0b1699d4639367257d4f2bd1bac14c17620ccf7e918ec60e5369fb88a0e90c53 Apr 24 15:17:41.076210 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:41.076175 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" event={"ID":"f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3","Type":"ContainerStarted","Data":"2ae292224a5dc38d44e0153f90678862b7bd89be8e0d81fc85576c607cbf1778"} Apr 24 15:17:41.076210 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:41.076212 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" event={"ID":"f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3","Type":"ContainerStarted","Data":"0b1699d4639367257d4f2bd1bac14c17620ccf7e918ec60e5369fb88a0e90c53"} Apr 24 15:17:44.086257 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:44.086220 2569 generic.go:358] "Generic (PLEG): container finished" podID="f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3" containerID="2ae292224a5dc38d44e0153f90678862b7bd89be8e0d81fc85576c607cbf1778" exitCode=0 Apr 24 15:17:44.086652 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:44.086274 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" event={"ID":"f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3","Type":"ContainerDied","Data":"2ae292224a5dc38d44e0153f90678862b7bd89be8e0d81fc85576c607cbf1778"} Apr 24 15:17:45.090760 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:45.090727 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" event={"ID":"f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3","Type":"ContainerStarted","Data":"56bf75fcba18a3cb1666cea7f65c9e4215be4d5691b75e089210e40a3d8da4ad"} Apr 24 15:17:45.091219 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:45.090932 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" Apr 24 15:17:45.106742 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:45.106693 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" podStartSLOduration=5.106679654 podStartE2EDuration="5.106679654s" podCreationTimestamp="2026-04-24 15:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:17:45.105637317 +0000 UTC m=+3217.899694220" watchObservedRunningTime="2026-04-24 15:17:45.106679654 +0000 UTC m=+3217.900736555" Apr 24 15:17:46.939984 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:46.939960 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" Apr 24 15:17:47.098561 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:47.098472 2569 generic.go:358] "Generic (PLEG): container finished" podID="5424b380-c619-4f37-9edf-6b4499f4addb" containerID="26f37d846c496b951c7a0992f69314596f5441514da0a7b09a2fe99c49616500" exitCode=0 Apr 24 15:17:47.098561 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:47.098543 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" Apr 24 15:17:47.098774 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:47.098555 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" event={"ID":"5424b380-c619-4f37-9edf-6b4499f4addb","Type":"ContainerDied","Data":"26f37d846c496b951c7a0992f69314596f5441514da0a7b09a2fe99c49616500"} Apr 24 15:17:47.098774 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:47.098592 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c" event={"ID":"5424b380-c619-4f37-9edf-6b4499f4addb","Type":"ContainerDied","Data":"99605acfbf85d50efbb127e36a5dacd3496ad5793b98f10173577206730ac99e"} Apr 24 15:17:47.098774 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:47.098608 2569 scope.go:117] "RemoveContainer" containerID="26f37d846c496b951c7a0992f69314596f5441514da0a7b09a2fe99c49616500" Apr 24 15:17:47.106600 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:47.106568 2569 scope.go:117] "RemoveContainer" containerID="fdd01fd251faf49046f72e43e33d3f10bd1c2b8b0c530456f052c75fb71e3b84" Apr 24 15:17:47.113490 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:47.113469 2569 scope.go:117] "RemoveContainer" containerID="26f37d846c496b951c7a0992f69314596f5441514da0a7b09a2fe99c49616500" Apr 24 15:17:47.113750 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:17:47.113733 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26f37d846c496b951c7a0992f69314596f5441514da0a7b09a2fe99c49616500\": container with ID starting with 26f37d846c496b951c7a0992f69314596f5441514da0a7b09a2fe99c49616500 not found: ID does not exist" containerID="26f37d846c496b951c7a0992f69314596f5441514da0a7b09a2fe99c49616500" Apr 24 15:17:47.113750 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:47.113758 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26f37d846c496b951c7a0992f69314596f5441514da0a7b09a2fe99c49616500"} err="failed to get container status \"26f37d846c496b951c7a0992f69314596f5441514da0a7b09a2fe99c49616500\": rpc error: code = NotFound desc = could not find container \"26f37d846c496b951c7a0992f69314596f5441514da0a7b09a2fe99c49616500\": container with ID starting with 26f37d846c496b951c7a0992f69314596f5441514da0a7b09a2fe99c49616500 not found: ID does not exist" Apr 24 15:17:47.113750 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:47.113779 2569 scope.go:117] "RemoveContainer" containerID="fdd01fd251faf49046f72e43e33d3f10bd1c2b8b0c530456f052c75fb71e3b84" Apr 24 15:17:47.114087 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:17:47.114049 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdd01fd251faf49046f72e43e33d3f10bd1c2b8b0c530456f052c75fb71e3b84\": container with ID starting with 
fdd01fd251faf49046f72e43e33d3f10bd1c2b8b0c530456f052c75fb71e3b84 not found: ID does not exist" containerID="fdd01fd251faf49046f72e43e33d3f10bd1c2b8b0c530456f052c75fb71e3b84" Apr 24 15:17:47.114087 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:47.114077 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdd01fd251faf49046f72e43e33d3f10bd1c2b8b0c530456f052c75fb71e3b84"} err="failed to get container status \"fdd01fd251faf49046f72e43e33d3f10bd1c2b8b0c530456f052c75fb71e3b84\": rpc error: code = NotFound desc = could not find container \"fdd01fd251faf49046f72e43e33d3f10bd1c2b8b0c530456f052c75fb71e3b84\": container with ID starting with fdd01fd251faf49046f72e43e33d3f10bd1c2b8b0c530456f052c75fb71e3b84 not found: ID does not exist" Apr 24 15:17:47.127057 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:47.127028 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5424b380-c619-4f37-9edf-6b4499f4addb-kserve-provision-location\") pod \"5424b380-c619-4f37-9edf-6b4499f4addb\" (UID: \"5424b380-c619-4f37-9edf-6b4499f4addb\") " Apr 24 15:17:47.127386 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:47.127360 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5424b380-c619-4f37-9edf-6b4499f4addb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5424b380-c619-4f37-9edf-6b4499f4addb" (UID: "5424b380-c619-4f37-9edf-6b4499f4addb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:17:47.228144 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:47.228102 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5424b380-c619-4f37-9edf-6b4499f4addb-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:17:47.423681 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:47.423644 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c"] Apr 24 15:17:47.427078 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:47.427053 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-xqb9c"] Apr 24 15:17:47.827174 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:17:47.827081 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5424b380-c619-4f37-9edf-6b4499f4addb" path="/var/lib/kubelet/pods/5424b380-c619-4f37-9edf-6b4499f4addb/volumes" Apr 24 15:18:16.174430 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:16.174389 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" Apr 24 15:18:20.338760 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:20.338721 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75"] Apr 24 15:18:20.339291 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:20.339080 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" podUID="f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3" containerName="kserve-container" containerID="cri-o://56bf75fcba18a3cb1666cea7f65c9e4215be4d5691b75e089210e40a3d8da4ad" gracePeriod=30 Apr 24 
15:18:20.422346 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:20.422310 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt"] Apr 24 15:18:20.422671 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:20.422659 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5424b380-c619-4f37-9edf-6b4499f4addb" containerName="storage-initializer" Apr 24 15:18:20.422713 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:20.422672 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5424b380-c619-4f37-9edf-6b4499f4addb" containerName="storage-initializer" Apr 24 15:18:20.422713 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:20.422680 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5424b380-c619-4f37-9edf-6b4499f4addb" containerName="kserve-container" Apr 24 15:18:20.422713 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:20.422686 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5424b380-c619-4f37-9edf-6b4499f4addb" containerName="kserve-container" Apr 24 15:18:20.422811 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:20.422736 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5424b380-c619-4f37-9edf-6b4499f4addb" containerName="kserve-container" Apr 24 15:18:20.425769 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:20.425747 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" Apr 24 15:18:20.435097 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:20.435073 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt"] Apr 24 15:18:20.501775 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:20.501741 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae51398d-9ec7-45a8-b17b-319cf560683d-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-59pgt\" (UID: \"ae51398d-9ec7-45a8-b17b-319cf560683d\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" Apr 24 15:18:20.602455 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:20.602350 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae51398d-9ec7-45a8-b17b-319cf560683d-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-59pgt\" (UID: \"ae51398d-9ec7-45a8-b17b-319cf560683d\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" Apr 24 15:18:20.602748 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:20.602726 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae51398d-9ec7-45a8-b17b-319cf560683d-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-59pgt\" (UID: \"ae51398d-9ec7-45a8-b17b-319cf560683d\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" Apr 24 15:18:20.735937 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:20.735888 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" Apr 24 15:18:20.857932 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:20.857904 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt"] Apr 24 15:18:20.860430 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:18:20.860400 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae51398d_9ec7_45a8_b17b_319cf560683d.slice/crio-ecbeafe3a86e1789dcf0b741fe71b3d518aee09b833434500fc419114409b8eb WatchSource:0}: Error finding container ecbeafe3a86e1789dcf0b741fe71b3d518aee09b833434500fc419114409b8eb: Status 404 returned error can't find the container with id ecbeafe3a86e1789dcf0b741fe71b3d518aee09b833434500fc419114409b8eb Apr 24 15:18:21.197860 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:21.197821 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" event={"ID":"ae51398d-9ec7-45a8-b17b-319cf560683d","Type":"ContainerStarted","Data":"03fc9ca44bbefa3bc36d53c5f80e777d3dcd6bd1d7a427dd4ffdf6e7ef850bf1"} Apr 24 15:18:21.197860 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:21.197862 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" event={"ID":"ae51398d-9ec7-45a8-b17b-319cf560683d","Type":"ContainerStarted","Data":"ecbeafe3a86e1789dcf0b741fe71b3d518aee09b833434500fc419114409b8eb"} Apr 24 15:18:25.210574 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:25.210540 2569 generic.go:358] "Generic (PLEG): container finished" podID="ae51398d-9ec7-45a8-b17b-319cf560683d" containerID="03fc9ca44bbefa3bc36d53c5f80e777d3dcd6bd1d7a427dd4ffdf6e7ef850bf1" exitCode=0 Apr 24 15:18:25.210986 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:25.210610 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" event={"ID":"ae51398d-9ec7-45a8-b17b-319cf560683d","Type":"ContainerDied","Data":"03fc9ca44bbefa3bc36d53c5f80e777d3dcd6bd1d7a427dd4ffdf6e7ef850bf1"} Apr 24 15:18:26.095514 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:26.095464 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" podUID="f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.58:8080/v2/models/xgboost-v2-mlserver/ready\": dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 15:18:26.215101 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:26.215065 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" event={"ID":"ae51398d-9ec7-45a8-b17b-319cf560683d","Type":"ContainerStarted","Data":"2328aca4bada7f118b937bd52b119363398dbe6d3250c4176297ad0b872632ed"} Apr 24 15:18:26.215527 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:26.215395 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" Apr 24 15:18:26.216868 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:26.216837 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" podUID="ae51398d-9ec7-45a8-b17b-319cf560683d" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 24 15:18:26.228719 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:26.228668 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" podStartSLOduration=6.228653187 podStartE2EDuration="6.228653187s" podCreationTimestamp="2026-04-24 15:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:18:26.228625585 +0000 UTC m=+3259.022682498" watchObservedRunningTime="2026-04-24 15:18:26.228653187 +0000 UTC m=+3259.022710090" Apr 24 15:18:27.219507 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:27.219472 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" podUID="ae51398d-9ec7-45a8-b17b-319cf560683d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 24 15:18:28.383043 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:28.383016 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" Apr 24 15:18:28.472831 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:28.472736 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3-kserve-provision-location\") pod \"f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3\" (UID: \"f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3\") " Apr 24 15:18:28.473208 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:28.473176 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3" (UID: "f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:18:28.573799 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:28.573757 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:18:29.226728 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:29.226696 2569 generic.go:358] "Generic (PLEG): container finished" podID="f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3" containerID="56bf75fcba18a3cb1666cea7f65c9e4215be4d5691b75e089210e40a3d8da4ad" exitCode=0 Apr 24 15:18:29.226935 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:29.226771 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" Apr 24 15:18:29.226935 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:29.226797 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" event={"ID":"f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3","Type":"ContainerDied","Data":"56bf75fcba18a3cb1666cea7f65c9e4215be4d5691b75e089210e40a3d8da4ad"} Apr 24 15:18:29.226935 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:29.226835 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75" event={"ID":"f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3","Type":"ContainerDied","Data":"0b1699d4639367257d4f2bd1bac14c17620ccf7e918ec60e5369fb88a0e90c53"} Apr 24 15:18:29.226935 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:29.226852 2569 scope.go:117] "RemoveContainer" containerID="56bf75fcba18a3cb1666cea7f65c9e4215be4d5691b75e089210e40a3d8da4ad" Apr 24 15:18:29.235607 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:29.235588 2569 scope.go:117] "RemoveContainer" containerID="2ae292224a5dc38d44e0153f90678862b7bd89be8e0d81fc85576c607cbf1778" Apr 24 15:18:29.244681 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:29.244655 2569 scope.go:117] "RemoveContainer" containerID="56bf75fcba18a3cb1666cea7f65c9e4215be4d5691b75e089210e40a3d8da4ad" Apr 24 15:18:29.245380 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:18:29.245294 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56bf75fcba18a3cb1666cea7f65c9e4215be4d5691b75e089210e40a3d8da4ad\": container with ID starting with 56bf75fcba18a3cb1666cea7f65c9e4215be4d5691b75e089210e40a3d8da4ad not found: ID does not exist" containerID="56bf75fcba18a3cb1666cea7f65c9e4215be4d5691b75e089210e40a3d8da4ad" Apr 24 15:18:29.245563 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:29.245534 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56bf75fcba18a3cb1666cea7f65c9e4215be4d5691b75e089210e40a3d8da4ad"} err="failed to get container status \"56bf75fcba18a3cb1666cea7f65c9e4215be4d5691b75e089210e40a3d8da4ad\": rpc error: code = NotFound desc = could not find container \"56bf75fcba18a3cb1666cea7f65c9e4215be4d5691b75e089210e40a3d8da4ad\": container with ID starting with 56bf75fcba18a3cb1666cea7f65c9e4215be4d5691b75e089210e40a3d8da4ad not found: ID does not exist" Apr 24 15:18:29.245684 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:29.245650 2569 scope.go:117] "RemoveContainer" containerID="2ae292224a5dc38d44e0153f90678862b7bd89be8e0d81fc85576c607cbf1778" Apr 24 15:18:29.246028 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:18:29.246000 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ae292224a5dc38d44e0153f90678862b7bd89be8e0d81fc85576c607cbf1778\": container with ID starting with 2ae292224a5dc38d44e0153f90678862b7bd89be8e0d81fc85576c607cbf1778 not found: ID does not exist" containerID="2ae292224a5dc38d44e0153f90678862b7bd89be8e0d81fc85576c607cbf1778" Apr 24 15:18:29.246157 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:29.246037 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae292224a5dc38d44e0153f90678862b7bd89be8e0d81fc85576c607cbf1778"} err="failed to get container status \"2ae292224a5dc38d44e0153f90678862b7bd89be8e0d81fc85576c607cbf1778\": rpc error: code = 
NotFound desc = could not find container \"2ae292224a5dc38d44e0153f90678862b7bd89be8e0d81fc85576c607cbf1778\": container with ID starting with 2ae292224a5dc38d44e0153f90678862b7bd89be8e0d81fc85576c607cbf1778 not found: ID does not exist" Apr 24 15:18:29.247242 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:29.247220 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75"] Apr 24 15:18:29.250113 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:29.250094 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-fxj75"] Apr 24 15:18:29.828080 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:29.828048 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3" path="/var/lib/kubelet/pods/f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3/volumes" Apr 24 15:18:37.219823 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:37.219776 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" podUID="ae51398d-9ec7-45a8-b17b-319cf560683d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 24 15:18:47.220246 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:47.220199 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" podUID="ae51398d-9ec7-45a8-b17b-319cf560683d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 24 15:18:57.219822 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:18:57.219775 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" podUID="ae51398d-9ec7-45a8-b17b-319cf560683d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 24 15:19:07.219630 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:07.219584 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" podUID="ae51398d-9ec7-45a8-b17b-319cf560683d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 24 15:19:17.220314 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:17.220266 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" podUID="ae51398d-9ec7-45a8-b17b-319cf560683d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 24 15:19:27.220163 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:27.220057 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" Apr 24 15:19:30.530281 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:30.530243 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw"] Apr 24 15:19:30.530720 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:30.530703 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3" containerName="storage-initializer" Apr 24 15:19:30.530794 ip-10-0-129-231 kubenswrapper[2569]: I0424 
15:19:30.530723 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3" containerName="storage-initializer" Apr 24 15:19:30.530794 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:30.530755 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3" containerName="kserve-container" Apr 24 15:19:30.530794 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:30.530764 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3" containerName="kserve-container" Apr 24 15:19:30.530916 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:30.530843 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8fc4d60-8de5-4937-a4ee-5e5ad1df01f3" containerName="kserve-container" Apr 24 15:19:30.534323 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:30.534300 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" Apr 24 15:19:30.539848 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:30.539814 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw"] Apr 24 15:19:30.549086 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:30.549057 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt"] Apr 24 15:19:30.549348 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:30.549308 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" podUID="ae51398d-9ec7-45a8-b17b-319cf560683d" containerName="kserve-container" containerID="cri-o://2328aca4bada7f118b937bd52b119363398dbe6d3250c4176297ad0b872632ed" gracePeriod=30 Apr 24 15:19:30.606739 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:30.606692 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/efe01912-1bac-4cde-aea7-67d20399427c-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw\" (UID: \"efe01912-1bac-4cde-aea7-67d20399427c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" Apr 24 15:19:30.707293 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:30.707238 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/efe01912-1bac-4cde-aea7-67d20399427c-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw\" (UID: \"efe01912-1bac-4cde-aea7-67d20399427c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" Apr 24 15:19:30.707633 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:30.707612 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/efe01912-1bac-4cde-aea7-67d20399427c-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw\" (UID: \"efe01912-1bac-4cde-aea7-67d20399427c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" Apr 24 15:19:30.845764 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:30.845662 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" Apr 24 15:19:30.968467 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:30.968442 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw"] Apr 24 15:19:30.971105 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:19:30.971068 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefe01912_1bac_4cde_aea7_67d20399427c.slice/crio-fac0c353e4cc0370bdad00eee8cdebebf583e6462b2e702f4e5121e21c306ce4 WatchSource:0}: Error finding container fac0c353e4cc0370bdad00eee8cdebebf583e6462b2e702f4e5121e21c306ce4: Status 404 returned error can't find the container with id fac0c353e4cc0370bdad00eee8cdebebf583e6462b2e702f4e5121e21c306ce4 Apr 24 15:19:31.415119 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:31.415081 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" event={"ID":"efe01912-1bac-4cde-aea7-67d20399427c","Type":"ContainerStarted","Data":"85b4e49a85b77b1570fdf1f566627f220c388eba6aef9e2983cc828623208140"} Apr 24 15:19:31.415119 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:31.415123 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" event={"ID":"efe01912-1bac-4cde-aea7-67d20399427c","Type":"ContainerStarted","Data":"fac0c353e4cc0370bdad00eee8cdebebf583e6462b2e702f4e5121e21c306ce4"} Apr 24 15:19:34.426455 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:34.426420 2569 generic.go:358] "Generic (PLEG): container finished" podID="ae51398d-9ec7-45a8-b17b-319cf560683d" containerID="2328aca4bada7f118b937bd52b119363398dbe6d3250c4176297ad0b872632ed" exitCode=0 Apr 24 15:19:34.426805 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:34.426463 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" event={"ID":"ae51398d-9ec7-45a8-b17b-319cf560683d","Type":"ContainerDied","Data":"2328aca4bada7f118b937bd52b119363398dbe6d3250c4176297ad0b872632ed"} Apr 24 15:19:34.489064 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:34.489041 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" Apr 24 15:19:34.539916 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:34.539802 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae51398d-9ec7-45a8-b17b-319cf560683d-kserve-provision-location\") pod \"ae51398d-9ec7-45a8-b17b-319cf560683d\" (UID: \"ae51398d-9ec7-45a8-b17b-319cf560683d\") " Apr 24 15:19:34.540167 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:34.540145 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae51398d-9ec7-45a8-b17b-319cf560683d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ae51398d-9ec7-45a8-b17b-319cf560683d" (UID: "ae51398d-9ec7-45a8-b17b-319cf560683d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:19:34.641429 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:34.641387 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae51398d-9ec7-45a8-b17b-319cf560683d-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:19:35.430904 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:35.430868 2569 generic.go:358] "Generic (PLEG): container finished" podID="efe01912-1bac-4cde-aea7-67d20399427c" containerID="85b4e49a85b77b1570fdf1f566627f220c388eba6aef9e2983cc828623208140" exitCode=0 Apr 24 15:19:35.431353 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:35.430945 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" event={"ID":"efe01912-1bac-4cde-aea7-67d20399427c","Type":"ContainerDied","Data":"85b4e49a85b77b1570fdf1f566627f220c388eba6aef9e2983cc828623208140"} Apr 24 15:19:35.432407 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:35.432382 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" event={"ID":"ae51398d-9ec7-45a8-b17b-319cf560683d","Type":"ContainerDied","Data":"ecbeafe3a86e1789dcf0b741fe71b3d518aee09b833434500fc419114409b8eb"} Apr 24 15:19:35.432511 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:35.432414 2569 scope.go:117] "RemoveContainer" containerID="2328aca4bada7f118b937bd52b119363398dbe6d3250c4176297ad0b872632ed" Apr 24 15:19:35.432511 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:35.432432 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt" Apr 24 15:19:35.440464 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:35.440445 2569 scope.go:117] "RemoveContainer" containerID="03fc9ca44bbefa3bc36d53c5f80e777d3dcd6bd1d7a427dd4ffdf6e7ef850bf1" Apr 24 15:19:35.457158 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:35.457126 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt"] Apr 24 15:19:35.459443 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:35.459413 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-59pgt"] Apr 24 15:19:35.827284 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:35.827201 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae51398d-9ec7-45a8-b17b-319cf560683d" path="/var/lib/kubelet/pods/ae51398d-9ec7-45a8-b17b-319cf560683d/volumes" Apr 24 15:19:36.437585 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:36.437552 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" event={"ID":"efe01912-1bac-4cde-aea7-67d20399427c","Type":"ContainerStarted","Data":"1736da535ad4016681d651f476d114e2f107fd47e5fc6ba97bd6668ce0b930fd"} Apr 24 15:19:36.438068 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:36.437796 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" Apr 24 15:19:36.453740 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:36.453682 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" 
podStartSLOduration=6.453659265 podStartE2EDuration="6.453659265s" podCreationTimestamp="2026-04-24 15:19:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:19:36.453441744 +0000 UTC m=+3329.247498657" watchObservedRunningTime="2026-04-24 15:19:36.453659265 +0000 UTC m=+3329.247716165" Apr 24 15:19:47.314198 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:47.314157 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 15:19:47.316477 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:19:47.316451 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 15:20:07.474469 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:07.474408 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" podUID="efe01912-1bac-4cde-aea7-67d20399427c" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 24 15:20:17.473574 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:17.473522 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" podUID="efe01912-1bac-4cde-aea7-67d20399427c" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 24 15:20:27.444649 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:27.444608 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" Apr 24 15:20:30.629539 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:30.629506 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw"] Apr 24 15:20:30.630043 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:30.629836 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" podUID="efe01912-1bac-4cde-aea7-67d20399427c" containerName="kserve-container" containerID="cri-o://1736da535ad4016681d651f476d114e2f107fd47e5fc6ba97bd6668ce0b930fd" gracePeriod=30 Apr 24 15:20:30.688915 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:30.688854 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm"] Apr 24 15:20:30.689247 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:30.689232 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae51398d-9ec7-45a8-b17b-319cf560683d" containerName="storage-initializer" Apr 24 15:20:30.689294 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:30.689249 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae51398d-9ec7-45a8-b17b-319cf560683d" containerName="storage-initializer" Apr 24 15:20:30.689294 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:30.689259 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae51398d-9ec7-45a8-b17b-319cf560683d" containerName="kserve-container" Apr 24 15:20:30.689294 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:30.689265 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae51398d-9ec7-45a8-b17b-319cf560683d" 
containerName="kserve-container" Apr 24 15:20:30.689387 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:30.689313 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae51398d-9ec7-45a8-b17b-319cf560683d" containerName="kserve-container" Apr 24 15:20:30.692161 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:30.692146 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" Apr 24 15:20:30.700986 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:30.700950 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm"] Apr 24 15:20:30.809292 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:30.809244 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1a820c62-fd08-497d-a118-39f591f5d84b-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-slpsm\" (UID: \"1a820c62-fd08-497d-a118-39f591f5d84b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" Apr 24 15:20:30.909887 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:30.909800 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1a820c62-fd08-497d-a118-39f591f5d84b-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-slpsm\" (UID: \"1a820c62-fd08-497d-a118-39f591f5d84b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" Apr 24 15:20:30.910247 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:30.910227 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1a820c62-fd08-497d-a118-39f591f5d84b-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-slpsm\" (UID: \"1a820c62-fd08-497d-a118-39f591f5d84b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" Apr 24 15:20:31.009348 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:31.009314 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" Apr 24 15:20:31.161563 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:31.161536 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm"] Apr 24 15:20:31.164305 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:20:31.164274 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a820c62_fd08_497d_a118_39f591f5d84b.slice/crio-bdcef8ccae1a8f4d0df6c47e3e81c1a68f50f229a051bddf4dad36a084e0cfbc WatchSource:0}: Error finding container bdcef8ccae1a8f4d0df6c47e3e81c1a68f50f229a051bddf4dad36a084e0cfbc: Status 404 returned error can't find the container with id bdcef8ccae1a8f4d0df6c47e3e81c1a68f50f229a051bddf4dad36a084e0cfbc Apr 24 15:20:31.606058 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:31.605950 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" event={"ID":"1a820c62-fd08-497d-a118-39f591f5d84b","Type":"ContainerStarted","Data":"b2f80d2776ac061dda405fa6b95b744a785d5f45ee8120130044188520529aa2"} Apr 24 15:20:31.606058 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:31.605994 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" event={"ID":"1a820c62-fd08-497d-a118-39f591f5d84b","Type":"ContainerStarted","Data":"bdcef8ccae1a8f4d0df6c47e3e81c1a68f50f229a051bddf4dad36a084e0cfbc"} Apr 24 15:20:35.621813 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:35.621778 2569 generic.go:358] "Generic (PLEG): container finished" podID="1a820c62-fd08-497d-a118-39f591f5d84b" containerID="b2f80d2776ac061dda405fa6b95b744a785d5f45ee8120130044188520529aa2" exitCode=0 Apr 24 15:20:35.622219 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:35.621847 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" event={"ID":"1a820c62-fd08-497d-a118-39f591f5d84b","Type":"ContainerDied","Data":"b2f80d2776ac061dda405fa6b95b744a785d5f45ee8120130044188520529aa2"} Apr 24 15:20:36.626240 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:36.626206 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" event={"ID":"1a820c62-fd08-497d-a118-39f591f5d84b","Type":"ContainerStarted","Data":"30c4c2572df4afba46e2b9effc0097eb6038ce3e99d4c097c6ab6aa2f87f852f"} Apr 24 15:20:36.626653 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:36.626490 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" Apr 24 15:20:36.627860 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:36.627834 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" podUID="1a820c62-fd08-497d-a118-39f591f5d84b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 24 15:20:36.642361 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:36.642308 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" podStartSLOduration=6.642290313 podStartE2EDuration="6.642290313s" podCreationTimestamp="2026-04-24 15:20:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:20:36.640577327 +0000 UTC m=+3389.434634227" watchObservedRunningTime="2026-04-24 15:20:36.642290313 +0000 UTC m=+3389.436347217" Apr 24 15:20:37.442717 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:37.442674 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" podUID="efe01912-1bac-4cde-aea7-67d20399427c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.60:8080/v2/models/isvc-xgboost-v2-runtime/ready\": dial tcp 10.134.0.60:8080: connect: connection refused" Apr 24 15:20:37.629172 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:37.629130 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" podUID="1a820c62-fd08-497d-a118-39f591f5d84b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 24 15:20:38.633371 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:38.633334 2569 generic.go:358] "Generic (PLEG): container finished" podID="efe01912-1bac-4cde-aea7-67d20399427c" containerID="1736da535ad4016681d651f476d114e2f107fd47e5fc6ba97bd6668ce0b930fd" exitCode=0 Apr 24 15:20:38.633866 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:38.633411 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" event={"ID":"efe01912-1bac-4cde-aea7-67d20399427c","Type":"ContainerDied","Data":"1736da535ad4016681d651f476d114e2f107fd47e5fc6ba97bd6668ce0b930fd"} Apr 24 15:20:38.669696 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:38.669672 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" Apr 24 15:20:38.780884 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:38.780787 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/efe01912-1bac-4cde-aea7-67d20399427c-kserve-provision-location\") pod \"efe01912-1bac-4cde-aea7-67d20399427c\" (UID: \"efe01912-1bac-4cde-aea7-67d20399427c\") " Apr 24 15:20:38.781198 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:38.781170 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efe01912-1bac-4cde-aea7-67d20399427c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "efe01912-1bac-4cde-aea7-67d20399427c" (UID: "efe01912-1bac-4cde-aea7-67d20399427c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:20:38.882200 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:38.882147 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/efe01912-1bac-4cde-aea7-67d20399427c-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:20:39.638005 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:39.637972 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" Apr 24 15:20:39.638428 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:39.637976 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw" event={"ID":"efe01912-1bac-4cde-aea7-67d20399427c","Type":"ContainerDied","Data":"fac0c353e4cc0370bdad00eee8cdebebf583e6462b2e702f4e5121e21c306ce4"} Apr 24 15:20:39.638428 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:39.638090 2569 scope.go:117] "RemoveContainer" containerID="1736da535ad4016681d651f476d114e2f107fd47e5fc6ba97bd6668ce0b930fd" Apr 24 15:20:39.646290 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:39.646271 2569 scope.go:117] "RemoveContainer" containerID="85b4e49a85b77b1570fdf1f566627f220c388eba6aef9e2983cc828623208140" Apr 24 15:20:39.657778 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:39.657752 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw"] Apr 24 15:20:39.662598 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:39.662575 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-mqffw"] Apr 24 15:20:39.827505 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:39.827476 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efe01912-1bac-4cde-aea7-67d20399427c" path="/var/lib/kubelet/pods/efe01912-1bac-4cde-aea7-67d20399427c/volumes" Apr 24 15:20:47.629419 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:47.629382 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" podUID="1a820c62-fd08-497d-a118-39f591f5d84b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 24 15:20:57.629690 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:20:57.629593 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" podUID="1a820c62-fd08-497d-a118-39f591f5d84b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 24 15:21:07.629149 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:07.629100 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" podUID="1a820c62-fd08-497d-a118-39f591f5d84b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 24 15:21:17.629773 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:17.629721 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" podUID="1a820c62-fd08-497d-a118-39f591f5d84b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 24 15:21:27.629152 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:27.629109 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" podUID="1a820c62-fd08-497d-a118-39f591f5d84b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 24 15:21:37.630083 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:37.630046 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" Apr 24 15:21:40.799353 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:40.799315 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm"] Apr 24 15:21:40.799837 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:40.799651 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" podUID="1a820c62-fd08-497d-a118-39f591f5d84b" containerName="kserve-container" containerID="cri-o://30c4c2572df4afba46e2b9effc0097eb6038ce3e99d4c097c6ab6aa2f87f852f" gracePeriod=30 Apr 24 15:21:40.832492 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:40.832456 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh"] Apr 24 15:21:40.832793 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:40.832781 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="efe01912-1bac-4cde-aea7-67d20399427c" containerName="storage-initializer" Apr 24 15:21:40.832838 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:40.832795 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="efe01912-1bac-4cde-aea7-67d20399427c" containerName="storage-initializer" Apr 24 15:21:40.832838 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:40.832804 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="efe01912-1bac-4cde-aea7-67d20399427c" containerName="kserve-container" Apr 24 15:21:40.832838 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:40.832810 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="efe01912-1bac-4cde-aea7-67d20399427c" containerName="kserve-container" Apr 24 15:21:40.832968 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:40.832867 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="efe01912-1bac-4cde-aea7-67d20399427c" containerName="kserve-container" Apr 24 15:21:40.835851 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:40.835833 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" Apr 24 15:21:40.838067 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:40.838044 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 24 15:21:40.850458 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:40.850431 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh"] Apr 24 15:21:40.927676 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:40.927634 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/386ea3ca-da1e-4bfb-a044-af1efdedac22-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-674d4f474d-q8xvh\" (UID: \"386ea3ca-da1e-4bfb-a044-af1efdedac22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" Apr 24 15:21:41.028608 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:41.028561 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/386ea3ca-da1e-4bfb-a044-af1efdedac22-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-674d4f474d-q8xvh\" (UID: \"386ea3ca-da1e-4bfb-a044-af1efdedac22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" Apr 24 15:21:41.029019 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:41.028998 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/386ea3ca-da1e-4bfb-a044-af1efdedac22-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-674d4f474d-q8xvh\" (UID: \"386ea3ca-da1e-4bfb-a044-af1efdedac22\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" Apr 24 15:21:41.146611 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:41.146528 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" Apr 24 15:21:41.272591 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:41.272513 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh"] Apr 24 15:21:41.274985 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:21:41.274942 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod386ea3ca_da1e_4bfb_a044_af1efdedac22.slice/crio-50f06fece39d4c2f3a6cb1b4048ddb27c5fb50bcd5e95c3fa6663c3de22614f1 WatchSource:0}: Error finding container 50f06fece39d4c2f3a6cb1b4048ddb27c5fb50bcd5e95c3fa6663c3de22614f1: Status 404 returned error can't find the container with id 50f06fece39d4c2f3a6cb1b4048ddb27c5fb50bcd5e95c3fa6663c3de22614f1 Apr 24 15:21:41.821457 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:41.821418 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" event={"ID":"386ea3ca-da1e-4bfb-a044-af1efdedac22","Type":"ContainerStarted","Data":"cbf59f25343f1b54a284f8bf21a08ac559cf3e191bd00879da5ce7d3c9702223"} Apr 24 15:21:41.821457 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:41.821455 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" event={"ID":"386ea3ca-da1e-4bfb-a044-af1efdedac22","Type":"ContainerStarted","Data":"50f06fece39d4c2f3a6cb1b4048ddb27c5fb50bcd5e95c3fa6663c3de22614f1"} Apr 24 15:21:42.826663 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:42.826625 2569 generic.go:358] "Generic (PLEG): container finished" podID="386ea3ca-da1e-4bfb-a044-af1efdedac22" containerID="cbf59f25343f1b54a284f8bf21a08ac559cf3e191bd00879da5ce7d3c9702223" exitCode=0 Apr 24 15:21:42.826663 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:42.826667 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" event={"ID":"386ea3ca-da1e-4bfb-a044-af1efdedac22","Type":"ContainerDied","Data":"cbf59f25343f1b54a284f8bf21a08ac559cf3e191bd00879da5ce7d3c9702223"} Apr 24 15:21:43.830438 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:43.830404 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" event={"ID":"386ea3ca-da1e-4bfb-a044-af1efdedac22","Type":"ContainerStarted","Data":"c7f0924f9f0b31d4e72756b7d37b857f39335a43906324b57c643bff25a1f560"} Apr 24 15:21:43.830850 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:43.830636 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" Apr 24 15:21:43.831916 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:43.831872 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" podUID="386ea3ca-da1e-4bfb-a044-af1efdedac22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 24 15:21:43.845777 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:43.845730 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" podStartSLOduration=3.8457111409999998 podStartE2EDuration="3.845711141s" podCreationTimestamp="2026-04-24 15:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:21:43.844126191 +0000 UTC m=+3456.638183091" watchObservedRunningTime="2026-04-24 15:21:43.845711141 +0000 UTC m=+3456.639768041" Apr 24 15:21:44.637889 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:44.637865 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" Apr 24 15:21:44.761118 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:44.761031 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1a820c62-fd08-497d-a118-39f591f5d84b-kserve-provision-location\") pod \"1a820c62-fd08-497d-a118-39f591f5d84b\" (UID: \"1a820c62-fd08-497d-a118-39f591f5d84b\") " Apr 24 15:21:44.761396 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:44.761371 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a820c62-fd08-497d-a118-39f591f5d84b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1a820c62-fd08-497d-a118-39f591f5d84b" (UID: "1a820c62-fd08-497d-a118-39f591f5d84b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:21:44.835144 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:44.835112 2569 generic.go:358] "Generic (PLEG): container finished" podID="1a820c62-fd08-497d-a118-39f591f5d84b" containerID="30c4c2572df4afba46e2b9effc0097eb6038ce3e99d4c097c6ab6aa2f87f852f" exitCode=0 Apr 24 15:21:44.835487 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:44.835177 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" Apr 24 15:21:44.835487 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:44.835198 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" event={"ID":"1a820c62-fd08-497d-a118-39f591f5d84b","Type":"ContainerDied","Data":"30c4c2572df4afba46e2b9effc0097eb6038ce3e99d4c097c6ab6aa2f87f852f"} Apr 24 15:21:44.835487 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:44.835234 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm" event={"ID":"1a820c62-fd08-497d-a118-39f591f5d84b","Type":"ContainerDied","Data":"bdcef8ccae1a8f4d0df6c47e3e81c1a68f50f229a051bddf4dad36a084e0cfbc"} Apr 24 15:21:44.835487 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:44.835249 2569 scope.go:117] "RemoveContainer" containerID="30c4c2572df4afba46e2b9effc0097eb6038ce3e99d4c097c6ab6aa2f87f852f" Apr 24 15:21:44.835910 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:44.835859 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" podUID="386ea3ca-da1e-4bfb-a044-af1efdedac22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 24 15:21:44.843261 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:44.843244 2569 scope.go:117] "RemoveContainer" containerID="b2f80d2776ac061dda405fa6b95b744a785d5f45ee8120130044188520529aa2" Apr 24 15:21:44.850125 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:44.850110 2569 scope.go:117] "RemoveContainer" containerID="30c4c2572df4afba46e2b9effc0097eb6038ce3e99d4c097c6ab6aa2f87f852f" Apr 24 15:21:44.850346 
ip-10-0-129-231 kubenswrapper[2569]: E0424 15:21:44.850328 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30c4c2572df4afba46e2b9effc0097eb6038ce3e99d4c097c6ab6aa2f87f852f\": container with ID starting with 30c4c2572df4afba46e2b9effc0097eb6038ce3e99d4c097c6ab6aa2f87f852f not found: ID does not exist" containerID="30c4c2572df4afba46e2b9effc0097eb6038ce3e99d4c097c6ab6aa2f87f852f" Apr 24 15:21:44.850420 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:44.850358 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30c4c2572df4afba46e2b9effc0097eb6038ce3e99d4c097c6ab6aa2f87f852f"} err="failed to get container status \"30c4c2572df4afba46e2b9effc0097eb6038ce3e99d4c097c6ab6aa2f87f852f\": rpc error: code = NotFound desc = could not find container \"30c4c2572df4afba46e2b9effc0097eb6038ce3e99d4c097c6ab6aa2f87f852f\": container with ID starting with 30c4c2572df4afba46e2b9effc0097eb6038ce3e99d4c097c6ab6aa2f87f852f not found: ID does not exist" Apr 24 15:21:44.850420 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:44.850382 2569 scope.go:117] "RemoveContainer" containerID="b2f80d2776ac061dda405fa6b95b744a785d5f45ee8120130044188520529aa2" Apr 24 15:21:44.850604 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:21:44.850587 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2f80d2776ac061dda405fa6b95b744a785d5f45ee8120130044188520529aa2\": container with ID starting with b2f80d2776ac061dda405fa6b95b744a785d5f45ee8120130044188520529aa2 not found: ID does not exist" containerID="b2f80d2776ac061dda405fa6b95b744a785d5f45ee8120130044188520529aa2" Apr 24 15:21:44.850653 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:44.850609 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2f80d2776ac061dda405fa6b95b744a785d5f45ee8120130044188520529aa2"} err="failed to get container status \"b2f80d2776ac061dda405fa6b95b744a785d5f45ee8120130044188520529aa2\": rpc error: code = NotFound desc = could not find container \"b2f80d2776ac061dda405fa6b95b744a785d5f45ee8120130044188520529aa2\": container with ID starting with b2f80d2776ac061dda405fa6b95b744a785d5f45ee8120130044188520529aa2 not found: ID does not exist" Apr 24 15:21:44.854611 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:44.854588 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm"] Apr 24 15:21:44.857780 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:44.857759 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-slpsm"] Apr 24 15:21:44.862502 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:44.862472 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1a820c62-fd08-497d-a118-39f591f5d84b-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:21:45.828018 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:45.827982 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a820c62-fd08-497d-a118-39f591f5d84b" path="/var/lib/kubelet/pods/1a820c62-fd08-497d-a118-39f591f5d84b/volumes" Apr 24 15:21:54.835842 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:21:54.835799 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" podUID="386ea3ca-da1e-4bfb-a044-af1efdedac22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 24 15:22:04.835881 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:22:04.835838 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" podUID="386ea3ca-da1e-4bfb-a044-af1efdedac22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 24 15:22:14.836044 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:22:14.835990 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" podUID="386ea3ca-da1e-4bfb-a044-af1efdedac22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 24 15:22:24.836440 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:22:24.836387 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" podUID="386ea3ca-da1e-4bfb-a044-af1efdedac22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 24 15:22:34.836262 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:22:34.836169 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" podUID="386ea3ca-da1e-4bfb-a044-af1efdedac22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 24 15:22:44.836247 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:22:44.836200 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" podUID="386ea3ca-da1e-4bfb-a044-af1efdedac22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 24 15:22:54.836816 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:22:54.836777 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" Apr 24 15:23:00.962329 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:00.962293 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh"] Apr 24 15:23:00.962737 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:00.962637 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" podUID="386ea3ca-da1e-4bfb-a044-af1efdedac22" containerName="kserve-container" containerID="cri-o://c7f0924f9f0b31d4e72756b7d37b857f39335a43906324b57c643bff25a1f560" gracePeriod=30 Apr 24 15:23:01.059547 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.059510 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8"] Apr 24 15:23:01.059858 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.059840 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a820c62-fd08-497d-a118-39f591f5d84b" containerName="storage-initializer" Apr 24 15:23:01.059935 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.059858 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1a820c62-fd08-497d-a118-39f591f5d84b" containerName="storage-initializer" Apr 24 15:23:01.059935 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.059882 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a820c62-fd08-497d-a118-39f591f5d84b" containerName="kserve-container" Apr 24 15:23:01.059935 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.059888 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a820c62-fd08-497d-a118-39f591f5d84b" containerName="kserve-container" Apr 24 15:23:01.060055 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.059953 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a820c62-fd08-497d-a118-39f591f5d84b" containerName="kserve-container" Apr 24 15:23:01.062915 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.062872 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" Apr 24 15:23:01.064764 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.064743 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 15:23:01.069681 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.069651 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8"] Apr 24 15:23:01.108077 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.108045 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6a14e6ff-c81b-4423-b6fd-b390a2c8f81d-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8\" (UID: \"6a14e6ff-c81b-4423-b6fd-b390a2c8f81d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" Apr 24 15:23:01.108250 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.108105 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a14e6ff-c81b-4423-b6fd-b390a2c8f81d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8\" (UID: \"6a14e6ff-c81b-4423-b6fd-b390a2c8f81d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" Apr 24 15:23:01.209375 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.209328 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a14e6ff-c81b-4423-b6fd-b390a2c8f81d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8\" (UID: \"6a14e6ff-c81b-4423-b6fd-b390a2c8f81d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" Apr 24 15:23:01.209599 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.209419 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6a14e6ff-c81b-4423-b6fd-b390a2c8f81d-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8\" (UID: \"6a14e6ff-c81b-4423-b6fd-b390a2c8f81d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" Apr 24 15:23:01.209729 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.209704 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a14e6ff-c81b-4423-b6fd-b390a2c8f81d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8\" (UID: \"6a14e6ff-c81b-4423-b6fd-b390a2c8f81d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" Apr 24 15:23:01.210121 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.210099 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6a14e6ff-c81b-4423-b6fd-b390a2c8f81d-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8\" (UID: \"6a14e6ff-c81b-4423-b6fd-b390a2c8f81d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" Apr 24 15:23:01.374139 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.374034 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" Apr 24 15:23:01.498827 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.498790 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8"] Apr 24 15:23:01.501930 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:23:01.501885 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a14e6ff_c81b_4423_b6fd_b390a2c8f81d.slice/crio-cd3441d02de64acc90fde98214d127b2a04297956395a87989ce4b9516607ab6 WatchSource:0}: Error finding container cd3441d02de64acc90fde98214d127b2a04297956395a87989ce4b9516607ab6: Status 404 returned error can't find the container with id cd3441d02de64acc90fde98214d127b2a04297956395a87989ce4b9516607ab6 Apr 24 15:23:01.503841 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:01.503823 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 15:23:02.061505 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:02.061463 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" event={"ID":"6a14e6ff-c81b-4423-b6fd-b390a2c8f81d","Type":"ContainerStarted","Data":"2e9e9b13ae7c1bf2d21dc28ddee5406b5760721377346e659da655fc011da899"} Apr 24 15:23:02.061505 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:02.061508 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" event={"ID":"6a14e6ff-c81b-4423-b6fd-b390a2c8f81d","Type":"ContainerStarted","Data":"cd3441d02de64acc90fde98214d127b2a04297956395a87989ce4b9516607ab6"} Apr 24 15:23:03.065658 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:03.065622 2569 generic.go:358] "Generic (PLEG): container finished" podID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" containerID="2e9e9b13ae7c1bf2d21dc28ddee5406b5760721377346e659da655fc011da899" exitCode=0 Apr 24 15:23:03.066075 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:03.065707 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" event={"ID":"6a14e6ff-c81b-4423-b6fd-b390a2c8f81d","Type":"ContainerDied","Data":"2e9e9b13ae7c1bf2d21dc28ddee5406b5760721377346e659da655fc011da899"} Apr 24 15:23:04.070698 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:04.070665 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" event={"ID":"6a14e6ff-c81b-4423-b6fd-b390a2c8f81d","Type":"ContainerStarted","Data":"9d6370f71d2bb09e1d3bd607362ec6258ed5ca1feeded7783f4878f350b61ee3"} Apr 24 15:23:04.071159 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:04.070860 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" Apr 24 15:23:04.072140 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:04.072108 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" podUID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 24 15:23:04.086555 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:04.086505 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" podStartSLOduration=3.086489794 podStartE2EDuration="3.086489794s" podCreationTimestamp="2026-04-24 15:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:23:04.084518059 +0000 UTC m=+3536.878574962" watchObservedRunningTime="2026-04-24 15:23:04.086489794 +0000 UTC m=+3536.880546684" Apr 24 15:23:04.836361 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:04.836315 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" podUID="386ea3ca-da1e-4bfb-a044-af1efdedac22" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 24 15:23:05.074645 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:05.074608 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" podUID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 24 15:23:05.705969 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:05.705945 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" Apr 24 15:23:05.850709 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:05.850617 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/386ea3ca-da1e-4bfb-a044-af1efdedac22-kserve-provision-location\") pod \"386ea3ca-da1e-4bfb-a044-af1efdedac22\" (UID: \"386ea3ca-da1e-4bfb-a044-af1efdedac22\") " Apr 24 15:23:05.850963 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:05.850940 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/386ea3ca-da1e-4bfb-a044-af1efdedac22-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "386ea3ca-da1e-4bfb-a044-af1efdedac22" (UID: "386ea3ca-da1e-4bfb-a044-af1efdedac22"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:23:05.951841 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:05.951796 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/386ea3ca-da1e-4bfb-a044-af1efdedac22-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:23:06.079094 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:06.079047 2569 generic.go:358] "Generic (PLEG): container finished" podID="386ea3ca-da1e-4bfb-a044-af1efdedac22" containerID="c7f0924f9f0b31d4e72756b7d37b857f39335a43906324b57c643bff25a1f560" exitCode=0 Apr 24 15:23:06.079549 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:06.079125 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" Apr 24 15:23:06.079549 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:06.079135 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" event={"ID":"386ea3ca-da1e-4bfb-a044-af1efdedac22","Type":"ContainerDied","Data":"c7f0924f9f0b31d4e72756b7d37b857f39335a43906324b57c643bff25a1f560"} Apr 24 15:23:06.079549 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:06.079183 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh" event={"ID":"386ea3ca-da1e-4bfb-a044-af1efdedac22","Type":"ContainerDied","Data":"50f06fece39d4c2f3a6cb1b4048ddb27c5fb50bcd5e95c3fa6663c3de22614f1"} Apr 24 15:23:06.079549 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:06.079205 2569 scope.go:117] "RemoveContainer" containerID="c7f0924f9f0b31d4e72756b7d37b857f39335a43906324b57c643bff25a1f560" Apr 24 15:23:06.087470 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:06.087451 2569 scope.go:117] "RemoveContainer" containerID="cbf59f25343f1b54a284f8bf21a08ac559cf3e191bd00879da5ce7d3c9702223" Apr 24 15:23:06.094683 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:06.094666 2569 scope.go:117] "RemoveContainer" containerID="c7f0924f9f0b31d4e72756b7d37b857f39335a43906324b57c643bff25a1f560" Apr 24 15:23:06.094969 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:23:06.094942 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7f0924f9f0b31d4e72756b7d37b857f39335a43906324b57c643bff25a1f560\": container with ID starting with c7f0924f9f0b31d4e72756b7d37b857f39335a43906324b57c643bff25a1f560 not found: ID does not exist" containerID="c7f0924f9f0b31d4e72756b7d37b857f39335a43906324b57c643bff25a1f560" Apr 24 15:23:06.095073 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:06.094978 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7f0924f9f0b31d4e72756b7d37b857f39335a43906324b57c643bff25a1f560"} err="failed to get container status \"c7f0924f9f0b31d4e72756b7d37b857f39335a43906324b57c643bff25a1f560\": rpc error: code = NotFound desc = could not find container \"c7f0924f9f0b31d4e72756b7d37b857f39335a43906324b57c643bff25a1f560\": container with ID starting with c7f0924f9f0b31d4e72756b7d37b857f39335a43906324b57c643bff25a1f560 not found: ID does not exist" Apr 24 15:23:06.095073 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:06.095000 2569 scope.go:117] "RemoveContainer" containerID="cbf59f25343f1b54a284f8bf21a08ac559cf3e191bd00879da5ce7d3c9702223" Apr 24 15:23:06.095270 ip-10-0-129-231 kubenswrapper[2569]: 
E0424 15:23:06.095249 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbf59f25343f1b54a284f8bf21a08ac559cf3e191bd00879da5ce7d3c9702223\": container with ID starting with cbf59f25343f1b54a284f8bf21a08ac559cf3e191bd00879da5ce7d3c9702223 not found: ID does not exist" containerID="cbf59f25343f1b54a284f8bf21a08ac559cf3e191bd00879da5ce7d3c9702223" Apr 24 15:23:06.095334 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:06.095280 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf59f25343f1b54a284f8bf21a08ac559cf3e191bd00879da5ce7d3c9702223"} err="failed to get container status \"cbf59f25343f1b54a284f8bf21a08ac559cf3e191bd00879da5ce7d3c9702223\": rpc error: code = NotFound desc = could not find container \"cbf59f25343f1b54a284f8bf21a08ac559cf3e191bd00879da5ce7d3c9702223\": container with ID starting with cbf59f25343f1b54a284f8bf21a08ac559cf3e191bd00879da5ce7d3c9702223 not found: ID does not exist" Apr 24 15:23:06.099578 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:06.099551 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh"] Apr 24 15:23:06.102310 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:06.102259 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-674d4f474d-q8xvh"] Apr 24 15:23:07.828045 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:07.828015 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="386ea3ca-da1e-4bfb-a044-af1efdedac22" path="/var/lib/kubelet/pods/386ea3ca-da1e-4bfb-a044-af1efdedac22/volumes" Apr 24 15:23:15.074951 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:15.074888 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" podUID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 24 15:23:25.074925 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:25.074844 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" podUID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 24 15:23:35.074809 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:35.074763 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" podUID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 24 15:23:45.074758 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:45.074713 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" podUID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 24 15:23:55.075619 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:23:55.075568 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" podUID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 24 15:24:05.075409 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:05.075359 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" podUID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 24 15:24:06.824199 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:06.824153 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" podUID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 24 15:24:16.826038 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:16.826002 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" Apr 24 15:24:21.109248 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:21.109208 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8"] Apr 24 15:24:21.109660 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:21.109471 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" podUID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" containerName="kserve-container" containerID="cri-o://9d6370f71d2bb09e1d3bd607362ec6258ed5ca1feeded7783f4878f350b61ee3" gracePeriod=30 Apr 24 15:24:22.165003 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:22.164956 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr"] Apr 24 15:24:22.165473 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:22.165418 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="386ea3ca-da1e-4bfb-a044-af1efdedac22" containerName="kserve-container" Apr 24 15:24:22.165473 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:22.165439 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="386ea3ca-da1e-4bfb-a044-af1efdedac22" containerName="kserve-container" Apr 24 15:24:22.165473 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:22.165455 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="386ea3ca-da1e-4bfb-a044-af1efdedac22" containerName="storage-initializer" Apr 24 15:24:22.165473 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:22.165463 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="386ea3ca-da1e-4bfb-a044-af1efdedac22" containerName="storage-initializer" Apr 24 15:24:22.165702 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:22.165564 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="386ea3ca-da1e-4bfb-a044-af1efdedac22" containerName="kserve-container" Apr 24 15:24:22.168683 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:22.168659 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr" Apr 24 15:24:22.178281 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:22.178254 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr"] Apr 24 15:24:22.305041 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:22.305004 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr\" (UID: \"bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr" Apr 24 15:24:22.406506 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:22.406465 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr\" (UID: \"bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr" Apr 24 15:24:22.406846 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:22.406824 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr\" (UID: \"bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr" Apr 24 15:24:22.479585 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:22.479475 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr" Apr 24 15:24:22.595843 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:22.595810 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr"] Apr 24 15:24:22.600398 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:24:22.600365 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb2070f4_7f3e_4935_a0d7_fe863cf1d0f6.slice/crio-595ff460f440c44243c7904431d227c2b8692a679a8ad075338fd5b11c27bb09 WatchSource:0}: Error finding container 595ff460f440c44243c7904431d227c2b8692a679a8ad075338fd5b11c27bb09: Status 404 returned error can't find the container with id 595ff460f440c44243c7904431d227c2b8692a679a8ad075338fd5b11c27bb09 Apr 24 15:24:23.309877 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:23.309841 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr" event={"ID":"bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6","Type":"ContainerStarted","Data":"651b8bd0e36abb90c181e0acc0a9f14600842132758f14efd56b2184bfa93ceb"} Apr 24 15:24:23.309877 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:23.309880 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr" event={"ID":"bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6","Type":"ContainerStarted","Data":"595ff460f440c44243c7904431d227c2b8692a679a8ad075338fd5b11c27bb09"} Apr 24 15:24:25.852445 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:25.852411 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" Apr 24 15:24:26.037141 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.037043 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a14e6ff-c81b-4423-b6fd-b390a2c8f81d-kserve-provision-location\") pod \"6a14e6ff-c81b-4423-b6fd-b390a2c8f81d\" (UID: \"6a14e6ff-c81b-4423-b6fd-b390a2c8f81d\") " Apr 24 15:24:26.037141 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.037082 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6a14e6ff-c81b-4423-b6fd-b390a2c8f81d-cabundle-cert\") pod \"6a14e6ff-c81b-4423-b6fd-b390a2c8f81d\" (UID: \"6a14e6ff-c81b-4423-b6fd-b390a2c8f81d\") " Apr 24 15:24:26.037436 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.037413 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a14e6ff-c81b-4423-b6fd-b390a2c8f81d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" (UID: "6a14e6ff-c81b-4423-b6fd-b390a2c8f81d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:24:26.037484 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.037460 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a14e6ff-c81b-4423-b6fd-b390a2c8f81d-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" (UID: "6a14e6ff-c81b-4423-b6fd-b390a2c8f81d"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 15:24:26.137907 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.137870 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a14e6ff-c81b-4423-b6fd-b390a2c8f81d-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:24:26.138072 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.137923 2569 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6a14e6ff-c81b-4423-b6fd-b390a2c8f81d-cabundle-cert\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:24:26.319175 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.319080 2569 generic.go:358] "Generic (PLEG): container finished" podID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" containerID="9d6370f71d2bb09e1d3bd607362ec6258ed5ca1feeded7783f4878f350b61ee3" exitCode=0 Apr 24 15:24:26.319175 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.319151 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" Apr 24 15:24:26.319360 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.319170 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" event={"ID":"6a14e6ff-c81b-4423-b6fd-b390a2c8f81d","Type":"ContainerDied","Data":"9d6370f71d2bb09e1d3bd607362ec6258ed5ca1feeded7783f4878f350b61ee3"} Apr 24 15:24:26.319360 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.319213 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8" event={"ID":"6a14e6ff-c81b-4423-b6fd-b390a2c8f81d","Type":"ContainerDied","Data":"cd3441d02de64acc90fde98214d127b2a04297956395a87989ce4b9516607ab6"} Apr 24 15:24:26.319360 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.319232 2569 scope.go:117] "RemoveContainer" containerID="9d6370f71d2bb09e1d3bd607362ec6258ed5ca1feeded7783f4878f350b61ee3" Apr 24 15:24:26.327834 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.327810 2569 scope.go:117] "RemoveContainer" containerID="2e9e9b13ae7c1bf2d21dc28ddee5406b5760721377346e659da655fc011da899" Apr 24 15:24:26.335154 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.335133 2569 scope.go:117] "RemoveContainer" containerID="9d6370f71d2bb09e1d3bd607362ec6258ed5ca1feeded7783f4878f350b61ee3" Apr 24 15:24:26.335418 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:24:26.335398 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d6370f71d2bb09e1d3bd607362ec6258ed5ca1feeded7783f4878f350b61ee3\": container with ID starting with 9d6370f71d2bb09e1d3bd607362ec6258ed5ca1feeded7783f4878f350b61ee3 not found: ID does not exist" containerID="9d6370f71d2bb09e1d3bd607362ec6258ed5ca1feeded7783f4878f350b61ee3" Apr 24 15:24:26.335489 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.335432 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d6370f71d2bb09e1d3bd607362ec6258ed5ca1feeded7783f4878f350b61ee3"} err="failed to get container status \"9d6370f71d2bb09e1d3bd607362ec6258ed5ca1feeded7783f4878f350b61ee3\": rpc error: code = NotFound desc = could not find container \"9d6370f71d2bb09e1d3bd607362ec6258ed5ca1feeded7783f4878f350b61ee3\": container with ID starting 
with 9d6370f71d2bb09e1d3bd607362ec6258ed5ca1feeded7783f4878f350b61ee3 not found: ID does not exist" Apr 24 15:24:26.335489 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.335461 2569 scope.go:117] "RemoveContainer" containerID="2e9e9b13ae7c1bf2d21dc28ddee5406b5760721377346e659da655fc011da899" Apr 24 15:24:26.335730 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:24:26.335712 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e9e9b13ae7c1bf2d21dc28ddee5406b5760721377346e659da655fc011da899\": container with ID starting with 2e9e9b13ae7c1bf2d21dc28ddee5406b5760721377346e659da655fc011da899 not found: ID does not exist" containerID="2e9e9b13ae7c1bf2d21dc28ddee5406b5760721377346e659da655fc011da899" Apr 24 15:24:26.335773 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.335736 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e9e9b13ae7c1bf2d21dc28ddee5406b5760721377346e659da655fc011da899"} err="failed to get container status \"2e9e9b13ae7c1bf2d21dc28ddee5406b5760721377346e659da655fc011da899\": rpc error: code = NotFound desc = could not find container \"2e9e9b13ae7c1bf2d21dc28ddee5406b5760721377346e659da655fc011da899\": container with ID starting with 2e9e9b13ae7c1bf2d21dc28ddee5406b5760721377346e659da655fc011da899 not found: ID does not exist" Apr 24 15:24:26.340286 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.340264 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8"] Apr 24 15:24:26.343670 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:26.343649 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5bd7488f4d-hfwd8"] Apr 24 15:24:27.828917 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:27.828864 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" path="/var/lib/kubelet/pods/6a14e6ff-c81b-4423-b6fd-b390a2c8f81d/volumes" Apr 24 15:24:28.327650 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:28.327621 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr_bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6/storage-initializer/0.log" Apr 24 15:24:28.327827 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:28.327659 2569 generic.go:358] "Generic (PLEG): container finished" podID="bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6" containerID="651b8bd0e36abb90c181e0acc0a9f14600842132758f14efd56b2184bfa93ceb" exitCode=1 Apr 24 15:24:28.327827 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:28.327689 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr" event={"ID":"bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6","Type":"ContainerDied","Data":"651b8bd0e36abb90c181e0acc0a9f14600842132758f14efd56b2184bfa93ceb"} Apr 24 15:24:29.332388 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:29.332361 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr_bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6/storage-initializer/0.log" Apr 24 15:24:29.332776 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:29.332432 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr" 
event={"ID":"bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6","Type":"ContainerStarted","Data":"38e0e72f54adb2c5358b1ca66348e4792e63fa2d3ba8d6639bc073a962392b56"} Apr 24 15:24:32.169214 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.169182 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr"] Apr 24 15:24:32.169662 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.169517 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr" podUID="bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6" containerName="storage-initializer" containerID="cri-o://38e0e72f54adb2c5358b1ca66348e4792e63fa2d3ba8d6639bc073a962392b56" gracePeriod=30 Apr 24 15:24:32.330825 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.330800 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr_bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6/storage-initializer/1.log" Apr 24 15:24:32.331214 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.331194 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr_bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6/storage-initializer/0.log" Apr 24 15:24:32.331317 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.331272 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr" Apr 24 15:24:32.341217 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.341193 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr_bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6/storage-initializer/1.log" Apr 24 15:24:32.341552 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.341537 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr_bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6/storage-initializer/0.log" Apr 24 15:24:32.341607 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.341574 2569 generic.go:358] "Generic (PLEG): container finished" podID="bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6" containerID="38e0e72f54adb2c5358b1ca66348e4792e63fa2d3ba8d6639bc073a962392b56" exitCode=1 Apr 24 15:24:32.341657 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.341647 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr" Apr 24 15:24:32.341702 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.341674 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr" event={"ID":"bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6","Type":"ContainerDied","Data":"38e0e72f54adb2c5358b1ca66348e4792e63fa2d3ba8d6639bc073a962392b56"} Apr 24 15:24:32.341745 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.341723 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr" event={"ID":"bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6","Type":"ContainerDied","Data":"595ff460f440c44243c7904431d227c2b8692a679a8ad075338fd5b11c27bb09"} Apr 24 15:24:32.341745 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.341743 2569 scope.go:117] "RemoveContainer" containerID="38e0e72f54adb2c5358b1ca66348e4792e63fa2d3ba8d6639bc073a962392b56" Apr 24 15:24:32.350177 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.350157 2569 scope.go:117] "RemoveContainer" containerID="651b8bd0e36abb90c181e0acc0a9f14600842132758f14efd56b2184bfa93ceb" Apr 24 15:24:32.357073 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.357055 2569 scope.go:117] "RemoveContainer" containerID="38e0e72f54adb2c5358b1ca66348e4792e63fa2d3ba8d6639bc073a962392b56" Apr 24 15:24:32.357325 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:24:32.357305 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38e0e72f54adb2c5358b1ca66348e4792e63fa2d3ba8d6639bc073a962392b56\": container with ID starting with 38e0e72f54adb2c5358b1ca66348e4792e63fa2d3ba8d6639bc073a962392b56 not found: ID does not exist" containerID="38e0e72f54adb2c5358b1ca66348e4792e63fa2d3ba8d6639bc073a962392b56" Apr 24 15:24:32.357392 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.357333 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e0e72f54adb2c5358b1ca66348e4792e63fa2d3ba8d6639bc073a962392b56"} err="failed to get container status \"38e0e72f54adb2c5358b1ca66348e4792e63fa2d3ba8d6639bc073a962392b56\": rpc error: code = NotFound desc = could not find container \"38e0e72f54adb2c5358b1ca66348e4792e63fa2d3ba8d6639bc073a962392b56\": container with ID starting with 38e0e72f54adb2c5358b1ca66348e4792e63fa2d3ba8d6639bc073a962392b56 not found: ID does not exist" Apr 24 15:24:32.357392 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.357352 2569 scope.go:117] "RemoveContainer" containerID="651b8bd0e36abb90c181e0acc0a9f14600842132758f14efd56b2184bfa93ceb" Apr 24 15:24:32.357593 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:24:32.357574 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"651b8bd0e36abb90c181e0acc0a9f14600842132758f14efd56b2184bfa93ceb\": container with ID starting with 651b8bd0e36abb90c181e0acc0a9f14600842132758f14efd56b2184bfa93ceb not found: ID does not exist" containerID="651b8bd0e36abb90c181e0acc0a9f14600842132758f14efd56b2184bfa93ceb" Apr 24 15:24:32.357638 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.357601 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"651b8bd0e36abb90c181e0acc0a9f14600842132758f14efd56b2184bfa93ceb"} err="failed to get container status 
\"651b8bd0e36abb90c181e0acc0a9f14600842132758f14efd56b2184bfa93ceb\": rpc error: code = NotFound desc = could not find container \"651b8bd0e36abb90c181e0acc0a9f14600842132758f14efd56b2184bfa93ceb\": container with ID starting with 651b8bd0e36abb90c181e0acc0a9f14600842132758f14efd56b2184bfa93ceb not found: ID does not exist" Apr 24 15:24:32.392586 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.392552 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6-kserve-provision-location\") pod \"bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6\" (UID: \"bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6\") " Apr 24 15:24:32.392815 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.392793 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6" (UID: "bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:24:32.493108 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.493069 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:24:32.672131 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.672092 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr"] Apr 24 15:24:32.677722 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:32.677686 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-94466cc4-zzfqr"] Apr 24 15:24:33.226122 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.226085 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8"] Apr 24 15:24:33.226500 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.226401 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" containerName="storage-initializer" Apr 24 15:24:33.226500 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.226411 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" containerName="storage-initializer" Apr 24 15:24:33.226500 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.226419 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" containerName="kserve-container" Apr 24 15:24:33.226500 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.226424 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" containerName="kserve-container" Apr 24 15:24:33.226500 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.226433 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6" containerName="storage-initializer" Apr 24 15:24:33.226500 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.226440 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6" 
containerName="storage-initializer" Apr 24 15:24:33.226500 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.226451 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6" containerName="storage-initializer" Apr 24 15:24:33.226500 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.226456 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6" containerName="storage-initializer" Apr 24 15:24:33.226500 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.226500 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6" containerName="storage-initializer" Apr 24 15:24:33.226770 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.226510 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a14e6ff-c81b-4423-b6fd-b390a2c8f81d" containerName="kserve-container" Apr 24 15:24:33.226770 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.226593 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6" containerName="storage-initializer" Apr 24 15:24:33.230837 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.230813 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" Apr 24 15:24:33.234264 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.234237 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 15:24:33.234384 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.234299 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5qb2v\"" Apr 24 15:24:33.235109 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.235094 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 24 15:24:33.239570 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.239545 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8"] Apr 24 15:24:33.298690 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.298655 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/244bc842-5c1d-4ffb-88f4-ec12fffb9144-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8\" (UID: \"244bc842-5c1d-4ffb-88f4-ec12fffb9144\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" Apr 24 15:24:33.298690 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.298688 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/244bc842-5c1d-4ffb-88f4-ec12fffb9144-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8\" (UID: \"244bc842-5c1d-4ffb-88f4-ec12fffb9144\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" Apr 24 15:24:33.399520 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.399487 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/244bc842-5c1d-4ffb-88f4-ec12fffb9144-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8\" (UID: \"244bc842-5c1d-4ffb-88f4-ec12fffb9144\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" Apr 24 15:24:33.399520 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.399519 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/244bc842-5c1d-4ffb-88f4-ec12fffb9144-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8\" (UID: \"244bc842-5c1d-4ffb-88f4-ec12fffb9144\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" Apr 24 15:24:33.399876 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.399854 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/244bc842-5c1d-4ffb-88f4-ec12fffb9144-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8\" (UID: \"244bc842-5c1d-4ffb-88f4-ec12fffb9144\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" Apr 24 15:24:33.400115 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.400098 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/244bc842-5c1d-4ffb-88f4-ec12fffb9144-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8\" (UID: \"244bc842-5c1d-4ffb-88f4-ec12fffb9144\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" Apr 24 15:24:33.542111 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.542018 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" Apr 24 15:24:33.661205 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.661174 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8"] Apr 24 15:24:33.665023 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:24:33.664992 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod244bc842_5c1d_4ffb_88f4_ec12fffb9144.slice/crio-5028220dc640ab25a639a390467c8dfec0b151e5bab8ef31e47c57b01bbea34d WatchSource:0}: Error finding container 5028220dc640ab25a639a390467c8dfec0b151e5bab8ef31e47c57b01bbea34d: Status 404 returned error can't find the container with id 5028220dc640ab25a639a390467c8dfec0b151e5bab8ef31e47c57b01bbea34d Apr 24 15:24:33.828938 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:33.828834 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6" path="/var/lib/kubelet/pods/bb2070f4-7f3e-4935-a0d7-fe863cf1d0f6/volumes" Apr 24 15:24:34.350789 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:34.350750 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" event={"ID":"244bc842-5c1d-4ffb-88f4-ec12fffb9144","Type":"ContainerStarted","Data":"60f6b855b3c52af50c6e2ab2e3576de66acb6037042faee3fa95a5e938a7afde"} Apr 24 15:24:34.350789 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:34.350794 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" event={"ID":"244bc842-5c1d-4ffb-88f4-ec12fffb9144","Type":"ContainerStarted","Data":"5028220dc640ab25a639a390467c8dfec0b151e5bab8ef31e47c57b01bbea34d"} Apr 24 15:24:35.354709 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:35.354667 2569 generic.go:358] "Generic (PLEG): container finished" podID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" containerID="60f6b855b3c52af50c6e2ab2e3576de66acb6037042faee3fa95a5e938a7afde" exitCode=0 Apr 24 15:24:35.355124 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:35.354755 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" event={"ID":"244bc842-5c1d-4ffb-88f4-ec12fffb9144","Type":"ContainerDied","Data":"60f6b855b3c52af50c6e2ab2e3576de66acb6037042faee3fa95a5e938a7afde"} Apr 24 15:24:36.359160 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:36.359124 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" event={"ID":"244bc842-5c1d-4ffb-88f4-ec12fffb9144","Type":"ContainerStarted","Data":"03f8c92ac9304254117d916ca4e9d057052a9793e6cbb3ab5fe9a76263d938f4"} Apr 24 15:24:36.359549 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:36.359345 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" Apr 24 15:24:36.360726 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:36.360698 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" podUID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 24 
15:24:36.377749 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:36.377703 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" podStartSLOduration=3.377688652 podStartE2EDuration="3.377688652s" podCreationTimestamp="2026-04-24 15:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:24:36.375665816 +0000 UTC m=+3629.169722744" watchObservedRunningTime="2026-04-24 15:24:36.377688652 +0000 UTC m=+3629.171745553" Apr 24 15:24:37.362815 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:37.362778 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" podUID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 24 15:24:47.341722 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:47.341684 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 15:24:47.343353 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:47.343329 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 15:24:47.362932 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:47.362868 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" podUID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 24 15:24:57.363557 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:24:57.363510 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" podUID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 24 15:25:07.363728 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:07.363683 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" podUID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 24 15:25:17.363657 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:17.363610 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" podUID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 24 15:25:27.363725 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:27.363674 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" podUID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 24 15:25:37.363355 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:37.363303 2569 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" podUID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 24 15:25:47.364097 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:47.364059 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" Apr 24 15:25:53.264190 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:53.264156 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8"] Apr 24 15:25:53.264661 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:53.264420 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" podUID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" containerName="kserve-container" containerID="cri-o://03f8c92ac9304254117d916ca4e9d057052a9793e6cbb3ab5fe9a76263d938f4" gracePeriod=30 Apr 24 15:25:54.316490 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:54.316452 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl"] Apr 24 15:25:54.319837 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:54.319820 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl" Apr 24 15:25:54.329057 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:54.329029 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl"] Apr 24 15:25:54.384674 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:54.384634 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a059f2a2-f7db-4589-b1ed-58a50a089e7e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl\" (UID: \"a059f2a2-f7db-4589-b1ed-58a50a089e7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl" Apr 24 15:25:54.485436 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:54.485390 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a059f2a2-f7db-4589-b1ed-58a50a089e7e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl\" (UID: \"a059f2a2-f7db-4589-b1ed-58a50a089e7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl" Apr 24 15:25:54.485756 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:54.485738 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a059f2a2-f7db-4589-b1ed-58a50a089e7e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl\" (UID: \"a059f2a2-f7db-4589-b1ed-58a50a089e7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl" Apr 24 15:25:54.630638 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:54.630547 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl" Apr 24 15:25:54.756159 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:54.756125 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl"] Apr 24 15:25:54.759341 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:25:54.759309 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda059f2a2_f7db_4589_b1ed_58a50a089e7e.slice/crio-930ed3b98f54576df31cbe41b10c98edb7327923fef3a472ab0a24318b531dbe WatchSource:0}: Error finding container 930ed3b98f54576df31cbe41b10c98edb7327923fef3a472ab0a24318b531dbe: Status 404 returned error can't find the container with id 930ed3b98f54576df31cbe41b10c98edb7327923fef3a472ab0a24318b531dbe Apr 24 15:25:55.597258 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:55.597215 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl" event={"ID":"a059f2a2-f7db-4589-b1ed-58a50a089e7e","Type":"ContainerStarted","Data":"d1eb5268fabb6800c1ce6ac9b9c150cdacd65d73259fb10c47f144d1551623f4"} Apr 24 15:25:55.597258 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:55.597260 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl" event={"ID":"a059f2a2-f7db-4589-b1ed-58a50a089e7e","Type":"ContainerStarted","Data":"930ed3b98f54576df31cbe41b10c98edb7327923fef3a472ab0a24318b531dbe"} Apr 24 15:25:57.363261 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:57.363215 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" podUID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 24 15:25:57.907612 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:57.907587 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" Apr 24 15:25:58.017771 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.017661 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/244bc842-5c1d-4ffb-88f4-ec12fffb9144-kserve-provision-location\") pod \"244bc842-5c1d-4ffb-88f4-ec12fffb9144\" (UID: \"244bc842-5c1d-4ffb-88f4-ec12fffb9144\") " Apr 24 15:25:58.017771 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.017741 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/244bc842-5c1d-4ffb-88f4-ec12fffb9144-cabundle-cert\") pod \"244bc842-5c1d-4ffb-88f4-ec12fffb9144\" (UID: \"244bc842-5c1d-4ffb-88f4-ec12fffb9144\") " Apr 24 15:25:58.018082 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.018043 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/244bc842-5c1d-4ffb-88f4-ec12fffb9144-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "244bc842-5c1d-4ffb-88f4-ec12fffb9144" (UID: "244bc842-5c1d-4ffb-88f4-ec12fffb9144"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:25:58.018133 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.018095 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244bc842-5c1d-4ffb-88f4-ec12fffb9144-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "244bc842-5c1d-4ffb-88f4-ec12fffb9144" (UID: "244bc842-5c1d-4ffb-88f4-ec12fffb9144"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 15:25:58.118950 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.118912 2569 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/244bc842-5c1d-4ffb-88f4-ec12fffb9144-cabundle-cert\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:25:58.118950 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.118945 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/244bc842-5c1d-4ffb-88f4-ec12fffb9144-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:25:58.607817 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.607771 2569 generic.go:358] "Generic (PLEG): container finished" podID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" containerID="03f8c92ac9304254117d916ca4e9d057052a9793e6cbb3ab5fe9a76263d938f4" exitCode=0 Apr 24 15:25:58.608332 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.607858 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" event={"ID":"244bc842-5c1d-4ffb-88f4-ec12fffb9144","Type":"ContainerDied","Data":"03f8c92ac9304254117d916ca4e9d057052a9793e6cbb3ab5fe9a76263d938f4"} Apr 24 15:25:58.608332 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.607886 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" Apr 24 15:25:58.608332 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.607924 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8" event={"ID":"244bc842-5c1d-4ffb-88f4-ec12fffb9144","Type":"ContainerDied","Data":"5028220dc640ab25a639a390467c8dfec0b151e5bab8ef31e47c57b01bbea34d"} Apr 24 15:25:58.608332 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.607943 2569 scope.go:117] "RemoveContainer" containerID="03f8c92ac9304254117d916ca4e9d057052a9793e6cbb3ab5fe9a76263d938f4" Apr 24 15:25:58.616802 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.616783 2569 scope.go:117] "RemoveContainer" containerID="60f6b855b3c52af50c6e2ab2e3576de66acb6037042faee3fa95a5e938a7afde" Apr 24 15:25:58.624275 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.624250 2569 scope.go:117] "RemoveContainer" containerID="03f8c92ac9304254117d916ca4e9d057052a9793e6cbb3ab5fe9a76263d938f4" Apr 24 15:25:58.624564 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:25:58.624544 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03f8c92ac9304254117d916ca4e9d057052a9793e6cbb3ab5fe9a76263d938f4\": container with ID starting with 03f8c92ac9304254117d916ca4e9d057052a9793e6cbb3ab5fe9a76263d938f4 not found: ID does not exist" containerID="03f8c92ac9304254117d916ca4e9d057052a9793e6cbb3ab5fe9a76263d938f4" Apr 24 15:25:58.624644 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.624579 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03f8c92ac9304254117d916ca4e9d057052a9793e6cbb3ab5fe9a76263d938f4"} err="failed to get container status \"03f8c92ac9304254117d916ca4e9d057052a9793e6cbb3ab5fe9a76263d938f4\": rpc error: code = NotFound desc = could not find container \"03f8c92ac9304254117d916ca4e9d057052a9793e6cbb3ab5fe9a76263d938f4\": container with ID starting with 03f8c92ac9304254117d916ca4e9d057052a9793e6cbb3ab5fe9a76263d938f4 not found: ID does not exist" Apr 24 15:25:58.624644 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.624607 2569 scope.go:117] "RemoveContainer" containerID="60f6b855b3c52af50c6e2ab2e3576de66acb6037042faee3fa95a5e938a7afde" Apr 24 15:25:58.624880 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:25:58.624862 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60f6b855b3c52af50c6e2ab2e3576de66acb6037042faee3fa95a5e938a7afde\": container with ID starting with 60f6b855b3c52af50c6e2ab2e3576de66acb6037042faee3fa95a5e938a7afde not found: ID does not exist" containerID="60f6b855b3c52af50c6e2ab2e3576de66acb6037042faee3fa95a5e938a7afde" Apr 24 15:25:58.624981 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.624886 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f6b855b3c52af50c6e2ab2e3576de66acb6037042faee3fa95a5e938a7afde"} err="failed to get container status \"60f6b855b3c52af50c6e2ab2e3576de66acb6037042faee3fa95a5e938a7afde\": rpc error: code = NotFound desc = could not find container \"60f6b855b3c52af50c6e2ab2e3576de66acb6037042faee3fa95a5e938a7afde\": container with ID starting with 60f6b855b3c52af50c6e2ab2e3576de66acb6037042faee3fa95a5e938a7afde not found: ID does not exist" Apr 24 15:25:58.628707 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.628682 2569 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8"] Apr 24 15:25:58.634800 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:58.634776 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-f7b55c5dd-dsjg8"] Apr 24 15:25:59.613652 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:59.613623 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl_a059f2a2-f7db-4589-b1ed-58a50a089e7e/storage-initializer/0.log" Apr 24 15:25:59.614092 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:59.613659 2569 generic.go:358] "Generic (PLEG): container finished" podID="a059f2a2-f7db-4589-b1ed-58a50a089e7e" containerID="d1eb5268fabb6800c1ce6ac9b9c150cdacd65d73259fb10c47f144d1551623f4" exitCode=1 Apr 24 15:25:59.614092 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:59.613696 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl" event={"ID":"a059f2a2-f7db-4589-b1ed-58a50a089e7e","Type":"ContainerDied","Data":"d1eb5268fabb6800c1ce6ac9b9c150cdacd65d73259fb10c47f144d1551623f4"} Apr 24 15:25:59.828258 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:25:59.828213 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" path="/var/lib/kubelet/pods/244bc842-5c1d-4ffb-88f4-ec12fffb9144/volumes" Apr 24 15:26:00.617744 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:00.617718 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl_a059f2a2-f7db-4589-b1ed-58a50a089e7e/storage-initializer/0.log" Apr 24 15:26:00.618181 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:00.617797 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl" event={"ID":"a059f2a2-f7db-4589-b1ed-58a50a089e7e","Type":"ContainerStarted","Data":"1eae3557e4de79072f9426acc73bba5c121a36a78d8ce6c58268f4b9ae16ebf8"} Apr 24 15:26:03.627952 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:03.627922 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl_a059f2a2-f7db-4589-b1ed-58a50a089e7e/storage-initializer/1.log" Apr 24 15:26:03.628389 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:03.628306 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl_a059f2a2-f7db-4589-b1ed-58a50a089e7e/storage-initializer/0.log" Apr 24 15:26:03.628389 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:03.628348 2569 generic.go:358] "Generic (PLEG): container finished" podID="a059f2a2-f7db-4589-b1ed-58a50a089e7e" containerID="1eae3557e4de79072f9426acc73bba5c121a36a78d8ce6c58268f4b9ae16ebf8" exitCode=1 Apr 24 15:26:03.628499 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:03.628427 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl" event={"ID":"a059f2a2-f7db-4589-b1ed-58a50a089e7e","Type":"ContainerDied","Data":"1eae3557e4de79072f9426acc73bba5c121a36a78d8ce6c58268f4b9ae16ebf8"} Apr 24 15:26:03.628499 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:03.628470 2569 scope.go:117] 
"RemoveContainer" containerID="d1eb5268fabb6800c1ce6ac9b9c150cdacd65d73259fb10c47f144d1551623f4" Apr 24 15:26:03.628853 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:03.628834 2569 scope.go:117] "RemoveContainer" containerID="d1eb5268fabb6800c1ce6ac9b9c150cdacd65d73259fb10c47f144d1551623f4" Apr 24 15:26:03.639052 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:26:03.639012 2569 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl_kserve-ci-e2e-test_a059f2a2-f7db-4589-b1ed-58a50a089e7e_0 in pod sandbox 930ed3b98f54576df31cbe41b10c98edb7327923fef3a472ab0a24318b531dbe from index: no such id: 'd1eb5268fabb6800c1ce6ac9b9c150cdacd65d73259fb10c47f144d1551623f4'" containerID="d1eb5268fabb6800c1ce6ac9b9c150cdacd65d73259fb10c47f144d1551623f4" Apr 24 15:26:03.639150 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:03.639068 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1eb5268fabb6800c1ce6ac9b9c150cdacd65d73259fb10c47f144d1551623f4"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl_kserve-ci-e2e-test_a059f2a2-f7db-4589-b1ed-58a50a089e7e_0 in pod sandbox 930ed3b98f54576df31cbe41b10c98edb7327923fef3a472ab0a24318b531dbe from index: no such id: 'd1eb5268fabb6800c1ce6ac9b9c150cdacd65d73259fb10c47f144d1551623f4'" Apr 24 15:26:03.639267 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:26:03.639247 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl_kserve-ci-e2e-test(a059f2a2-f7db-4589-b1ed-58a50a089e7e)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl" podUID="a059f2a2-f7db-4589-b1ed-58a50a089e7e" Apr 24 15:26:04.344372 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:04.344339 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl"] Apr 24 15:26:04.633039 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:04.632945 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl_a059f2a2-f7db-4589-b1ed-58a50a089e7e/storage-initializer/1.log" Apr 24 15:26:04.761946 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:04.761923 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl_a059f2a2-f7db-4589-b1ed-58a50a089e7e/storage-initializer/1.log" Apr 24 15:26:04.762070 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:04.761987 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl" Apr 24 15:26:04.877362 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:04.877323 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a059f2a2-f7db-4589-b1ed-58a50a089e7e-kserve-provision-location\") pod \"a059f2a2-f7db-4589-b1ed-58a50a089e7e\" (UID: \"a059f2a2-f7db-4589-b1ed-58a50a089e7e\") " Apr 24 15:26:04.877607 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:04.877584 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a059f2a2-f7db-4589-b1ed-58a50a089e7e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a059f2a2-f7db-4589-b1ed-58a50a089e7e" (UID: "a059f2a2-f7db-4589-b1ed-58a50a089e7e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:26:04.978666 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:04.978609 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a059f2a2-f7db-4589-b1ed-58a50a089e7e-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\"" Apr 24 15:26:05.370031 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.369946 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd"] Apr 24 15:26:05.370284 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.370271 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" containerName="storage-initializer" Apr 24 15:26:05.370330 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.370286 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" containerName="storage-initializer" Apr 24 15:26:05.370330 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.370309 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" containerName="kserve-container" Apr 24 15:26:05.370330 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.370316 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" containerName="kserve-container" Apr 24 15:26:05.370428 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.370331 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a059f2a2-f7db-4589-b1ed-58a50a089e7e" containerName="storage-initializer" Apr 24 15:26:05.370428 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.370337 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a059f2a2-f7db-4589-b1ed-58a50a089e7e" containerName="storage-initializer" Apr 24 15:26:05.370428 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.370347 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a059f2a2-f7db-4589-b1ed-58a50a089e7e" containerName="storage-initializer" Apr 24 15:26:05.370428 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.370352 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a059f2a2-f7db-4589-b1ed-58a50a089e7e" containerName="storage-initializer" Apr 24 15:26:05.370428 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.370418 2569 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="a059f2a2-f7db-4589-b1ed-58a50a089e7e" containerName="storage-initializer" Apr 24 15:26:05.370428 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.370428 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a059f2a2-f7db-4589-b1ed-58a50a089e7e" containerName="storage-initializer" Apr 24 15:26:05.370661 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.370435 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="244bc842-5c1d-4ffb-88f4-ec12fffb9144" containerName="kserve-container" Apr 24 15:26:05.374979 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.374961 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" Apr 24 15:26:05.377032 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.377008 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 15:26:05.379776 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.379748 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd"] Apr 24 15:26:05.483080 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.483028 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9078dad8-4695-4963-b506-962a4933c2b5-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd\" (UID: \"9078dad8-4695-4963-b506-962a4933c2b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" Apr 24 15:26:05.483271 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.483121 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9078dad8-4695-4963-b506-962a4933c2b5-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd\" (UID: \"9078dad8-4695-4963-b506-962a4933c2b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" Apr 24 15:26:05.584454 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.584411 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9078dad8-4695-4963-b506-962a4933c2b5-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd\" (UID: \"9078dad8-4695-4963-b506-962a4933c2b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" Apr 24 15:26:05.584644 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.584487 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9078dad8-4695-4963-b506-962a4933c2b5-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd\" (UID: \"9078dad8-4695-4963-b506-962a4933c2b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" Apr 24 15:26:05.584864 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.584844 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9078dad8-4695-4963-b506-962a4933c2b5-kserve-provision-location\") pod 
\"isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd\" (UID: \"9078dad8-4695-4963-b506-962a4933c2b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" Apr 24 15:26:05.585080 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.585060 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9078dad8-4695-4963-b506-962a4933c2b5-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd\" (UID: \"9078dad8-4695-4963-b506-962a4933c2b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" Apr 24 15:26:05.637773 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.637682 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl_a059f2a2-f7db-4589-b1ed-58a50a089e7e/storage-initializer/1.log" Apr 24 15:26:05.638199 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.637794 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl" Apr 24 15:26:05.638199 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.637806 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl" event={"ID":"a059f2a2-f7db-4589-b1ed-58a50a089e7e","Type":"ContainerDied","Data":"930ed3b98f54576df31cbe41b10c98edb7327923fef3a472ab0a24318b531dbe"} Apr 24 15:26:05.638199 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.637848 2569 scope.go:117] "RemoveContainer" containerID="1eae3557e4de79072f9426acc73bba5c121a36a78d8ce6c58268f4b9ae16ebf8" Apr 24 15:26:05.670379 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.670341 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl"] Apr 24 15:26:05.673796 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.673764 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-58bb75b9f8-djcbl"] Apr 24 15:26:05.687322 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.687299 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" Apr 24 15:26:05.813880 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.813854 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd"] Apr 24 15:26:05.816483 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:26:05.816454 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9078dad8_4695_4963_b506_962a4933c2b5.slice/crio-448a3c32e8682de165263e83bafc1ab846f8ce94211e4a92c61c22d4e8fbca09 WatchSource:0}: Error finding container 448a3c32e8682de165263e83bafc1ab846f8ce94211e4a92c61c22d4e8fbca09: Status 404 returned error can't find the container with id 448a3c32e8682de165263e83bafc1ab846f8ce94211e4a92c61c22d4e8fbca09 Apr 24 15:26:05.827456 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:05.827426 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a059f2a2-f7db-4589-b1ed-58a50a089e7e" path="/var/lib/kubelet/pods/a059f2a2-f7db-4589-b1ed-58a50a089e7e/volumes" Apr 24 15:26:06.642866 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:06.642823 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" event={"ID":"9078dad8-4695-4963-b506-962a4933c2b5","Type":"ContainerStarted","Data":"bb81e29d2c7d4f0475969e9223f924970d1f343f414c4282119ff35f40ab0714"} Apr 24 15:26:06.643320 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:06.642876 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" event={"ID":"9078dad8-4695-4963-b506-962a4933c2b5","Type":"ContainerStarted","Data":"448a3c32e8682de165263e83bafc1ab846f8ce94211e4a92c61c22d4e8fbca09"} Apr 24 15:26:07.648454 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:07.648418 2569 generic.go:358] "Generic (PLEG): container finished" podID="9078dad8-4695-4963-b506-962a4933c2b5" containerID="bb81e29d2c7d4f0475969e9223f924970d1f343f414c4282119ff35f40ab0714" exitCode=0 Apr 24 15:26:07.648937 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:07.648505 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" event={"ID":"9078dad8-4695-4963-b506-962a4933c2b5","Type":"ContainerDied","Data":"bb81e29d2c7d4f0475969e9223f924970d1f343f414c4282119ff35f40ab0714"} Apr 24 15:26:08.653449 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:08.653405 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" event={"ID":"9078dad8-4695-4963-b506-962a4933c2b5","Type":"ContainerStarted","Data":"dcf03dec499689b8ef2e881da2068eb7c0853d29a1cc4327af7cca799925978a"} Apr 24 15:26:08.653962 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:08.653563 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" Apr 24 15:26:08.654977 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:08.654949 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" podUID="9078dad8-4695-4963-b506-962a4933c2b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused" 
Apr 24 15:26:08.668561 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:08.668509 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" podStartSLOduration=3.668493775 podStartE2EDuration="3.668493775s" podCreationTimestamp="2026-04-24 15:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:26:08.666470788 +0000 UTC m=+3721.460527690" watchObservedRunningTime="2026-04-24 15:26:08.668493775 +0000 UTC m=+3721.462550676"
Apr 24 15:26:09.657226 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:09.657181 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" podUID="9078dad8-4695-4963-b506-962a4933c2b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused"
Apr 24 15:26:19.657513 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:19.657461 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" podUID="9078dad8-4695-4963-b506-962a4933c2b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused"
Apr 24 15:26:29.658245 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:29.658197 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" podUID="9078dad8-4695-4963-b506-962a4933c2b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused"
Apr 24 15:26:39.657851 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:39.657806 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" podUID="9078dad8-4695-4963-b506-962a4933c2b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused"
Apr 24 15:26:49.657680 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:49.657636 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" podUID="9078dad8-4695-4963-b506-962a4933c2b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused"
Apr 24 15:26:59.658120 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:26:59.658029 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" podUID="9078dad8-4695-4963-b506-962a4933c2b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused"
Apr 24 15:27:09.657992 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:09.657946 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" podUID="9078dad8-4695-4963-b506-962a4933c2b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused"
Apr 24 15:27:19.659313 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:19.659282 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd"
Apr 24 15:27:25.410824 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:25.410785 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd"]
Apr 24 15:27:25.411254 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:25.411115 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" podUID="9078dad8-4695-4963-b506-962a4933c2b5" containerName="kserve-container" containerID="cri-o://dcf03dec499689b8ef2e881da2068eb7c0853d29a1cc4327af7cca799925978a" gracePeriod=30
Apr 24 15:27:26.449647 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:26.449611 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k"]
Apr 24 15:27:26.453043 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:26.453021 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k"
Apr 24 15:27:26.460108 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:26.460078 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k"]
Apr 24 15:27:26.471853 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:26.471824 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a343793a-5bd5-411b-8215-dc6e88ea0fe3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k\" (UID: \"a343793a-5bd5-411b-8215-dc6e88ea0fe3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k"
Apr 24 15:27:26.572933 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:26.572877 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a343793a-5bd5-411b-8215-dc6e88ea0fe3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k\" (UID: \"a343793a-5bd5-411b-8215-dc6e88ea0fe3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k"
Apr 24 15:27:26.573294 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:26.573276 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a343793a-5bd5-411b-8215-dc6e88ea0fe3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k\" (UID: \"a343793a-5bd5-411b-8215-dc6e88ea0fe3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k"
Apr 24 15:27:26.764565 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:26.764469 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k"
Apr 24 15:27:26.886499 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:26.886319 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k"]
Apr 24 15:27:27.900193 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:27.900155 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k" event={"ID":"a343793a-5bd5-411b-8215-dc6e88ea0fe3","Type":"ContainerStarted","Data":"af164514377358ad38e9f87690d71126eba8b248738d98ac0ad20fd7eb8f72c6"}
Apr 24 15:27:27.900193 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:27.900190 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k" event={"ID":"a343793a-5bd5-411b-8215-dc6e88ea0fe3","Type":"ContainerStarted","Data":"ce23c724ae5b593c91db9cd39cf9c8956b987988f1d8efaa77973f95d2c0679f"}
Apr 24 15:27:29.657437 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:29.657387 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" podUID="9078dad8-4695-4963-b506-962a4933c2b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused"
Apr 24 15:27:29.908686 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:29.908607 2569 generic.go:358] "Generic (PLEG): container finished" podID="9078dad8-4695-4963-b506-962a4933c2b5" containerID="dcf03dec499689b8ef2e881da2068eb7c0853d29a1cc4327af7cca799925978a" exitCode=0
Apr 24 15:27:29.908686 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:29.908659 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" event={"ID":"9078dad8-4695-4963-b506-962a4933c2b5","Type":"ContainerDied","Data":"dcf03dec499689b8ef2e881da2068eb7c0853d29a1cc4327af7cca799925978a"}
Apr 24 15:27:29.952888 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:29.952862 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd"
Apr 24 15:27:30.000369 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:30.000334 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9078dad8-4695-4963-b506-962a4933c2b5-kserve-provision-location\") pod \"9078dad8-4695-4963-b506-962a4933c2b5\" (UID: \"9078dad8-4695-4963-b506-962a4933c2b5\") "
Apr 24 15:27:30.000527 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:30.000393 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9078dad8-4695-4963-b506-962a4933c2b5-cabundle-cert\") pod \"9078dad8-4695-4963-b506-962a4933c2b5\" (UID: \"9078dad8-4695-4963-b506-962a4933c2b5\") "
Apr 24 15:27:30.000725 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:30.000696 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9078dad8-4695-4963-b506-962a4933c2b5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9078dad8-4695-4963-b506-962a4933c2b5" (UID: "9078dad8-4695-4963-b506-962a4933c2b5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 15:27:30.000838 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:30.000811 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9078dad8-4695-4963-b506-962a4933c2b5-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "9078dad8-4695-4963-b506-962a4933c2b5" (UID: "9078dad8-4695-4963-b506-962a4933c2b5"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 15:27:30.101697 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:30.101661 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9078dad8-4695-4963-b506-962a4933c2b5-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 15:27:30.101697 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:30.101691 2569 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9078dad8-4695-4963-b506-962a4933c2b5-cabundle-cert\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 15:27:30.912928 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:30.912882 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd"
Apr 24 15:27:30.913461 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:30.912880 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd" event={"ID":"9078dad8-4695-4963-b506-962a4933c2b5","Type":"ContainerDied","Data":"448a3c32e8682de165263e83bafc1ab846f8ce94211e4a92c61c22d4e8fbca09"}
Apr 24 15:27:30.913461 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:30.913025 2569 scope.go:117] "RemoveContainer" containerID="dcf03dec499689b8ef2e881da2068eb7c0853d29a1cc4327af7cca799925978a"
Apr 24 15:27:30.914498 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:30.914480 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k_a343793a-5bd5-411b-8215-dc6e88ea0fe3/storage-initializer/0.log"
Apr 24 15:27:30.914609 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:30.914518 2569 generic.go:358] "Generic (PLEG): container finished" podID="a343793a-5bd5-411b-8215-dc6e88ea0fe3" containerID="af164514377358ad38e9f87690d71126eba8b248738d98ac0ad20fd7eb8f72c6" exitCode=1
Apr 24 15:27:30.914609 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:30.914549 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k" event={"ID":"a343793a-5bd5-411b-8215-dc6e88ea0fe3","Type":"ContainerDied","Data":"af164514377358ad38e9f87690d71126eba8b248738d98ac0ad20fd7eb8f72c6"}
Apr 24 15:27:30.922443 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:30.922425 2569 scope.go:117] "RemoveContainer" containerID="bb81e29d2c7d4f0475969e9223f924970d1f343f414c4282119ff35f40ab0714"
Apr 24 15:27:30.941230 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:30.941205 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd"]
Apr 24 15:27:30.946051 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:30.946026 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-bdf9d7997-5jhnd"]
Apr 24 15:27:31.827840 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:31.827802 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9078dad8-4695-4963-b506-962a4933c2b5" path="/var/lib/kubelet/pods/9078dad8-4695-4963-b506-962a4933c2b5/volumes"
Apr 24 15:27:31.919753 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:31.919724 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k_a343793a-5bd5-411b-8215-dc6e88ea0fe3/storage-initializer/0.log"
Apr 24 15:27:31.920238 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:31.919854 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k" event={"ID":"a343793a-5bd5-411b-8215-dc6e88ea0fe3","Type":"ContainerStarted","Data":"6b1e61e1f38aa89d794d50a5bd27acd1d817f256cb32bce617ec145b62d1c20f"}
Apr 24 15:27:33.928247 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:33.928220 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k_a343793a-5bd5-411b-8215-dc6e88ea0fe3/storage-initializer/1.log"
Apr 24 15:27:33.928685 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:33.928603 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k_a343793a-5bd5-411b-8215-dc6e88ea0fe3/storage-initializer/0.log"
Apr 24 15:27:33.928685 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:33.928645 2569 generic.go:358] "Generic (PLEG): container finished" podID="a343793a-5bd5-411b-8215-dc6e88ea0fe3" containerID="6b1e61e1f38aa89d794d50a5bd27acd1d817f256cb32bce617ec145b62d1c20f" exitCode=1
Apr 24 15:27:33.928801 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:33.928713 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k" event={"ID":"a343793a-5bd5-411b-8215-dc6e88ea0fe3","Type":"ContainerDied","Data":"6b1e61e1f38aa89d794d50a5bd27acd1d817f256cb32bce617ec145b62d1c20f"}
Apr 24 15:27:33.928801 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:33.928754 2569 scope.go:117] "RemoveContainer" containerID="af164514377358ad38e9f87690d71126eba8b248738d98ac0ad20fd7eb8f72c6"
Apr 24 15:27:33.929133 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:33.929118 2569 scope.go:117] "RemoveContainer" containerID="af164514377358ad38e9f87690d71126eba8b248738d98ac0ad20fd7eb8f72c6"
Apr 24 15:27:33.939454 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:27:33.939425 2569 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k_kserve-ci-e2e-test_a343793a-5bd5-411b-8215-dc6e88ea0fe3_0 in pod sandbox ce23c724ae5b593c91db9cd39cf9c8956b987988f1d8efaa77973f95d2c0679f from index: no such id: 'af164514377358ad38e9f87690d71126eba8b248738d98ac0ad20fd7eb8f72c6'" containerID="af164514377358ad38e9f87690d71126eba8b248738d98ac0ad20fd7eb8f72c6"
Apr 24 15:27:33.939528 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:27:33.939474 2569 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k_kserve-ci-e2e-test_a343793a-5bd5-411b-8215-dc6e88ea0fe3_0 in pod sandbox ce23c724ae5b593c91db9cd39cf9c8956b987988f1d8efaa77973f95d2c0679f from index: no such id: 'af164514377358ad38e9f87690d71126eba8b248738d98ac0ad20fd7eb8f72c6'; Skipping pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k_kserve-ci-e2e-test(a343793a-5bd5-411b-8215-dc6e88ea0fe3)\"" logger="UnhandledError"
Apr 24 15:27:33.940805 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:27:33.940785 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k_kserve-ci-e2e-test(a343793a-5bd5-411b-8215-dc6e88ea0fe3)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k" podUID="a343793a-5bd5-411b-8215-dc6e88ea0fe3"
Apr 24 15:27:34.932728 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:34.932700 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k_a343793a-5bd5-411b-8215-dc6e88ea0fe3/storage-initializer/1.log"
Apr 24 15:27:36.515108 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:36.515074 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k"]
Apr 24 15:27:36.646112 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:36.646089 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k_a343793a-5bd5-411b-8215-dc6e88ea0fe3/storage-initializer/1.log"
Apr 24 15:27:36.646249 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:36.646148 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k"
Apr 24 15:27:36.757778 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:36.757739 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a343793a-5bd5-411b-8215-dc6e88ea0fe3-kserve-provision-location\") pod \"a343793a-5bd5-411b-8215-dc6e88ea0fe3\" (UID: \"a343793a-5bd5-411b-8215-dc6e88ea0fe3\") "
Apr 24 15:27:36.758096 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:36.758070 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a343793a-5bd5-411b-8215-dc6e88ea0fe3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a343793a-5bd5-411b-8215-dc6e88ea0fe3" (UID: "a343793a-5bd5-411b-8215-dc6e88ea0fe3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 15:27:36.858954 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:36.858821 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a343793a-5bd5-411b-8215-dc6e88ea0fe3-kserve-provision-location\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 15:27:36.940053 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:36.940028 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k_a343793a-5bd5-411b-8215-dc6e88ea0fe3/storage-initializer/1.log"
Apr 24 15:27:36.940215 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:36.940094 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k" event={"ID":"a343793a-5bd5-411b-8215-dc6e88ea0fe3","Type":"ContainerDied","Data":"ce23c724ae5b593c91db9cd39cf9c8956b987988f1d8efaa77973f95d2c0679f"}
Apr 24 15:27:36.940215 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:36.940131 2569 scope.go:117] "RemoveContainer" containerID="6b1e61e1f38aa89d794d50a5bd27acd1d817f256cb32bce617ec145b62d1c20f"
Apr 24 15:27:36.940215 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:36.940139 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k"
Apr 24 15:27:36.974194 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:36.974148 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k"]
Apr 24 15:27:36.977106 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:36.977079 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b694d9b8c-x7k7k"]
Apr 24 15:27:37.827984 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:37.827948 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a343793a-5bd5-411b-8215-dc6e88ea0fe3" path="/var/lib/kubelet/pods/a343793a-5bd5-411b-8215-dc6e88ea0fe3/volumes"
Apr 24 15:27:39.024307 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.024270 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nj8c8/must-gather-rkh4d"]
Apr 24 15:27:39.024682 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.024600 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a343793a-5bd5-411b-8215-dc6e88ea0fe3" containerName="storage-initializer"
Apr 24 15:27:39.024682 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.024612 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a343793a-5bd5-411b-8215-dc6e88ea0fe3" containerName="storage-initializer"
Apr 24 15:27:39.024682 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.024628 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9078dad8-4695-4963-b506-962a4933c2b5" containerName="storage-initializer"
Apr 24 15:27:39.024682 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.024637 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9078dad8-4695-4963-b506-962a4933c2b5" containerName="storage-initializer"
Apr 24 15:27:39.024682 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.024649 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9078dad8-4695-4963-b506-962a4933c2b5" containerName="kserve-container"
Apr 24 15:27:39.024682 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.024655 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9078dad8-4695-4963-b506-962a4933c2b5" containerName="kserve-container"
Apr 24 15:27:39.024870 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.024698 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="9078dad8-4695-4963-b506-962a4933c2b5" containerName="kserve-container"
Apr 24 15:27:39.024870 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.024712 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a343793a-5bd5-411b-8215-dc6e88ea0fe3" containerName="storage-initializer"
Apr 24 15:27:39.024870 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.024720 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a343793a-5bd5-411b-8215-dc6e88ea0fe3" containerName="storage-initializer"
Apr 24 15:27:39.024870 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.024776 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a343793a-5bd5-411b-8215-dc6e88ea0fe3" containerName="storage-initializer"
Apr 24 15:27:39.024870 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.024783 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a343793a-5bd5-411b-8215-dc6e88ea0fe3" containerName="storage-initializer"
Apr 24 15:27:39.029042 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.029022 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nj8c8/must-gather-rkh4d"
Apr 24 15:27:39.031155 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.031129 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nj8c8\"/\"openshift-service-ca.crt\""
Apr 24 15:27:39.031273 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.031182 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nj8c8\"/\"kube-root-ca.crt\""
Apr 24 15:27:39.036564 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.036539 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nj8c8/must-gather-rkh4d"]
Apr 24 15:27:39.079782 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.079752 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/efba8b38-547f-451c-8bf6-90717fcb934e-must-gather-output\") pod \"must-gather-rkh4d\" (UID: \"efba8b38-547f-451c-8bf6-90717fcb934e\") " pod="openshift-must-gather-nj8c8/must-gather-rkh4d"
Apr 24 15:27:39.079986 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.079812 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjjp4\" (UniqueName: \"kubernetes.io/projected/efba8b38-547f-451c-8bf6-90717fcb934e-kube-api-access-kjjp4\") pod \"must-gather-rkh4d\" (UID: \"efba8b38-547f-451c-8bf6-90717fcb934e\") " pod="openshift-must-gather-nj8c8/must-gather-rkh4d"
Apr 24 15:27:39.180983 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.180940 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjjp4\" (UniqueName: \"kubernetes.io/projected/efba8b38-547f-451c-8bf6-90717fcb934e-kube-api-access-kjjp4\") pod \"must-gather-rkh4d\" (UID: \"efba8b38-547f-451c-8bf6-90717fcb934e\") " pod="openshift-must-gather-nj8c8/must-gather-rkh4d"
Apr 24 15:27:39.181174 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.180998 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/efba8b38-547f-451c-8bf6-90717fcb934e-must-gather-output\") pod \"must-gather-rkh4d\" (UID: \"efba8b38-547f-451c-8bf6-90717fcb934e\") " pod="openshift-must-gather-nj8c8/must-gather-rkh4d"
Apr 24 15:27:39.181295 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.181280 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/efba8b38-547f-451c-8bf6-90717fcb934e-must-gather-output\") pod \"must-gather-rkh4d\" (UID: \"efba8b38-547f-451c-8bf6-90717fcb934e\") " pod="openshift-must-gather-nj8c8/must-gather-rkh4d"
Apr 24 15:27:39.190205 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.190181 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjjp4\" (UniqueName: \"kubernetes.io/projected/efba8b38-547f-451c-8bf6-90717fcb934e-kube-api-access-kjjp4\") pod \"must-gather-rkh4d\" (UID: \"efba8b38-547f-451c-8bf6-90717fcb934e\") " pod="openshift-must-gather-nj8c8/must-gather-rkh4d"
Apr 24 15:27:39.347545 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.347447 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nj8c8/must-gather-rkh4d"
Apr 24 15:27:39.467300 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.467271 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nj8c8/must-gather-rkh4d"]
Apr 24 15:27:39.470274 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:27:39.470243 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefba8b38_547f_451c_8bf6_90717fcb934e.slice/crio-2be12fb16da5793edac4f241b83fbf42e69a3a5aec854d2395d54e699a68edba WatchSource:0}: Error finding container 2be12fb16da5793edac4f241b83fbf42e69a3a5aec854d2395d54e699a68edba: Status 404 returned error can't find the container with id 2be12fb16da5793edac4f241b83fbf42e69a3a5aec854d2395d54e699a68edba
Apr 24 15:27:39.951414 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:39.951377 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nj8c8/must-gather-rkh4d" event={"ID":"efba8b38-547f-451c-8bf6-90717fcb934e","Type":"ContainerStarted","Data":"2be12fb16da5793edac4f241b83fbf42e69a3a5aec854d2395d54e699a68edba"}
Apr 24 15:27:43.967106 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:43.967053 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nj8c8/must-gather-rkh4d" event={"ID":"efba8b38-547f-451c-8bf6-90717fcb934e","Type":"ContainerStarted","Data":"2978459b68f0acafccdbd59dd59635cce96cd3e930080848489020f387e964bd"}
Apr 24 15:27:43.967497 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:43.967114 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nj8c8/must-gather-rkh4d" event={"ID":"efba8b38-547f-451c-8bf6-90717fcb934e","Type":"ContainerStarted","Data":"580f89cb32656307aabbed1b740d77836ac4c304cd5de8c0044bf482dd801b49"}
Apr 24 15:27:43.982156 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:27:43.982088 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nj8c8/must-gather-rkh4d" podStartSLOduration=0.815389978 podStartE2EDuration="4.982066914s" podCreationTimestamp="2026-04-24 15:27:39 +0000 UTC" firstStartedPulling="2026-04-24 15:27:39.472338404 +0000 UTC m=+3812.266395301" lastFinishedPulling="2026-04-24 15:27:43.639015352 +0000 UTC m=+3816.433072237" observedRunningTime="2026-04-24 15:27:43.981576866 +0000 UTC m=+3816.775633767" watchObservedRunningTime="2026-04-24 15:27:43.982066914 +0000 UTC m=+3816.776123816"
Apr 24 15:28:05.038131 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:05.038036 2569 generic.go:358] "Generic (PLEG): container finished" podID="efba8b38-547f-451c-8bf6-90717fcb934e" containerID="580f89cb32656307aabbed1b740d77836ac4c304cd5de8c0044bf482dd801b49" exitCode=0
Apr 24 15:28:05.038131 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:05.038080 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nj8c8/must-gather-rkh4d" event={"ID":"efba8b38-547f-451c-8bf6-90717fcb934e","Type":"ContainerDied","Data":"580f89cb32656307aabbed1b740d77836ac4c304cd5de8c0044bf482dd801b49"}
Apr 24 15:28:05.038599 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:05.038426 2569 scope.go:117] "RemoveContainer" containerID="580f89cb32656307aabbed1b740d77836ac4c304cd5de8c0044bf482dd801b49"
Apr 24 15:28:05.771431 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:05.771400 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nj8c8_must-gather-rkh4d_efba8b38-547f-451c-8bf6-90717fcb934e/gather/0.log"
Apr 24 15:28:06.365142 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:06.365108 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6vl4w/must-gather-6k2xh"]
Apr 24 15:28:06.368565 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:06.368546 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6vl4w/must-gather-6k2xh"
Apr 24 15:28:06.370685 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:06.370667 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6vl4w\"/\"kube-root-ca.crt\""
Apr 24 15:28:06.371099 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:06.371078 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6vl4w\"/\"default-dockercfg-9rknq\""
Apr 24 15:28:06.371186 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:06.371110 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6vl4w\"/\"openshift-service-ca.crt\""
Apr 24 15:28:06.374990 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:06.374968 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6vl4w/must-gather-6k2xh"]
Apr 24 15:28:06.531985 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:06.531947 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5437eeea-7f08-4358-a911-0b3395ebc7e3-must-gather-output\") pod \"must-gather-6k2xh\" (UID: \"5437eeea-7f08-4358-a911-0b3395ebc7e3\") " pod="openshift-must-gather-6vl4w/must-gather-6k2xh"
Apr 24 15:28:06.531985 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:06.531985 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qt2m\" (UniqueName: \"kubernetes.io/projected/5437eeea-7f08-4358-a911-0b3395ebc7e3-kube-api-access-9qt2m\") pod \"must-gather-6k2xh\" (UID: \"5437eeea-7f08-4358-a911-0b3395ebc7e3\") " pod="openshift-must-gather-6vl4w/must-gather-6k2xh"
Apr 24 15:28:06.632951 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:06.632826 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5437eeea-7f08-4358-a911-0b3395ebc7e3-must-gather-output\") pod \"must-gather-6k2xh\" (UID: \"5437eeea-7f08-4358-a911-0b3395ebc7e3\") " pod="openshift-must-gather-6vl4w/must-gather-6k2xh"
Apr 24 15:28:06.632951 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:06.632868 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qt2m\" (UniqueName: \"kubernetes.io/projected/5437eeea-7f08-4358-a911-0b3395ebc7e3-kube-api-access-9qt2m\") pod \"must-gather-6k2xh\" (UID: \"5437eeea-7f08-4358-a911-0b3395ebc7e3\") " pod="openshift-must-gather-6vl4w/must-gather-6k2xh"
Apr 24 15:28:06.633215 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:06.633194 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5437eeea-7f08-4358-a911-0b3395ebc7e3-must-gather-output\") pod \"must-gather-6k2xh\" (UID: \"5437eeea-7f08-4358-a911-0b3395ebc7e3\") " pod="openshift-must-gather-6vl4w/must-gather-6k2xh"
Apr 24 15:28:06.641047 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:06.641020 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qt2m\" (UniqueName: \"kubernetes.io/projected/5437eeea-7f08-4358-a911-0b3395ebc7e3-kube-api-access-9qt2m\") pod \"must-gather-6k2xh\" (UID: \"5437eeea-7f08-4358-a911-0b3395ebc7e3\") " pod="openshift-must-gather-6vl4w/must-gather-6k2xh"
Apr 24 15:28:06.678117 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:06.678079 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6vl4w/must-gather-6k2xh"
Apr 24 15:28:06.802047 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:06.802005 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6vl4w/must-gather-6k2xh"]
Apr 24 15:28:06.804762 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:28:06.804722 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5437eeea_7f08_4358_a911_0b3395ebc7e3.slice/crio-17459623f54527840c7c278dfbd8993294c65b80da34b8080233d2f2292094cf WatchSource:0}: Error finding container 17459623f54527840c7c278dfbd8993294c65b80da34b8080233d2f2292094cf: Status 404 returned error can't find the container with id 17459623f54527840c7c278dfbd8993294c65b80da34b8080233d2f2292094cf
Apr 24 15:28:06.806448 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:06.806433 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 15:28:07.044476 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:07.044439 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6vl4w/must-gather-6k2xh" event={"ID":"5437eeea-7f08-4358-a911-0b3395ebc7e3","Type":"ContainerStarted","Data":"17459623f54527840c7c278dfbd8993294c65b80da34b8080233d2f2292094cf"}
Apr 24 15:28:08.050662 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:08.050617 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6vl4w/must-gather-6k2xh" event={"ID":"5437eeea-7f08-4358-a911-0b3395ebc7e3","Type":"ContainerStarted","Data":"1262b5b597fd48203f019bcd668ab36a3abe39342586634ac2a6b7d11113eb25"}
Apr 24 15:28:08.050662 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:08.050659 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6vl4w/must-gather-6k2xh" event={"ID":"5437eeea-7f08-4358-a911-0b3395ebc7e3","Type":"ContainerStarted","Data":"78fea2ba0d9b79917dc2d9c57057b0b195c92f4560dc6181e66881d134be9793"}
Apr 24 15:28:08.082971 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:08.082918 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6vl4w/must-gather-6k2xh" podStartSLOduration=1.328953268 podStartE2EDuration="2.082887139s" podCreationTimestamp="2026-04-24 15:28:06 +0000 UTC" firstStartedPulling="2026-04-24 15:28:06.806555012 +0000 UTC m=+3839.600611891" lastFinishedPulling="2026-04-24 15:28:07.560488883 +0000 UTC m=+3840.354545762" observedRunningTime="2026-04-24 15:28:08.079857613 +0000 UTC m=+3840.873914581" watchObservedRunningTime="2026-04-24 15:28:08.082887139 +0000 UTC m=+3840.876944041"
Apr 24 15:28:09.258704 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:09.258665 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-jbgh4_6e81e6c3-58b8-46a4-8e92-6b8943176758/global-pull-secret-syncer/0.log"
Apr 24 15:28:09.325545 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:09.325499 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-k8phw_c5b32595-8f39-477c-8d42-8cc919341875/konnectivity-agent/0.log"
Apr 24 15:28:09.444137 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:09.444105 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-231.ec2.internal_b8b57e2ee8dbdeeec77b109df4b31692/haproxy/0.log"
Apr 24 15:28:11.211101 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:11.211043 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nj8c8/must-gather-rkh4d"]
Apr 24 15:28:11.212345 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:11.212305 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-nj8c8/must-gather-rkh4d" podUID="efba8b38-547f-451c-8bf6-90717fcb934e" containerName="copy" containerID="cri-o://2978459b68f0acafccdbd59dd59635cce96cd3e930080848489020f387e964bd" gracePeriod=2
Apr 24 15:28:11.214387 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:11.214353 2569 status_manager.go:895] "Failed to get status for pod" podUID="efba8b38-547f-451c-8bf6-90717fcb934e" pod="openshift-must-gather-nj8c8/must-gather-rkh4d" err="pods \"must-gather-rkh4d\" is forbidden: User \"system:node:ip-10-0-129-231.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-nj8c8\": no relationship found between node 'ip-10-0-129-231.ec2.internal' and this object"
Apr 24 15:28:11.217526 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:11.217505 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nj8c8/must-gather-rkh4d"]
Apr 24 15:28:11.592040 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:11.591987 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nj8c8_must-gather-rkh4d_efba8b38-547f-451c-8bf6-90717fcb934e/copy/0.log"
Apr 24 15:28:11.592549 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:11.592525 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nj8c8/must-gather-rkh4d"
Apr 24 15:28:11.594270 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:11.594235 2569 status_manager.go:895] "Failed to get status for pod" podUID="efba8b38-547f-451c-8bf6-90717fcb934e" pod="openshift-must-gather-nj8c8/must-gather-rkh4d" err="pods \"must-gather-rkh4d\" is forbidden: User \"system:node:ip-10-0-129-231.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-nj8c8\": no relationship found between node 'ip-10-0-129-231.ec2.internal' and this object"
Apr 24 15:28:11.787228 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:11.787113 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjjp4\" (UniqueName: \"kubernetes.io/projected/efba8b38-547f-451c-8bf6-90717fcb934e-kube-api-access-kjjp4\") pod \"efba8b38-547f-451c-8bf6-90717fcb934e\" (UID: \"efba8b38-547f-451c-8bf6-90717fcb934e\") "
Apr 24 15:28:11.787228 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:11.787180 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/efba8b38-547f-451c-8bf6-90717fcb934e-must-gather-output\") pod \"efba8b38-547f-451c-8bf6-90717fcb934e\" (UID: \"efba8b38-547f-451c-8bf6-90717fcb934e\") "
Apr 24 15:28:11.788883 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:11.788850 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efba8b38-547f-451c-8bf6-90717fcb934e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "efba8b38-547f-451c-8bf6-90717fcb934e" (UID: "efba8b38-547f-451c-8bf6-90717fcb934e"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 15:28:11.790006 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:11.789973 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efba8b38-547f-451c-8bf6-90717fcb934e-kube-api-access-kjjp4" (OuterVolumeSpecName: "kube-api-access-kjjp4") pod "efba8b38-547f-451c-8bf6-90717fcb934e" (UID: "efba8b38-547f-451c-8bf6-90717fcb934e"). InnerVolumeSpecName "kube-api-access-kjjp4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 15:28:11.829871 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:11.829831 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efba8b38-547f-451c-8bf6-90717fcb934e" path="/var/lib/kubelet/pods/efba8b38-547f-451c-8bf6-90717fcb934e/volumes"
Apr 24 15:28:11.887884 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:11.887838 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kjjp4\" (UniqueName: \"kubernetes.io/projected/efba8b38-547f-451c-8bf6-90717fcb934e-kube-api-access-kjjp4\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 15:28:11.887884 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:11.887885 2569 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/efba8b38-547f-451c-8bf6-90717fcb934e-must-gather-output\") on node \"ip-10-0-129-231.ec2.internal\" DevicePath \"\""
Apr 24 15:28:12.068205 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:12.068114 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nj8c8_must-gather-rkh4d_efba8b38-547f-451c-8bf6-90717fcb934e/copy/0.log"
Apr 24 15:28:12.068953 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:12.068924 2569 generic.go:358] "Generic (PLEG): container finished" podID="efba8b38-547f-451c-8bf6-90717fcb934e" containerID="2978459b68f0acafccdbd59dd59635cce96cd3e930080848489020f387e964bd" exitCode=143
Apr 24 15:28:12.069424 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:12.069186 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nj8c8/must-gather-rkh4d"
Apr 24 15:28:12.073393 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:12.069217 2569 scope.go:117] "RemoveContainer" containerID="2978459b68f0acafccdbd59dd59635cce96cd3e930080848489020f387e964bd"
Apr 24 15:28:12.086250 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:12.086130 2569 scope.go:117] "RemoveContainer" containerID="580f89cb32656307aabbed1b740d77836ac4c304cd5de8c0044bf482dd801b49"
Apr 24 15:28:12.118669 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:12.118642 2569 scope.go:117] "RemoveContainer" containerID="2978459b68f0acafccdbd59dd59635cce96cd3e930080848489020f387e964bd"
Apr 24 15:28:12.121339 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:28:12.121293 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2978459b68f0acafccdbd59dd59635cce96cd3e930080848489020f387e964bd\": container with ID starting with 2978459b68f0acafccdbd59dd59635cce96cd3e930080848489020f387e964bd not found: ID does not exist" containerID="2978459b68f0acafccdbd59dd59635cce96cd3e930080848489020f387e964bd"
Apr 24 15:28:12.121487 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:12.121352 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2978459b68f0acafccdbd59dd59635cce96cd3e930080848489020f387e964bd"} err="failed to get container status \"2978459b68f0acafccdbd59dd59635cce96cd3e930080848489020f387e964bd\": rpc error: code = NotFound desc = could not find container \"2978459b68f0acafccdbd59dd59635cce96cd3e930080848489020f387e964bd\": container with ID starting with 2978459b68f0acafccdbd59dd59635cce96cd3e930080848489020f387e964bd not found: ID does not exist"
Apr 24 15:28:12.121487 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:12.121380 2569 scope.go:117] "RemoveContainer" containerID="580f89cb32656307aabbed1b740d77836ac4c304cd5de8c0044bf482dd801b49"
Apr 24 15:28:12.122424 ip-10-0-129-231 kubenswrapper[2569]: E0424 15:28:12.122401 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"580f89cb32656307aabbed1b740d77836ac4c304cd5de8c0044bf482dd801b49\": container with ID starting with 580f89cb32656307aabbed1b740d77836ac4c304cd5de8c0044bf482dd801b49 not found: ID does not exist" containerID="580f89cb32656307aabbed1b740d77836ac4c304cd5de8c0044bf482dd801b49"
Apr 24 15:28:12.122583 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:12.122560 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"580f89cb32656307aabbed1b740d77836ac4c304cd5de8c0044bf482dd801b49"} err="failed to get container status \"580f89cb32656307aabbed1b740d77836ac4c304cd5de8c0044bf482dd801b49\": rpc error: code = NotFound desc = could not find container \"580f89cb32656307aabbed1b740d77836ac4c304cd5de8c0044bf482dd801b49\": container with ID starting with 580f89cb32656307aabbed1b740d77836ac4c304cd5de8c0044bf482dd801b49 not found: ID does not exist"
Apr 24 15:28:13.057591 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.057497 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c8f97875-527a-4f66-8d2b-339e82496bab/alertmanager/0.log"
Apr 24 15:28:13.087465 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.087372 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c8f97875-527a-4f66-8d2b-339e82496bab/config-reloader/0.log"
Apr 24 15:28:13.110138 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.110109 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c8f97875-527a-4f66-8d2b-339e82496bab/kube-rbac-proxy-web/0.log"
Apr 24 15:28:13.134838 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.134810 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c8f97875-527a-4f66-8d2b-339e82496bab/kube-rbac-proxy/0.log"
Apr 24 15:28:13.158323 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.158264 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c8f97875-527a-4f66-8d2b-339e82496bab/kube-rbac-proxy-metric/0.log"
Apr 24 15:28:13.186536 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.186509 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c8f97875-527a-4f66-8d2b-339e82496bab/prom-label-proxy/0.log"
Apr 24 15:28:13.212061 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.212027 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c8f97875-527a-4f66-8d2b-339e82496bab/init-config-reloader/0.log"
Apr 24 15:28:13.279736 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.279707 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-27q6z_3026d04a-ab27-47dd-8ae1-01c9844e4de6/kube-state-metrics/0.log"
Apr 24 15:28:13.301435 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.301403 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-27q6z_3026d04a-ab27-47dd-8ae1-01c9844e4de6/kube-rbac-proxy-main/0.log"
Apr 24 15:28:13.324161 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.324058 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-27q6z_3026d04a-ab27-47dd-8ae1-01c9844e4de6/kube-rbac-proxy-self/0.log"
Apr 24 15:28:13.408394 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.408357 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6zmm6_a575d177-9774-44af-890a-2397d9ddba99/node-exporter/0.log"
Apr 24 15:28:13.433510 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.433471 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6zmm6_a575d177-9774-44af-890a-2397d9ddba99/kube-rbac-proxy/0.log"
Apr 24 15:28:13.461391 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.461367 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6zmm6_a575d177-9774-44af-890a-2397d9ddba99/init-textfile/0.log"
Apr 24 15:28:13.652366 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.652264 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-r7s98_a58cee7f-bde9-4e5c-9153-311121f9b806/kube-rbac-proxy-main/0.log"
Apr 24 15:28:13.679305 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.679269 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-r7s98_a58cee7f-bde9-4e5c-9153-311121f9b806/kube-rbac-proxy-self/0.log"
Apr 24 15:28:13.707248 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.707100 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-r7s98_a58cee7f-bde9-4e5c-9153-311121f9b806/openshift-state-metrics/0.log"
Apr 24 15:28:13.761078 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.761045 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ec9e6a83-e9de-48cd-9b7d-783eafaf3f92/prometheus/0.log"
Apr 24 15:28:13.780496 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.780464 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ec9e6a83-e9de-48cd-9b7d-783eafaf3f92/config-reloader/0.log"
Apr 24 15:28:13.802729 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.802683 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ec9e6a83-e9de-48cd-9b7d-783eafaf3f92/thanos-sidecar/0.log"
Apr 24 15:28:13.826991 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.826959 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ec9e6a83-e9de-48cd-9b7d-783eafaf3f92/kube-rbac-proxy-web/0.log"
Apr 24 15:28:13.852096 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.852066 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ec9e6a83-e9de-48cd-9b7d-783eafaf3f92/kube-rbac-proxy/0.log"
Apr 24 15:28:13.877283 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.877253 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ec9e6a83-e9de-48cd-9b7d-783eafaf3f92/kube-rbac-proxy-thanos/0.log"
Apr 24 15:28:13.901704 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:13.901676 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ec9e6a83-e9de-48cd-9b7d-783eafaf3f92/init-config-reloader/0.log"
Apr 24 15:28:14.004913 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:14.004854 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-64c55f559-vskpb_cc9e438c-2d31-45f3-8009-2f9d7cbf559b/telemeter-client/0.log"
Apr 24 15:28:14.028846 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:14.028810 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-64c55f559-vskpb_cc9e438c-2d31-45f3-8009-2f9d7cbf559b/reload/0.log"
Apr 24 15:28:14.050691 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:14.050663 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-64c55f559-vskpb_cc9e438c-2d31-45f3-8009-2f9d7cbf559b/kube-rbac-proxy/0.log"
Apr 24 15:28:16.405650 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.405615 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd"]
Apr 24 15:28:16.406139 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.406119 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="efba8b38-547f-451c-8bf6-90717fcb934e" containerName="gather"
Apr 24 15:28:16.406184 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.406144 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="efba8b38-547f-451c-8bf6-90717fcb934e" containerName="gather"
Apr 24 15:28:16.406184 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.406156 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="efba8b38-547f-451c-8bf6-90717fcb934e" containerName="copy"
Apr 24 15:28:16.406184 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.406164 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="efba8b38-547f-451c-8bf6-90717fcb934e" containerName="copy"
Apr 24 15:28:16.406294 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.406237 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="efba8b38-547f-451c-8bf6-90717fcb934e" containerName="gather"
Apr 24 15:28:16.406294 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.406250 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="efba8b38-547f-451c-8bf6-90717fcb934e" containerName="copy"
Apr 24 15:28:16.411398 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.411367 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd"
Apr 24 15:28:16.415460 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.415432 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd"]
Apr 24 15:28:16.429032 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.428993 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a-podres\") pod \"perf-node-gather-daemonset-kcxfd\" (UID: \"dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd"
Apr 24 15:28:16.429218 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.429049 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a-lib-modules\") pod \"perf-node-gather-daemonset-kcxfd\" (UID: \"dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd"
Apr 24 15:28:16.429218 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.429093 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s55g4\" (UniqueName: \"kubernetes.io/projected/dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a-kube-api-access-s55g4\") pod \"perf-node-gather-daemonset-kcxfd\" (UID: \"dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd"
Apr 24 15:28:16.429218 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.429122 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a-sys\") pod \"perf-node-gather-daemonset-kcxfd\" (UID: \"dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd"
Apr 24 15:28:16.429218 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.429153 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a-proc\") pod \"perf-node-gather-daemonset-kcxfd\" (UID: \"dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd"
Apr 24 15:28:16.529733 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.529688 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a-podres\") pod \"perf-node-gather-daemonset-kcxfd\" (UID: \"dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd"
Apr 24 15:28:16.529733 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.529727 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a-lib-modules\") pod \"perf-node-gather-daemonset-kcxfd\" (UID: \"dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd"
Apr 24 15:28:16.529983 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.529755 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s55g4\"
(UniqueName: \"kubernetes.io/projected/dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a-kube-api-access-s55g4\") pod \"perf-node-gather-daemonset-kcxfd\" (UID: \"dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd" Apr 24 15:28:16.529983 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.529775 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a-sys\") pod \"perf-node-gather-daemonset-kcxfd\" (UID: \"dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd" Apr 24 15:28:16.529983 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.529795 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a-proc\") pod \"perf-node-gather-daemonset-kcxfd\" (UID: \"dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd" Apr 24 15:28:16.529983 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.529853 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a-podres\") pod \"perf-node-gather-daemonset-kcxfd\" (UID: \"dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd" Apr 24 15:28:16.529983 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.529869 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a-sys\") pod \"perf-node-gather-daemonset-kcxfd\" (UID: \"dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd" Apr 24 15:28:16.529983 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.529923 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a-proc\") pod \"perf-node-gather-daemonset-kcxfd\" (UID: \"dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd" Apr 24 15:28:16.529983 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.529955 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a-lib-modules\") pod \"perf-node-gather-daemonset-kcxfd\" (UID: \"dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd" Apr 24 15:28:16.537184 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.537155 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s55g4\" (UniqueName: \"kubernetes.io/projected/dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a-kube-api-access-s55g4\") pod \"perf-node-gather-daemonset-kcxfd\" (UID: \"dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd" Apr 24 15:28:16.726826 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.726791 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd" Apr 24 15:28:16.864704 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:16.864530 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd"] Apr 24 15:28:16.867375 ip-10-0-129-231 kubenswrapper[2569]: W0424 15:28:16.867345 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddcbb52e6_9e84_41c5_8a7a_5a707fa1b28a.slice/crio-b43ced4b5f9f8b1cfabaa2bab56e466a7802045d2b6be404079379d7bc941dac WatchSource:0}: Error finding container b43ced4b5f9f8b1cfabaa2bab56e466a7802045d2b6be404079379d7bc941dac: Status 404 returned error can't find the container with id b43ced4b5f9f8b1cfabaa2bab56e466a7802045d2b6be404079379d7bc941dac Apr 24 15:28:17.090857 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:17.090823 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd" event={"ID":"dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a","Type":"ContainerStarted","Data":"1e2f2d7909b304d6268e1f3df60c797df99faf44e64ef1c12474e9647770e425"} Apr 24 15:28:17.091095 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:17.090865 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd" event={"ID":"dcbb52e6-9e84-41c5-8a7a-5a707fa1b28a","Type":"ContainerStarted","Data":"b43ced4b5f9f8b1cfabaa2bab56e466a7802045d2b6be404079379d7bc941dac"} Apr 24 15:28:17.091095 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:17.090924 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd" Apr 24 15:28:17.105559 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:17.105512 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd" podStartSLOduration=1.105493639 podStartE2EDuration="1.105493639s" podCreationTimestamp="2026-04-24 15:28:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:28:17.104506803 +0000 UTC m=+3849.898563694" watchObservedRunningTime="2026-04-24 15:28:17.105493639 +0000 UTC m=+3849.899550540" Apr 24 15:28:17.344852 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:17.344763 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-q6p5b_4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a/dns/0.log" Apr 24 15:28:17.369698 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:17.369670 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-q6p5b_4ea1b220-dbce-4a4c-bfe3-3c3c34321b6a/kube-rbac-proxy/0.log" Apr 24 15:28:17.439648 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:17.439611 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2ngpp_e741aaa3-8cb4-407a-835a-a552c5b77737/dns-node-resolver/0.log" Apr 24 15:28:17.942982 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:17.942947 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5f7nk_5a0260d3-f8e0-4a3d-b95a-547feee30046/node-ca/0.log" Apr 24 15:28:19.046436 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:19.046396 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-kkfvn_74ac045c-eb3d-44e5-b6cb-0da4951109bb/serve-healthcheck-canary/0.log" Apr 24 15:28:19.399928 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:19.399819 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-65cl5_4bbd5597-4646-4720-a0e3-cd74795416fa/kube-rbac-proxy/0.log" Apr 24 15:28:19.420006 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:19.419975 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-65cl5_4bbd5597-4646-4720-a0e3-cd74795416fa/exporter/0.log" Apr 24 15:28:19.440982 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:19.440950 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-65cl5_4bbd5597-4646-4720-a0e3-cd74795416fa/extractor/0.log" Apr 24 15:28:21.596470 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:21.596440 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-2nnn4_828e9036-22ef-4b11-9e4b-23d76347c236/server/0.log" Apr 24 15:28:22.002113 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:22.002077 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-fcvjw_859fe824-f035-4f43-b93e-db5cf4147cca/manager/0.log" Apr 24 15:28:22.022882 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:22.022856 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-2cjh7_c386d0ac-d3e3-4ac1-b91c-c57e23d548b8/s3-init/0.log" Apr 24 15:28:22.047612 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:22.047562 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-z859z_07fc62cc-dc14-4433-b588-31a1a40d5d27/s3-tls-init-custom/0.log" Apr 24 15:28:22.071568 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:22.071534 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-rgjhp_e6a4b37e-7c77-44e6-9bd3-994938f6dba6/s3-tls-init-serving/0.log" Apr 24 15:28:22.156485 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:22.156451 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-lstx2_dedfe7b4-3fd9-43f5-8aad-cb4ee3ddc9aa/seaweedfs-tls-serving/0.log" Apr 24 15:28:23.105332 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:23.105306 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-kcxfd" Apr 24 15:28:27.653833 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:27.653791 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4cxgt_47ccc73c-0a43-4642-8362-1fa6a8574f23/kube-multus/0.log" Apr 24 15:28:27.681387 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:27.681356 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2bsbc_663fea9a-c74c-4a2d-8d62-31e14a29d43a/kube-multus-additional-cni-plugins/0.log" Apr 24 15:28:27.715715 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:27.715682 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2bsbc_663fea9a-c74c-4a2d-8d62-31e14a29d43a/egress-router-binary-copy/0.log" Apr 24 15:28:27.756268 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:27.756241 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2bsbc_663fea9a-c74c-4a2d-8d62-31e14a29d43a/cni-plugins/0.log" Apr 24 15:28:27.784979 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:27.784943 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2bsbc_663fea9a-c74c-4a2d-8d62-31e14a29d43a/bond-cni-plugin/0.log" Apr 24 15:28:27.807017 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:27.806983 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2bsbc_663fea9a-c74c-4a2d-8d62-31e14a29d43a/routeoverride-cni/0.log" Apr 24 15:28:27.828515 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:27.828483 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2bsbc_663fea9a-c74c-4a2d-8d62-31e14a29d43a/whereabouts-cni-bincopy/0.log" Apr 24 15:28:27.849598 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:27.849565 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2bsbc_663fea9a-c74c-4a2d-8d62-31e14a29d43a/whereabouts-cni/0.log" Apr 24 15:28:28.304151 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:28.304117 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-d47r2_496c729f-9eee-4311-8fe8-4502d4af37f8/network-metrics-daemon/0.log" Apr 24 15:28:28.319918 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:28.319835 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-d47r2_496c729f-9eee-4311-8fe8-4502d4af37f8/kube-rbac-proxy/0.log" Apr 24 15:28:29.640641 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:29.640604 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-controller/0.log" Apr 24 15:28:29.658816 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:29.658781 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/0.log" Apr 24 15:28:29.697963 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:29.697931 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovn-acl-logging/1.log" Apr 24 15:28:29.716473 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:29.716435 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/kube-rbac-proxy-node/0.log" Apr 24 15:28:29.741040 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:29.741012 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 15:28:29.760848 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:29.760821 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/northd/0.log" Apr 24 15:28:29.785920 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:29.785869 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/nbdb/0.log" Apr 24 15:28:29.814590 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:29.814562 2569 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/sbdb/0.log" Apr 24 15:28:30.013459 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:30.013428 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mjpgb_be4da945-f6d1-4406-adea-f3ccedab88f6/ovnkube-controller/0.log" Apr 24 15:28:31.242119 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:31.242090 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-tzvkp_66892658-4db6-4064-b52f-60baa00dcc6d/network-check-target-container/0.log" Apr 24 15:28:32.140011 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:32.139982 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-8grsv_853085d1-3eec-4e3a-a1e6-999af329c8d0/iptables-alerter/0.log" Apr 24 15:28:32.845328 ip-10-0-129-231 kubenswrapper[2569]: I0424 15:28:32.845300 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-7fkmg_45ac4dc2-9823-4bb9-9fb7-9837cf58d4a0/tuned/0.log"